ZFBot leaps forward with 127 Million searchable domains

I’ve praised ZFBot several times in the past.

The “killer app” for domainers is a tool that processes the zone files of millions of domain names across various TLDs on a daily basis.

Perusing this huge, searchable database of domain names offers a great advantage: it helps determine not only trends in domain registration and ownership, but also the value and pricing of domains.
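For readers curious what “processing a zone file” actually involves, here is a minimal, illustrative sketch; ZFBot’s own parsing and storage are not public, so the file format details, names and the SQLite choice below are assumptions for illustration only.

```python
# Illustrative only -- not ZFBot's actual implementation.
# Registry zone files list one DNS resource record per line, roughly:
#   EXAMPLE  NS  NS1.EXAMPLE-DNS.COM.
# Every registered domain appears as the owner of at least one NS record.
import sqlite3

def load_zone(zone_path, tld, db_path="domains.db"):
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS domains (name TEXT PRIMARY KEY)")
    seen = set()
    with open(zone_path, encoding="ascii", errors="ignore") as f:
        for line in f:
            parts = line.split()
            # Keep NS records only; skip comments, SOA records and blank lines.
            if len(parts) >= 3 and parts[1].upper() == "NS":
                seen.add(parts[0].rstrip(".").lower() + "." + tld)
    conn.executemany("INSERT OR IGNORE INTO domains VALUES (?)",
                     ((name,) for name in seen))
    conn.commit()
    conn.close()

# A "domain ends with" style search is then a simple query, e.g.:
#   SELECT name FROM domains WHERE name LIKE '%shoes.com';
```

At 127 million rows you would obviously want a real database server and proper indexes rather than SQLite, but the shape of the problem is the same.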

Ken Greenwood, creator of ZFBot, has just launched a sizable upgrade to the tool: more than 127 million domains are now monitored, across several TLDs.

The latest upgrade adds the .info zone files, bringing coverage to .com, .net, .org, .info, .biz, .us, .mobi and .tel.

ZFBot.com is an indispensable, free tool that can assist domainers with their domain investment and end-user sales.

Comments

  1. They have some serious issues with technology there. Tried a search using “domain ends with” at different times of the day. It does not find anything; the application just hangs. I like the idea, it’s just that they need to improve the technology. I would suggest .net as a front end with a SQL Server db, as it is better for large amounts of data. We can always help them if they want.

  2. How much does it cost to get a zone file?

  3. Domainsheat.com – I haven’t witnessed those “issues with technology” that you’re quoting, except for minimal downtime around 2am for the daily update of the zone files.

    Chandan – Zone files are free; one has to apply for access and be approved, however. There are certain limitations in place.

  4. Acro, can you comment more on how one can be approved? Thanks.

  5. Anthony – I haven’t gone through the process; one has to apply with every TLD registry. There are links for this at the registries’ web sites, e.g. Verisign’s.

  6. @Domainsheat.com – “They” is me, and what do you want for free? Unfortunately, if you happen to live in another country/time zone and are searching between midnight and 6AM EST, then you’re hosed, because that’s when the data is being refreshed. If this were a paid site then I would certainly have redundant database tables ensuring zero downtime (one way to do that with a staging-table swap is sketched below, after the comments), but I’m not going to fork over another couple hundred bucks a month for a server of that size. The database, with all of the necessary indexes, is already 60 gig – mirroring it would push me well over 100 gig (which is tiny in relative terms – I work with Oracle databases that are many, many terabytes), and any decent virtual server of that size is a few hundred bucks a month, which I am simply not going to spend on a hobby of mine. I’ll make sure I make it clearer when the data is being refreshed (like right now – the .info zone file is being loaded, because that zone file isn’t created at the same time as the others… I didn’t know that).

  7. And the performance isn’t an issue when the database isn’t being updated. SQL Server? Yikes… I’d prefer to use a version of Oracle that I wouldn’t run into licensing issues with… it’ll run circles around SQL Server. I’m only using MySQL because it’s adequate and free.
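As an aside on the zero-downtime point raised in comment 6: a common way to keep searches working while a huge table is reloaded is to bulk-load into a staging copy and then swap it in atomically. The sketch below shows the idea against MySQL; the connection details, table names and load step are invented, and this is not a description of how ZFBot itself works. The catch is exactly the one Ken mentions: the table is briefly duplicated on disk.

```python
# Illustrative sketch of the "redundant tables" idea -- hypothetical names
# throughout; ZFBot does not actually do this.
import mysql.connector  # assumes the mysql-connector-python package

conn = mysql.connector.connect(user="zfbot", password="secret", database="zones")
cur = conn.cursor()

# Build a fresh copy next to the live table.
cur.execute("DROP TABLE IF EXISTS domains_staging")
cur.execute("CREATE TABLE domains_staging LIKE domains")

# ... bulk-load the day's zone data into domains_staging here,
#     e.g. with LOAD DATA INFILE or batched INSERTs ...

# Atomic swap: readers always see either yesterday's table or today's,
# never a half-loaded one.
cur.execute("RENAME TABLE domains TO domains_old, domains_staging TO domains")
cur.execute("DROP TABLE domains_old")

cur.close()
conn.close()
```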
