Breaking apart large zone files.

Brian F. brian at cv.net
Tue Nov 16 13:56:11 UTC 2004


On one of our DNS servers we currently have a zone file with
2,195,387 entries. As you can imagine, this takes forever to reload
and really hammers the server's resources during a reload.

But I notice that other servers, which don't have the one large zone
but instead many small zones adding up to much more data than the
large one, don't eat up the resources or take nearly as long to
reload all their zones.

Could this be fixed by breaking the large zone apart into several
smaller files? What could be done to break apart a large zone file
that all belongs to a single domain?

Would it make sense to break it apart as such:

db.big.zone.net -- 2,195,387 entries
broken into
db.1.big.zone.net -- 250,000 entries
db.2.big.zone.net -- 250,000 entries
so on and so forth...

then make a db.big.zone.net with something like this:

$ORIGIN big.zone.net.
$INCLUDE db.1.big.zone.net
$INCLUDE db.2.big.zone.net
$INCLUDE db.3.big.zone.net

Or am I way off in what I think $INCLUDE can do?
I know this would be bad practice for a zone that needs hand
maintenance, simply because you'd have to search many files to find
the entry you want to change or delete. But this big zone is
script-generated twice a day and never needs any changes by hand, so
changing the scripts to create several files would be easy.
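
For example, the generation script could just chunk its output into
pieces and then write the parent file with the $INCLUDE lines. Here's
a rough Python sketch of the idea -- the chunk size and file names are
just placeholders, and I'm assuming the records are already formatted
as RR lines, which isn't necessarily how our real script works:

CHUNK = 250000   # records per piece -- just a guess at a sensible size

def write_split_zone(records, zone="big.zone.net"):
    # 'records' is assumed to be a list of already-formatted RR lines
    chunks = [records[i:i + CHUNK] for i in range(0, len(records), CHUNK)]
    # write db.1.big.zone.net, db.2.big.zone.net, ...
    for n, chunk in enumerate(chunks, start=1):
        with open("db.%d.%s" % (n, zone), "w") as f:
            f.write("\n".join(chunk) + "\n")
    # write the parent file that pulls the pieces back in
    with open("db.%s" % zone, "w") as f:
        f.write("$ORIGIN %s.\n" % zone)
        # the SOA and NS records for the zone would go here, same as now
        for n in range(1, len(chunks) + 1):
            f.write("$INCLUDE db.%d.%s\n" % (n, zone))

named.conf would keep pointing at db.big.zone.net as the single master
file for the zone; the data just gets pulled in from the chunk files.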

Does it make sense to do this, or is there a better way?
Should I just leave the zone alone and live with the pain?

Brian


