
I have a PHP script on Ubuntu that calls the dig command through the exec function to monitor hundreds of domains.

The script calls dig to fetch certain DNS records for each domain for analysis, and it is run by a cron job every 10 minutes.
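For reference, here is a rough Python sketch of the kind of per-domain loop described. The domain, record type, and dig flags are illustrative assumptions, not the asker's actual script; the PHP version would do the same via exec():

```python
import subprocess

def query_records(domain, rtype="A"):
    """Shell out to dig the way the PHP script does via exec().

    +noall +answer limits output to the answer section, one record
    per line, e.g. "example.com.  3600  IN  A  93.184.216.34".
    """
    out = subprocess.run(
        ["dig", "+noall", "+answer", domain, rtype],
        capture_output=True, text=True, timeout=10,
    ).stdout
    return parse_answer(out)

def parse_answer(text):
    """Parse dig's +noall +answer output into (name, ttl, type, rdata) tuples."""
    records = []
    for line in text.splitlines():
        parts = line.split()
        if len(parts) >= 5:
            name, ttl, _cls, rtype, *rdata = parts
            records.append((name, int(ttl), rtype, " ".join(rdata)))
    return records

# Offline demonstration on a canned answer line:
sample = "example.com.\t3600\tIN\tA\t93.184.216.34"
print(parse_answer(sample))  # [('example.com.', 3600, 'A', '93.184.216.34')]
```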

I wonder whether issuing hundreds of DNS queries within a short period of time might hit some rate limit and ultimately cause the script to fail.

If this level of query traffic is fine for now, what volume would start to trigger DNS rate limiting (if any)? The number of domains I am managing is growing.

Thanks!

shoorlyne
    "I have a PHP script that calls the bash dig command through the exec function". You shouldn't do that. Use DNS libraries in PHP and do everything from inside your program, shelling out to the dig command will only get you drawbacks. And yes you may get rate limited, this all depends on which nameserver you query. – Patrick Mevzek Mar 08 '20 at 04:12
  • I'm voting to close this question as off-topic because it is not really about administrating a server. – Patrick Mevzek Mar 08 '20 at 04:13

1 Answer


Unless you query a domain's authoritative nameserver directly, dig will normally be answered from your resolver's cached DNS records (unless the TTL has already expired). The provider of that caching resolver might have some kind of rate limiting in place, since this kind of request can be used as an attack vector.
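To illustrate the caching behaviour described, here is a toy model of a caching resolver (not a real one; the answer data and TTL are made up): the first query is forwarded upstream, and repeats within the TTL are answered from cache with the remaining TTL, never reaching the authoritative server.

```python
import time

class ToyResolverCache:
    """Minimal model of a caching resolver's hit/miss behaviour."""

    def __init__(self):
        self._cache = {}           # (name, rtype) -> (rdata, ttl, stored_at)
        self.upstream_queries = 0  # queries that actually left the cache

    def query(self, name, rtype, now=None):
        now = time.time() if now is None else now
        key = (name, rtype)
        if key in self._cache:
            rdata, ttl, stored_at = self._cache[key]
            remaining = ttl - (now - stored_at)
            if remaining > 0:
                return rdata, int(remaining)   # cache hit; TTL counts down
        # Cache miss or expired entry: pretend to ask the authoritative server.
        self.upstream_queries += 1
        rdata, ttl = "93.184.216.34", 300      # illustrative fixed answer
        self._cache[key] = (rdata, ttl, now)
        return rdata, ttl

cache = ToyResolverCache()
print(cache.query("example.com", "A", now=0))    # miss: goes upstream
print(cache.query("example.com", "A", now=60))   # hit: remaining TTL is 240
print(cache.query("example.com", "A", now=400))  # expired: upstream again
print(cache.upstream_queries)                    # 2
```

Repeated queries inside the TTL window therefore cost the authoritative server nothing; it is the caching resolver that absorbs them, which is why any rate limiting you hit would most likely come from that resolver.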

You could potentially check the TTL of the record for each site and hold off re-requesting that record until the TTL has expired. This could reduce the number of requests you are making by quite a bit. Of course this depends on the TTLs of the records you are dealing with, but most websites use TTLs of around three hours or more.
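That hold-off could be sketched like this: a minimal in-process scheduler that skips a domain until its last answer's TTL expires. The fetch function, TTL, and address below are placeholders, not real measurements:

```python
import time

class TTLScheduler:
    """Re-query a domain only once its last answer's TTL has expired.

    `fetch` stands in for whatever actually runs dig (or a DNS library);
    it must return (rdata, ttl_seconds).
    """

    def __init__(self, fetch):
        self._fetch = fetch
        self._expires = {}   # domain -> absolute expiry time
        self._last = {}      # domain -> last rdata seen

    def poll(self, domain, now=None):
        now = time.time() if now is None else now
        if now < self._expires.get(domain, 0):
            return self._last[domain], False   # still fresh: query skipped
        rdata, ttl = self._fetch(domain)
        self._expires[domain] = now + ttl
        self._last[domain] = rdata
        return rdata, True                     # actually queried

# With a 3-hour TTL, a 10-minute cron issues one real query per expiry window:
calls = []
def fake_fetch(domain):
    calls.append(domain)
    return "192.0.2.1", 10800  # TEST-NET address, 3-hour TTL (assumptions)

sched = TTLScheduler(fake_fetch)
for minute in range(0, 180, 10):               # three hours of cron ticks
    sched.poll("example.com", now=minute * 60)
print(len(calls))  # 1 query instead of 18
```

Scaled to hundreds of domains, this turns one query per domain every 10 minutes into one per domain per TTL, which pushes the point at which any rate limit might bite much further out.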

Edit: I also found this question asked previously about rate limiting DNS requests: Are there typically rate limits to querying DNS?