Check a list of domains with the WHOIS command


I want to set up a batch file or cron job of some sort, using the Linux command line, to check the availability of domain names. I will provide the list of domains to check in another file. It must do only 30 checks per hour, and only if a domain is available should it add that domain name to a different file. So...

  1. Do I have to create a batch file?
  2. What command must go in the batch file that will:
    1. move through an existing list of domains
    2. check if those domains are available,
    3. remove that domain from the original list, and
    4. add the available domain to the "available" list.
  3. Repeat this command 30 times every hour.

I know exactly how to do this with PHP and cron, but I want to know if there is a quick way to do it using the "whois" command. It's point (2) that I am stuck on.
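The closest I can get for a single domain is something like the snippet below, but I don't know whether grepping the whois output for "No match" is a reliable way to detect availability (it seems to work for .com, but I'm not sure about other TLDs):

# Rough guess at a one-off check for a single domain. Assumes the
# whois reply contains "No match" when the domain is unregistered
# (true for .com/.net as far as I can tell; other TLDs may use
# different wording).
if whois example.com | grep -qi "no match"; then
  echo "example.com appears to be available"
else
  echo "example.com appears to be taken"
fi

Is that roughly the right approach, or is there a better way with whois?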

coderama

Posted 2012-06-06T10:53:46.877

Reputation: 699

I wrote a script in PHP that does this, but I am now looking for a quick way to do it via the command line, using the "whois" command. – coderama – 2012-06-06T11:02:17.987

I will add more precise questions... – coderama – 2012-06-06T11:02:33.647

(I lost the PHP script....) – coderama – 2012-06-06T11:05:58.223

To start solving this issue, try reading this link: http://mattgemmell.com/2008/12/08/what-have-you-tried/

– Octávio Filipe Gonçalves – 2012-06-06T11:17:12.847

You know what... just close my question please, or I'll delete it. Attitude seems to be more important than helping people on Super User nowadays... – coderama – 2012-06-06T11:21:55.047

@RD: Sorry, as you can see in my profile, I help people a lot. But I think you are not asking for help with a particular issue in your code; you just want someone to write the software for you, to the list of specifications in your question. I think you can investigate a little to solve the problem yourself, and then, if you get stuck in your code, ask for help. – Octávio Filipe Gonçalves – 2012-06-06T11:27:03.333

Answers

6

That's not too hard.

Read the list of domains from domains.txt and, for each one, check whether the whois reply contains "No match"; if it does, the domain is unregistered, so append it to available.txt (the check uses the exit status of grep, which is stored in $?).

Then remove the available domains from domains.txt with sed's in-place editing.

#!/bin/bash

AVAILABLE=~/available.txt
DOMAINS=~/domains.txt

# Prevent two instances from running at the same time
# (lockfile is part of the procmail package).
lockfile whois-script.lock

# Pass 1: a "No match" reply from whois means the domain is not
# registered, so append it to the "available" list.
while read -r domain; do
  whois "$domain" | grep -qi "no match"
  if [ $? -eq 0 ]; then
    # "No match" found, so the domain appears to be available
    echo "$domain" >> "$AVAILABLE"
  fi
done < "$DOMAINS"

# Pass 2: delete the available domains from the original list.
# The pattern is anchored (^...$) so that, say, foo.co does not
# also delete the line for foo.com.
while read -r domain; do
  sed -i "/^$domain\$/d" "$DOMAINS"
done < "$AVAILABLE"

rm -f whois-script.lock

Note: BSD sed requires an explicit (possibly empty) suffix argument after -i, so on BSD or macOS use the following command instead:

sed -i "" "/^$domain\$/d" "$DOMAINS"

You can save this script and call it from your crontab. Enter

crontab -e

and then add a line like this:

*/2 *   *   *   *   /path/to/script.sh > /dev/null

This will run the script every two minutes (*/2), i.e. up to 30 times per hour. Make sure to adjust the paths to the domain files first.
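If you want to cap the rate at the whois level instead, the rough variation below (just a sketch, not the script above) sleeps two minutes after each lookup, so no more than about 30 queries go out per hour no matter how long domains.txt is:

#!/bin/bash
# Rough variation of the script above: throttle the lookups themselves.
# Sleeping two minutes after each whois call keeps the rate at or below
# about 30 queries per hour, however long the list is.

AVAILABLE=~/available.txt
DOMAINS=~/domains.txt

while read -r domain; do
  if whois "$domain" | grep -qi "no match"; then
    # "No match" in the reply suggests the domain is unregistered
    echo "$domain" >> "$AVAILABLE"
  fi
  sleep 120   # two minutes between queries
done < "$DOMAINS"

A single pass over a long list can then take hours, so keep the lock file from the full script (and run it from cron far less often) so two copies never overlap, and keep the second loop to prune domains.txt afterwards.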

slhck

Posted 2012-06-06T10:53:46.877

Reputation: 182 472

Depending on how often the input file changes, you may want to add a lock file to prevent two instances from running at the same time, avoiding a race condition between checking the domains and removing the entries from the input file. – Bram – 2012-06-06T11:48:00.693

Good point, especially with two-minute checks. Do you think that makes more sense? (See my update.) – slhck – 2012-06-06T11:52:31.110

I thought of it more as "an exercise for the reader" :) but your updated answer will probably do exactly what the OP requested. I forgot to upvote it before; fixed that now. – Bram – 2012-06-06T11:57:04.333

0

Why go through all the trouble? You can now purchase all the WHOIS data you'd like. I'm not saying it'll be cheap (for example, https://alldomainsdb.com/), but it'll save you a lot of hard work. That only makes sense if you need a huge amount of WHOIS data, though; if 30 checks per hour suffice, go ahead and make the batch file.

Dragson

Posted 2012-06-06T10:53:46.877

Reputation: 1