
I need to see if a site I am testing is vulnerable to any of the many Google dorks that are available at sites like this and this. Traditionally, one uses a 'dork' by searching "Index of/" +c99.php in Google and getting a whole stack of results.

How can I search for all the dorks for my specific site quickly and easily? Is it even possible?

Edit: For clarification, I have access to, and use, a number of commercial paid applications for web-app scanning. However, I'm having issues because the site uses a WAF and limits my connections, which greatly slows things down. As a result, I'm looking more for a program that queries Google rather than one that scans the site itself.

NULLZ

7 Answers


Go to http://www.exploit-db.com/google-dorks/. It is a huge repository of Google dorks. Add site:mysite.com to the end of each relevant dork, as mentioned by Brian Adkins, and voila, you are good to go. A simple Python/Bash/Ruby script can scrape the dorks from that site (for HTTP scraping in Python I love Beautiful Soup), append site:mysite.com to the relevant ones, and it's done.

Remember that many dorks will not be relevant to your specific website.
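
A minimal sketch of that script, assuming the dork strings sit inside table-cell links on the exploit-db page (the "td a" selector is a guess at their markup, which changes over time):

    # Hypothetical sketch: scrape dorks from exploit-db and scope them to one site.
    # The "td a" selector is an assumption about the page's markup.
    import urllib2
    from bs4 import BeautifulSoup

    TARGET = "site:mysite.com"

    html = urllib2.urlopen("http://www.exploit-db.com/google-dorks/").read()
    soup = BeautifulSoup(html, "html.parser")

    for link in soup.select("td a"):  # assumed: each dork is a link in a table cell
        dork = link.get_text(strip=True)
        if dork:
            print dork + " " + TARGET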

oldnoob

Goolag Scanner by the cDc (Cult of the Dead Cow) works for what I need, and I can update it with newer dorks by editing the XML file at %baseDir%\Dorkdata\gdorks.xml.
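
If you want to script the edit rather than doing it by hand, something along these lines would work; note that the <dork> element and query attribute names below are pure guesses, so inspect the real schema in gdorks.xml first:

    # Hypothetical sketch: append a new dork entry to Goolag's gdorks.xml.
    # The element/attribute names are assumptions; check the real file first.
    import xml.etree.ElementTree as ET

    path = r"%baseDir%\Dorkdata\gdorks.xml"  # expand %baseDir% to the install dir

    tree = ET.parse(path)
    root = tree.getroot()

    entry = ET.SubElement(root, "dork")         # assumed element name
    entry.set("query", '"Index of /" c99.php')  # assumed attribute name

    tree.write(path, encoding="utf-8", xml_declaration=True)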

NULLZ

I did the same kind of thing in Java, but for banners: I parsed a list of known vulnerable banners into a text file, port-scanned every known port, and grabbed each banner. That returned a list of open ports, and I then checked whether any of the open ports had a vulnerable banner. It's the same idea here; see below. Sorry I don't have time to write the program for you; I'm just a busy computer science student who happened upon this.

    from pygoogle import pygoogle

    # "dork" holds one query string read from the file you create below
    g = pygoogle(dork + " site:mysite.com")
    g.pages = 5
    print '*Found %s results*' % g.get_result_count()
    print g.get_urls()

Use Python. Download each HTML page, iterating over the exploits by number:

"http://www.exploit-db.com/ghdb/" + str(x) + "/"

Then open each HTML file, building an array of the parsed strings to use as Google queries; close the last HTML file; create and open a new output file; and write the string at each index of the array to a new line while there is a next index.

EDIT >> You are trying to get the "THIS" in the contents of <h2>Google Search: <a href=..>THIS</a></h2>
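
A rough sketch of those steps combined, assuming the GHDB pages really do carry the dork inside that <h2> tag (the markup is taken from this answer, not re-verified, and the range of exploit numbers is arbitrary):

    # Hypothetical sketch: download GHDB pages and extract each dork string.
    # Assumes the <h2>Google Search: <a ...>DORK</a></h2> markup described above.
    import urllib2
    from bs4 import BeautifulSoup

    queries = []
    for x in range(1, 50):  # arbitrary range of exploit numbers
        url = "http://www.exploit-db.com/ghdb/" + str(x) + "/"
        soup = BeautifulSoup(urllib2.urlopen(url).read(), "html.parser")
        for h2 in soup.find_all("h2"):
            # the dork is the anchor text inside the "Google Search:" heading
            if "Google Search" in h2.get_text() and h2.a is not None:
                queries.append(h2.a.get_text(strip=True))

    # write one query per line, as described above
    with open("queries.txt", "w") as out:
        for q in queries:
            out.write(q + "\n")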


Dorks are not an official thing recognized by Google; they are just pre-canned searches for specific pages that might contain vulnerabilities.

You can search your site for a specific page by appending "site:mysite.com" to a dork, but that won't run all of them.
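
For example, a single dork scoped to one site would look something like this (the dork itself is just an illustration):

    "Index of /" c99.php site:mysite.com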

A better fit for your goal would be a known vulnerability scanning service such as McAfee Secure. They will scan your site and give you a list of known vulnerabilities.

Scanning services maintain their own internal lists of vulnerabilities and crawl your site searching for them. These services are both quick and easy per your requirements... but not free.

Also google "PCI scanning service"... There do appear to be some free/cheap ones, but buyer beware.

Brian Adkins

I don't know of an existing tool that does what you are describing. You could probably build one fairly easily by compiling a list of Google dorks and writing an app that fires each query, scoped to your target site, at Google; a sketch follows below. I've seen several Google dork collections online that you may be able to use as a source.
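
A minimal sketch of such a tool, reusing the pygoogle module from the answer above (the dorks.txt file, one dork per line, and the delay length are assumptions; Google throttles rapid automated queries):

    # Hypothetical sketch: fire every dork in a file at one target site via Google.
    # Reuses pygoogle as in the earlier answer; dorks.txt holds one dork per line.
    import time
    from pygoogle import pygoogle

    with open("dorks.txt") as f:
        dorks = [line.strip() for line in f if line.strip()]

    for dork in dorks:
        g = pygoogle(dork + " site:mysite.com")
        g.pages = 1
        count = g.get_result_count()
        if count:
            print '*Found %s results for: %s*' % (count, dork)
            print g.get_urls()
        time.sleep(1)  # go slowly; Google blocks rapid automated queries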

Abe Miessler

You can run your own vulnerability assessment against your site, given you own the site and get written permission from your hosting entity. Check out the tool W3AF (http://w3af.org/). It's a GUI tool that will run simple tests for SQL injection, XSS, and other types of attacks.

PTW-105

You can create your own script for finding Google-dork exposure, as already described above. But I would recommend also using some well-known tools for this purpose, since your script may not cover every aspect of scanning, and not all vulnerabilities in your site can be exposed using Google dorks alone.
If the site has high monetary value, consider paid tools such as Burp Suite or IBM AppScan, depending on your budget.
If the site is of little value, like a home project or personal blog, try free tools such as Vega or w3af. All of these tools give you enough information about a bug present in your site, and its fix, to check and understand it manually.