I just updated my robots.txt file on a new site; Google Webmaster Tools reports that it last read the file 10 minutes before my most recent update.
Is there any way I can encourage Google to re-read my robots.txt as soon as possible?
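
While waiting, I at least verified that my server is serving the updated file by fetching it the way a crawler would. A minimal sketch in Python; the URL is my site from the tester output below, and the User-Agent is Googlebot's published string:

    # Fetch the live robots.txt with Googlebot's User-Agent to confirm the
    # server returns the updated file rather than a stale cached copy.
    from urllib.request import Request, urlopen

    req = Request(
        "http://my.example.com/robots.txt",
        headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
    )
    with urlopen(req) as resp:
        print(resp.read().decode("utf-8"))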
UPDATE: Under Site Configuration | Crawler Access | Test robots.txt:
Home Page Access shows:
Googlebot is blocked from http://my.example.com/
FYI: The robots.txt that Google last read looks like this:
User-agent: *
Allow: /<a page>
Allow: /<a folder>
Disallow: /
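
To double-check what those rules actually block, I ran them through Python's standard urllib.robotparser. A minimal sketch, with /a-page and /a-folder/ as hypothetical stand-ins for my real Allow entries (note that this parser applies rules in file order, while Google uses the most specific match; for a file shaped like this one both give the same answer):

    # Sanity-check the robots.txt rules locally with the standard-library parser.
    from urllib.robotparser import RobotFileParser

    rules = [
        "User-agent: *",
        "Allow: /a-page",
        "Allow: /a-folder/",
        "Disallow: /",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # Googlebot falls under "User-agent: *" because no Googlebot-specific group exists.
    for path in ("/a-page", "/a-folder/sub/page.html", "/"):
        verdict = "allowed" if parser.can_fetch("Googlebot", path) else "blocked"
        print(path, "->", verdict)

It prints allowed for the two Allow paths and blocked for /, which lines up with the "Googlebot is blocked" message from the tester above.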
Have I shot myself in the foot, or will Google eventually re-read http://my.example.com/robots.txt (as it did the last time)?
Any ideas on what I need to do?