We're currently being crawled at a greater rate than we can handle. I can't seem to get nginx to block Googlebot:
server {
    location /ajax/sse.php {
        if ($http_user_agent ~* "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)") {
            return 403;
        }
    }
}
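My hunch is that ~* performs a case-insensitive regex match rather than a literal string comparison, so the metacharacters in the UA string stop the pattern from ever matching: the parentheses become a capture group, the dots match any character, and "; +http" is read as "a semicolon, one or more spaces, then http", so the literal + is never matched. A plain substring match should sidestep the escaping entirely; an untested sketch of what I mean, in the same location block:

server {
    location /ajax/sse.php {
        # Case-insensitive substring match: fires for any UA that
        # contains "googlebot", with no regex metacharacters to escape.
        if ($http_user_agent ~* "googlebot") {
            return 403;
        }
    }
}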
We've had to resort to blocking it in the PHP script instead:
if ($_SERVER['HTTP_USER_AGENT'] == 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)') {
    header('HTTP/1.0 403 Forbidden');
    exit();
}
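If the full UA string really has to be matched in nginx, I assume every metacharacter would need escaping, roughly like this (also an untested sketch):

if ($http_user_agent ~* "Mozilla/5\.0 \(compatible; Googlebot/2\.1; \+http://www\.google\.com/bot\.html\)") {
    return 403;
}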
Is my reading of the regex behaviour right, or is something else wrong with my nginx config (e.g., a different location block handling the request)?