
We have an Nginx web server, and sitemaps that we regenerate every week or so.

We recently migrated to multiple web servers behind a single load balancer, and keeping a copy of the sitemaps on every web server seems kind of silly. Since we are on AWS, is there a way to store the sitemaps on, let's say, S3 and somehow redirect sitemap requests to the S3 link, but keep the full domain name with 'www'? I don't see a way to do it with domain names alone; maybe with some Nginx rule?
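The closest I can come up with is an Nginx rule along these lines (untested sketch; "example-sitemaps" is just a placeholder bucket name):

    # Proxy sitemap requests to S3 so crawlers still see
    # http://www.example.com/sitemap.xml as the URL
    location = /sitemap.xml {
        proxy_pass https://example-sitemaps.s3.amazonaws.com/sitemap.xml;
        # Hide S3-specific response headers
        proxy_hide_header x-amz-id-2;
        proxy_hide_header x-amz-request-id;
    }

But is that the right approach?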

I have not found many resources on this. How do you solve it?

Katafalkas

1 Answer


Yes, you just need to "prove" ownership of the domain through your robots.txt. Add the following line to it:

Sitemap: http://s3hostname.domain.com/sitemap.xml

You can use any URL you want in the robots.txt file; it doesn't need to be on the same host as the site itself. This is documented in the "Sitemaps & Cross Submits" section of the sitemaps.org protocol documentation.
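For completeness, the robots.txt served from the www domain would then look something like this (the S3 hostname is the same placeholder used in the question):

    User-agent: *
    Disallow:

    Sitemap: http://s3hostname.domain.com/sitemap.xml

Because the robots.txt lives on www.yourdomain.com and can only be edited by whoever controls that host, listing the sitemap there is what establishes ownership, so the sitemap file itself can live anywhere, including S3.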