How to Configure Sitemap Autodiscovery in Robots.txt


At Search Engine Strategies New York it was announced that search engines can now discover your sitemap automatically when you reference it in your Robots.txt file. It is simple to do; you just need to know the URL of your sitemap.

First, open your Robots.txt file on your server for editing. Then add the following line to the end of the file (it can go anywhere, but the end is a good place).

Sitemap: http://www.mydomain.com/sitemap.xml

Save the Robots.txt file with the new line for the sitemap URL. There you go! Your whole file may look something like this:

User-agent: *
Disallow: /somefolder/
Disallow: /somethingelse/
Sitemap: https://www.soloseo.com/sitemap.php
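
If you would rather script the change than edit by hand, here is a minimal sketch in Python; the local file path and sitemap URL are placeholders for wherever your Robots.txt actually lives:

# Append the Sitemap directive to robots.txt if it is not already there.
# The file path and sitemap URL below are placeholders.
from pathlib import Path

robots = Path("robots.txt")
directive = "Sitemap: http://www.mydomain.com/sitemap.xml"

text = robots.read_text() if robots.exists() else ""
if directive not in text:
    with robots.open("a") as f:
        if text and not text.endswith("\n"):
            f.write("\n")  # make sure the directive starts on its own line
        f.write(directive + "\n")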

Search engines already fetch your Robots.txt file when they visit your domain, so on their next crawl they will find your sitemap automatically.
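
If you want to confirm the directive is readable the way a crawler would see it, Python's standard library includes a robots.txt parser. A quick check (requires Python 3.8+ for site_maps(); the domain is just the example from above):

# Fetch the live robots.txt and list any Sitemap directives it declares.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.soloseo.com/robots.txt")
parser.read()

sitemaps = parser.site_maps()  # None if no Sitemap lines were found
if sitemaps:
    for url in sitemaps:
        print("Discovered sitemap:", url)
else:
    print("No Sitemap directive found")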

If you have a new site or domain, you will probably still want to submit the sitemap URL to the search engines directly. You can either submit the URL through their interfaces or use a ping (see the sketch after the list below).

Submit Sitemap to Google or Ping Google with your Sitemap
Submit Sitemap to Yahoo or Ping Yahoo with your Sitemap
Submit Sitemap to MSN Live.com
Info for Ask.com Sitemaps or Ping Ask.com by hitting this address: http://submissions.ask.com/ping?sitemap=http://www.yourdomain.com/sitemap.xml
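
A ping is just an HTTP request to the engine's ping address with your sitemap URL in the query string. Here is a minimal sketch using the Ask.com address above; the other engines follow the same pattern, though ping endpoints can change over time, and the sitemap URL is a placeholder:

# Ping a search engine with the sitemap URL (Ask.com endpoint from above).
from urllib.parse import quote
from urllib.request import urlopen

sitemap_url = "http://www.yourdomain.com/sitemap.xml"
ping_url = "http://submissions.ask.com/ping?sitemap=" + quote(sitemap_url, safe="")

with urlopen(ping_url) as response:
    # A 200 response only means the ping was received, not that the
    # sitemap was accepted or crawled.
    print("Ping returned HTTP", response.getcode())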
