Category Archives: Sitemaps
All the major search engines (Google, Yahoo!, MSN/Live, and Ask) use the XML Sitemaps protocol to get URLs from websites. They all still rely on good old-fashioned crawling, of course, but an XML sitemap can help get new content indexed more quickly and can help you spot errors through the other tools the search engines offer. Simply […]
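As a rough sketch of what the protocol looks like (the URL and date below are placeholders, not real values), a minimal sitemap file contains a `urlset` with one `url` entry per page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> block per page you want the engines to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <!-- Optional hints; engines may or may not honor them -->
    <lastmod>2007-04-10</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `loc` is required; `lastmod`, `changefreq`, and `priority` are optional hints.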
Most SEOs will advise you to buy an existing site/domain (for the age benefit), but there are times when you need to start from scratch with a fresh domain. It can take a couple of weeks to get a new domain indexed by Google (and even longer to start ranking!). In order to speed up […]
Clickability has created a really nice Robots.txt Builder that helps you configure your Robots.txt file. You can easily build a Robots.txt file to disallow robots from parts of your file structure. There are options for easily adding web search robots, image search, contextual ads, web archivers, and even “bad robots”. The bad robots option puts in a […]
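For a sense of what such a builder outputs (the directory names and bot name here are illustrative, not taken from the tool), a robots.txt that blocks crawlers from part of a site and shuts out a misbehaving bot entirely looks like:

```text
# All robots: stay out of these directories
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

# A specific "bad robot": blocked from the whole site
User-agent: BadBot
Disallow: /
```

Each `User-agent` block applies to the named robot; `Disallow: /` blocks everything, while an empty `Disallow:` would allow everything.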
While others have spoken against submitting your sitemap to Google et al., I am going to stand up for sitemaps all across the web and give you several good reasons for having one (now you can use sitemap autodiscovery instead of submitting).
At Search Engine Strategies New York it was announced that you can now have your sitemap automatically discovered by listing it in your Robots.txt file. It is simple to do; you'll just need to know the URL of your sitemap.
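The autodiscovery announcement boils down to a single extra line in robots.txt (the URL below is a placeholder for your own sitemap's full address):

```text
Sitemap: http://www.example.com/sitemap.xml
```

The `Sitemap:` line takes a full URL, not a relative path, and can appear anywhere in the file; crawlers that support autodiscovery will fetch the sitemap from there without any manual submission.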