
Google changes how it digests robots.txt files

An improvement, really.

It'll now properly read relative URL designations (e.g. /archives).
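
A rule using a relative path looks like this (the /archives path is just the example above, and Googlebot is one possible user-agent):

    User-agent: Googlebot
    Disallow: /archives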

And there's a new directive:

    Sitemap: http://www.example.com/sitemap.xml

So you can tell Google where to find your sitemap. That's nice.
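
Put together, a minimal robots.txt using both features might look like this (the domain and paths are placeholders):

    User-agent: *
    Disallow: /archives

    Sitemap: http://www.example.com/sitemap.xml

Per the sitemaps.org protocol, the Sitemap line isn't tied to any User-agent block, so it can sit anywhere in the file.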

Also: news of a new meta tag that tells Googlebot to ignore a page after a certain date and time.
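
That's the unavailable_after tag, per the linked announcement. A page you want dropped from Google's results after a deadline would carry something like this in its <head> (the date and time here are made up):

    <meta name="googlebot" content="unavailable_after: 23-Jul-2007 18:00:00 EST">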

Link: Official Google Webmaster Central Blog: New robots.txt feature and REP Meta Tags.
