Google changes how it digests robots.txt files
August 16, 2007 at 08:35 AM
An improvement really.
It'll now properly read relative URL paths (e.g. /archives).
And there's a new directive:
"Sitemap: http://www.example.com/sitemap.xml"
So you can tell Google where to find your sitemap. That's nice.
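For reference, a minimal robots.txt putting both pieces together might look something like this (the paths and sitemap URL are just placeholders):

User-agent: *
Disallow: /archives

Sitemap: http://www.example.com/sitemap.xml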
Also: news of a new meta tag that tells Googlebot to drop a page from its index after a certain date and time.
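If I'm reading the announcement right, the tag is called unavailable_after and goes in the page's head (the date here is just an example):

&lt;meta name="googlebot" content="unavailable_after: 31-Dec-2007 23:59:59 EST"&gt;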
Link: Official Google Webmaster Central Blog: New robots.txt feature and REP Meta Tags.