My robots.txt File Suddenly Restricts All Indexing

This post was published 12 years 10 months 18 days ago. A number of changes have been made to the site since then, so please contact me if anything is broken or seems wrong.

Apparently, the Blogger team (or a hacker [?]) has changed the robots.txt file on this blog to disallow crawling from any bot that obeys the robots exclusion standard. I have no idea why, as other blogs on Blogger (even others I run) continue to have perfectly fine exclusion rules. Oddly, the “Sitemap:” line has been removed, and a “Disallow: /” line has been added, which, as mentioned above, blocks any and all compliant robots (spiders, bots, whatever you refer to them as) from accessing the site. Despite emailing Blogger support late last night (when I discovered the problem) and posting on the Blogger Help Group, I have received no helpful response from anyone.
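To illustrate why a single "Disallow: /" line shuts out every compliant crawler, here is a minimal sketch using Python's standard-library robots.txt parser. The robots.txt contents and the blog URL are hypothetical stand-ins for what the post describes (the "Sitemap:" line removed, "Disallow: /" added):

```python
from urllib import robotparser

# Hypothetical robots.txt matching the broken state described above:
# no Sitemap line, and a blanket "Disallow: /" for all user agents.
BROKEN_ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(BROKEN_ROBOTS_TXT.splitlines())

# A compliant crawler asks can_fetch() before requesting any URL.
# With "Disallow: /", the answer is False for every path on the site.
print(rp.can_fetch("Googlebot", "http://example.blogspot.com/"))
print(rp.can_fetch("Googlebot", "http://example.blogspot.com/feeds/posts/default"))
```

Both calls print `False`: any bot that honors the robots exclusion standard will skip the entire site, which is exactly the behavior described here.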

Meanwhile, we’ll see if this post is even picked up by FeedBurner, Technorati, or Google BlogSearch. As far as I know, all three services obey the robots exclusion standard…

Update: Pinging FeedBurner wasn't a problem; the new post was picked up within seconds.


