An update to the technical guidelines section of Google’s Webmaster Guidelines specifically states that site owners should not block the destination URL of a Google Ad with the robots.txt file. This information was first reported by Barry Schwartz of Search Engine Roundtable.
The new text reads:
Make efforts to ensure that a robots.txt file does not block a destination URL for a Google Ad product. Adding such a block can disable or disadvantage the Ad.
It’s interesting that this change comes just days after Matt Cutts announced a change to the page layout algorithm that penalizes pages with too many ads above the fold. This is pure speculation on my part, but maybe Google is trying to be proactive in thwarting sites that try to hide their ads from Google with the robots.txt file.
For example, if a webmaster were unhappy with this new algorithm change and wanted to keep the top of their page full of ads, they might try to hide the ads from Google by blocking them in the robots.txt file so they can’t be crawled.
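As a purely hypothetical illustration (the path here is invented for the example), such a block in a site’s robots.txt might look something like this:

User-agent: Googlebot
Disallow: /ad-landing-pages/

That rule tells Googlebot not to crawl anything under /ad-landing-pages/, which would include any ad destination URL that happened to live in that directory.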
In doing so they may think they’re avoiding a penalty, but with this new Webmaster Guidelines change it appears they would be doing more harm than good, since they would effectively be disabling the ad altogether.
Or maybe I’m reading this all completely wrong. What do you think is the reason behind this recent change to Google’s Webmaster Guidelines? I’d love to hear your thoughts!
Editor’s Note: This post has been amended to include the source of the first report. According to Search Engine Roundtable, these guidelines have now been removed.