Google Makes Change To Webmaster Guidelines: Don’t Block Ads With Robots.txt

Google has updated the technical guidelines section of its Webmaster Guidelines to specifically state that site owners should not block the destination URL of a Google ad with the robots.txt file. This information was first reported by Barry Schwartz of Search Engine Roundtable.
The new text reads:
Make efforts to ensure that a robots.txt file does not block a destination URL for a Google Ad product. Adding such a block can disable or disadvantage the Ad.
It’s interesting that this change was made just days after Matt Cutts announced an update to the page layout algorithm that penalizes pages with too many ads above the fold. This is pure speculation on my part, but maybe Google is trying to be proactive in thwarting sites that try to hide the presence of ads with their robots.txt file.
For example, if a webmaster were unhappy with this new algorithm change and wanted to keep the top of their page full of ads, they might try to hide the ads from Google by blocking them in the robots.txt file so they can’t be crawled.
In doing so they may think they’re avoiding a penalty, but with this new Webmaster Guidelines change it appears they would be doing more harm than good, since they would effectively be disabling the ad altogether.
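To make that scenario concrete, here’s a minimal sketch of the kind of robots.txt rule such a webmaster might add. The /ads/ path is purely hypothetical; where a site’s ad destination URLs actually live depends on its setup:

    # Hypothetical rule blocking crawlers from a site's ad destination URLs
    User-agent: *
    Disallow: /ads/

Per the updated guideline, a rule like this risks disabling or disadvantaging the ads themselves, so the safer move is simply to leave ad destination URLs crawlable.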
Or maybe I’m reading this all completely wrong. What do you think is the reason behind this recent change to Google’s Webmaster Guidelines? I’d love to hear your thoughts!