Google announced on its Webmaster Central blog today that it has recently updated the Fetch as Google tool, which now gives users the ability to render a page exactly as Googlebot sees it.
How It Works
Before using Fetch as Google, you'll need to have added and verified your site in Webmaster Tools. Then, follow these steps:
- On the Webmaster Tools Home page, click the site you want.
- On the Dashboard, under Crawl, click Fetch as Google.
- In the text box, type the path to the page you want to check.
- In the dropdown list, select the type of fetch you want. To see what our web crawler Googlebot sees, select Web. To see what our mobile crawler for smartphones sees, select Mobile Smartphone. To see what our mobile crawler for feature phones sees, select Mobile cHTML (this is used mainly for Japanese web sites) or Mobile XHTML/WML.
- Click Fetch to have Googlebot fetch the path you entered, or click Fetch and Render to have Googlebot both fetch the path and render it as a webpage.
You can use this tool to fetch up to 500 URLs a week per Webmaster Tools account. When rendering a page, Googlebot will try to fetch all of the external files as well, such as images, CSS, and JavaScript files. These files are then used to render a preview image that lets you see your page as Googlebot sees it.
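To see what "fetching the external files" involves, here is a minimal sketch, using Python's standard html.parser, of how a renderer discovers the images, stylesheets, and scripts a page references. The sample HTML and its paths are hypothetical; this is an illustration of the idea, not Google's implementation.

```python
from html.parser import HTMLParser

class ResourceCollector(HTMLParser):
    """Collects URLs of external resources a renderer would need to fetch."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.resources.append(attrs["src"])          # image file
        elif tag == "script" and "src" in attrs:
            self.resources.append(attrs["src"])          # external JavaScript
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.resources.append(attrs["href"])         # external CSS

# Hypothetical page markup for illustration.
page = """<html><head>
<link rel="stylesheet" href="/styles/main.css">
<script src="/js/app.js"></script>
</head><body><img src="/images/logo.png"></body></html>"""

collector = ResourceCollector()
collector.feed(page)
print(collector.resources)  # ['/styles/main.css', '/js/app.js', '/images/logo.png']
```

A renderer would then fetch each of those URLs before drawing the preview, which is why blocked or broken resources change what the preview looks like.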
Practical Uses Of This Tool
Google suggests that this is useful for diagnosing a page's poor performance in search results, because it exposes crawling errors. If Google is not able to render the page as you intend Googlebot to see it, that could have a negative effect on your ranking in search results.
Google also suggests that this new feature is useful for identifying problematic pages in the event that your site has been hacked. For example, if your site is appearing in search results for popular spam terms when those terms don’t exist in your source code, then you can use the Fetch as Google tool to understand exactly what Google is seeing on your site.
Something like the above example can happen when the security of your site is compromised by a hacker. Hackers can cloak content on your site so that it is served only to Googlebot and never shown to normal users. Since the content appears normal to everyone else, the problem is difficult to diagnose without the Fetch as Google tool.
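The cloaking check above can be sketched in a few lines: compare the HTML served to a normal browser against the HTML served to Googlebot, and report any spam terms that appear only in the crawler's copy. The HTML strings and spam terms below are hypothetical; in practice the two versions would be fetched with different User-Agent headers.

```python
def cloaked_terms(browser_html, googlebot_html, terms):
    """Return terms present in the crawler's copy but missing from the browser's."""
    browser = browser_html.lower()
    crawler = googlebot_html.lower()
    return [t for t in terms if t.lower() in crawler and t.lower() not in browser]

# Hypothetical responses: the hacked server shows spam only to Googlebot.
browser_html = "<html><body><h1>Acme Widgets</h1></body></html>"
googlebot_html = "<html><body><h1>Acme Widgets</h1><p>cheap pills</p></body></html>"

print(cloaked_terms(browser_html, googlebot_html, ["cheap pills", "free money"]))
# ['cheap pills']
```

If the function returns a non-empty list, the crawler is seeing content your visitors never do, which is exactly the symptom Fetch and Render makes visible.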
Fetch as Google will not render anything blocked by robots.txt. If you are disallowing the crawling of some of your files, Google won't be able to show them to you in the rendered view. For more information, see this support article.