On Tuesday, May 27th, 2014, Google released a new tool in its webmaster tools arsenal. This new “fetch and render” tool allows webmasters, site owners, and web developers to get a visual representation of what their site looks like through the eyes of Google.
This tool is helpful because more and more websites use dynamic scripts and render code on the client side, meaning your browser and computer, not the servers on the internet, assemble some of the site’s information. That content is not fully put together and presented until your browser displays it. It is imperative that Google be able to “see” the final rendered website, because it can no longer rely on a site’s basic HTML code alone to get the full picture.
It has been common practice to block certain elements from being crawled by the search engines using commands in a robots.txt file (the file on the server that tells search engine bots not to visit certain files or directories), in order to save the bandwidth and resources those bots consume. As technology and computing power have evolved, Google has improved its ability to efficiently render and spider pages beyond the basic HTML text. It now renders full pages, including scripts and CSS, so it is no longer necessary to block these elements.
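For example, the older practice looked something like the robots.txt fragment below; the directory names here are purely illustrative, not taken from any real site. Rules like these once saved bandwidth, but today they prevent Google from fully rendering your pages:

```
# Old-style robots.txt that hides assets from crawlers.
# These Disallow rules now stop Google from rendering the page fully;
# removing them (or replacing them with Allow rules) lets Google
# fetch the scripts and stylesheets it needs.
User-agent: *
Disallow: /css/
Disallow: /js/
```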
So why should I care? As a business owner, you need to know whether you are giving Google everything it needs to include your site in its index, and whether your site is optimized for the best rankings and performance in the organic search results. If Google can’t render your site, it is unlikely to rank it as relevant for your keywords. Remember Flash? Sites built in Flash dropped out of Google’s index because the search bots weren’t able to render Flash and couldn’t “see” what was on those sites. If you are counting on business leads, sales, and traffic from consumers online, it is always a good idea to know whether your website is visible to the search engines.
So how can I do this? First you will need a Google Webmaster Tools account; if you do not have one, you should create one as soon as possible. On the left-hand side, under the Crawl option, you will see the “Fetch as Google” option.
After you click on that, you can choose between the old Fetch option and the new Fetch and Render option.
There you can enter a specific URL from your site, or leave the field blank to fetch the homepage. You can also have Google render both the desktop and mobile versions of your site. Once you select Fetch and Render, Google will go to your site and spider it according to how your robots.txt file is set up.
Once it is done processing, the tool will show you both the normal “fetching” and now the new “rendering” with a list of resources that were blocked by robots.txt when trying to render the page.
The new feature shows what your site looks like rendered the way Google sees it.
Additionally, you can see how Google would see your site on a mobile device.
Not only does Google now allow you to see what it sees, it also gives you a list of the resources that it couldn’t use to render your page for both desktop and mobile versions.
As the Google Webmaster Central blog suggests, all of this information is now available to help you make your site more search-engine friendly by not blocking items essential to rendering the page. It is also a good way for webmasters and developers to double-check a website, making sure it renders correctly for the search engine and that no resource is unintentionally blocked by robots.txt.
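If you would rather check your robots.txt rules in code alongside the tool, Python’s standard library can test a URL against them. This is a minimal sketch; the rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules that block a stylesheet directory
rules = """
User-agent: *
Disallow: /css/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A page Googlebot is allowed to crawl...
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
# ...but the stylesheet it needs to render that page is blocked
print(parser.can_fetch("Googlebot", "https://example.com/css/style.css"))  # False
```

If `can_fetch` returns False for a CSS or JavaScript file your pages depend on, that resource is exactly the kind of item that will show up in Fetch and Render’s blocked list.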
So, now it’s time to open up your coding application, or give your web developer a call, and get your site into compliance with what Google wants. In the long run, following the suggestions that Google sets can only benefit your site and your business.