7th Aug, 2015 / Sofiann McKerrell


If you have recently received a Google warning in your emails saying “Googlebot cannot access your JavaScript and/or CSS files”, you’re not alone.

Google has sent out similar notifications to other webmasters via the Search Console, reminding them once again that not allowing Googlebot to access those files may result in “suboptimal rankings”.

Google’s stance against blocking JavaScript and CSS has been clear since the recommendation was written into the Webmaster Guidelines last October. However, it was only recently that the company began sending mass notifications warning website administrators about it.

The full warning reads:

“Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.”

This sounds bad, but fortunately the email comes with instructions for diagnosing the issue, and for checking that you have fixed it, using the Fetch and Render tool within the Search Console.

An easy fix is to look through your site’s robots.txt file and remove any rules that block crawling of your JavaScript and CSS files.
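The exact directives vary from site to site, but rules along these lines (a common pattern on WordPress and similar setups; your own robots.txt may use different paths) are what typically trigger the warning:

```
# Rules like these prevent Googlebot from fetching scripts and stylesheets
User-agent: *
Disallow: /wp-includes/
Disallow: /*.js$
Disallow: /*.css$
```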
If this doesn’t do the trick and Googlebot is still being blocked, the Fetch and Render tool will provide further instructions on edits to be made to your robots.txt file. The robots.txt testing tool is also useful for detecting any other crawling issues.
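If you would rather keep your broader Disallow rules in place, one approach is to explicitly allow the asset types Googlebot needs. Google resolves conflicts in favour of the most specific matching rule, so a sketch like the following (adapt the patterns to your own file) can open up JavaScript and CSS without unblocking everything else:

```
# Explicitly allow scripts and stylesheets for Googlebot,
# even if other Disallow rules would otherwise cover them
User-agent: Googlebot
Allow: /*.js$
Allow: /*.css$
```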

Google has made its message loud and clear, so it’s worth checking whether or not your site is blocking Googlebot from accessing some content. That way you can deal with the issue and quite possibly boost your search performance in the process.