Posted by Cameron Francis on 31 Jul 2015 in News
Many webmasters using WordPress, Joomla, and other popular content management systems are worried about a warning from Google. The tech giant is currently sending out mass notifications to webmasters via its Search Console that Googlebot is unable to access their JavaScript and CSS files.
The warning says that Google's systems have recently detected an issue with these websites' homepages, one that affects how well Google's algorithms can render and index their content.
If the bot cannot access your JavaScript and CSS files because of restrictions in your robots.txt file, Google cannot fully understand how your website works. Google needs access to these files to confirm that your site renders properly; without that access, your site can end up with suboptimal rankings.
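For illustration, a rule as small as the following is enough to cause the problem; the /assets/ path here is purely hypothetical, standing in for wherever a site keeps its stylesheets and scripts:

```
# Hypothetical example: these directives tell every crawler,
# Googlebot included, to stay out of the folder holding the
# site's CSS and JavaScript files.
User-agent: *
Disallow: /assets/
```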
The warning about suboptimal rankings because of blocked JavaScript and CSS files is not new.
Google has previously warned webmasters against blocking their CSS and JavaScript files on several occasions.
The Fetch and Render tool in Webmaster Tools also alerts you whenever your CSS and JavaScript files are blocked.
Google now renders pages much as a user's browser would. If you block CSS or JavaScript, you are likely to change how those pages appear to Googlebot compared with what users actually see.
Google is now making sure its warning message gets across to webmasters and SEOs via email and the Search Console.
There is no need to be alarmed if you have received this Google Webmaster Tools notification. First, you are not alone. Many webmasters have been alerted that Googlebot cannot access or see their JavaScript or CSS files.
Second, this is not a penalty notification. It is simply a warning that if Google is unable to see your entire website, the consequence may be poorer rankings for your site.
The cause of the problem is simple: your CMS's default robots.txt rules block the includes folder (wp-includes on WordPress), which contains the site's CSS and JavaScript files.
A recent discussion on Twitter noted that many WordPress sites that received this notification block their include files by default.
While the consequences of this default setting may sound dire, the fix is quite easy to implement.
If you want to address the issue on your own, the email notification comes with a list of guidelines for fixing the problem, and following those instructions is a good first step. Alongside them, use the Fetch and Render tool within Google Webmaster Tools for an in-depth diagnosis: it shows you exactly what Google can and cannot see on your pages.
Editing your website’s robots.txt file
Assuming you are comfortable editing the robots.txt file, look for lines like the following; on a default WordPress install of the time they typically appear as shown below, though your file may differ:
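```
# Typical directives in a default WordPress robots.txt of the era;
# wp-includes holds the core CSS and JavaScript files, and wp-admin
# is often blocked alongside it.
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-admin/
```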
Lines like these mean that your robots.txt file is blocking Googlebot's access to your CSS and JavaScript files.
The fix is to remove these lines, since they prevent Googlebot from rendering your site the way human users see it.
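If you would rather leave the existing rules intact, another widely shared fix at the time was to add explicit Allow directives for Googlebot; treat this as a sketch rather than Google's verbatim recommendation:

```
# Sketch of the alternative fix: explicitly allow Googlebot to
# fetch stylesheets and scripts. Googlebot supports the * wildcard
# and the $ end-of-URL anchor in these patterns.
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

Because a crawler follows only the group that most specifically matches its user agent, this Googlebot-specific group supersedes the generic User-agent: * rules entirely.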
Whichever route you take, the next step is to run your site through Google's Fetch and Render tool. This will help you ascertain whether the problem has been fixed.
By following the steps outlined above, you should be able to quickly unblock the CSS and JavaScript files in your include folder.
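If you want an extra sanity check, you can also test the updated file locally. The sketch below uses Python's standard urllib.robotparser module, with placeholder URLs you would swap for your own; note that this parser does simple prefix matching and does not understand Google's wildcard extensions, so it is most useful for confirming that a plain Disallow line is gone:

```python
from urllib.robotparser import RobotFileParser

# Placeholder URLs: substitute your own domain and a real CSS or JS asset.
ROBOTS_URL = "https://www.example.com/robots.txt"
ASSET_URL = "https://www.example.com/wp-includes/css/style.css"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetch and parse the live robots.txt

# can_fetch() reports whether the named user agent may crawl the URL.
if parser.can_fetch("Googlebot", ASSET_URL):
    print("Googlebot can reach the asset; the block appears to be lifted.")
else:
    print("Googlebot is still blocked; re-check your robots.txt rules.")
```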
But if, after deleting the lines, Googlebot is still unable to access your files, you should seek the assistance of a web development company. Hiring a professional web developer may seem expensive at the outset, but it is a great way to save costs in the long run.
A good web development company will do more than solve the current problem; it can also help improve your online visibility and brand awareness among your target audience, which makes it an excellent investment rather than an expense.