New Googlebot cannot access CSS and JS files in Google Search Console
Hello,
In July, this article was published:
http://www.thesempost.com/new-google-search-consol...
After publication, this note was added to it:
Added: It looks like many are getting warnings from having “Disallow: /wp-content/plugins/” which can be blocked with some WordPress setups. If you use Yoast SEO, you can find the robots.txt in its new location here: SEO / Tools / Files.
But as we know, it is best practice to keep Disallow: /wp-content/plugins/ in robots.txt.
After some research, a simple solution turned up: add the following to your robots.txt and leave the plugins Disallow rule as it is:
User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
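To answer the common follow-up question about where these lines go: they form their own user-agent group, so they are typically appended as a separate block after your existing rules. The example below is a sketch of how a typical WordPress robots.txt might look afterwards; note that, per the robots exclusion standard, a crawler follows only the most specific user-agent group that matches it, so once a Googlebot group exists, Googlebot ignores the rules in the * group.

```
User-agent: *
Disallow: /wp-content/plugins/

# Separate group added for Googlebot. Because Googlebot now matches
# this group, it no longer reads the rules in the * group above.
User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
```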
You will see in Search Console that the errors concerning CSS and JS disappear one by one (depending on how fast Google recrawls your pages).
For an immediate check, use the robots.txt tester in Search Console, where you should see no errors or warnings...
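If you want to see why patterns like /*.js* cover plugin assets, here is a small Python sketch (not Google's actual implementation) of the two wildcard operators Googlebot understands in robots.txt paths: * matches any sequence of characters, and a trailing $ anchors the pattern to the end of the path.

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Check whether a robots.txt path pattern matches a URL path.

    Sketch of Googlebot-style wildcard matching: '*' matches any
    run of characters, a trailing '$' anchors to the end of the path.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn each '*' into '.*'.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    regex = "^" + regex + ("$" if anchored else "")
    return re.match(regex, path) is not None

# A plugin script sits under the disallowed directory, but it is also
# matched by the broad Allow pattern:
print(robots_pattern_matches("/*.js*", "/wp-content/plugins/slider/script.js"))  # True
print(robots_pattern_matches("/wp-content/plugins/", "/wp-content/plugins/slider/script.js"))  # True
```

The hypothetical plugin path above is only illustrative; the point is that /*.js* matches any path containing ".js", including files inside /wp-content/plugins/.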
I hope this helps...
Grtz,
Bert
Recent Comments
Hi Bert, to which line should I add this? Just after all the existing entries?
I'll send you mine in a PM...
Thanks!