So let's presume you have managed to get Googlebot to fetch your page successfully. Awesome. Now you have a new tool to use: Fetch and Render asks Googlebot not only to grab your page, but also to render the HTML it retrieves, so you can see exactly what it sees from its crawling efforts. Once the fetch has completed (fully or partially), you can click on the result to have Googlebot render an image of what it fetched.

If all is well, Google will show you a picture of your website essentially as you would see it on your own computer, along with a list of any errors or other problems it encountered along the way. Hopefully your screen won't look like this (forgive the screen block to protect the domains of the... well, not so innocent):

If you see a screen like this, and the rendered image of your website looks messed up (or worse, nothing at all like your actual site), then Googlebot has given you two things: an indication of what is wrong with how it perceives your site, and (see the red box) a heads-up about where to look for answers as to why it isn't seeing your site the way you want it to. Fixing the problems then becomes a matter of troubleshooting the reason codes listed in the Fetch and Render results.
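One of the most common causes of "Denied" resources in those results is a robots.txt rule that keeps Googlebot away from the CSS and JavaScript files it needs to render the page. As a rough sketch (the paths here are hypothetical; adapt them to whatever your own robots.txt actually blocks), explicitly allowing those assets might look something like this:

```
User-agent: Googlebot
Allow: /*.css
Allow: /*.js
Disallow: /private/
```

Google supports the `*` wildcard in these patterns and treats the more specific matching rule as the winner, so the Allow lines carve the style and script files out of a broader Disallow. After editing robots.txt, re-run Fetch and Render to confirm the Denied items have cleared.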



Join the Discussion
mijareze Premium
I would like to have seen what to do if you cannot crawl and fetch because your site has blocked the bots. My site has not been verified and is blocked due to JavaScript and CSS, I believe. I've had a heck of a time with this situation, but I am not very tech savvy. Any help will be much appreciated!
Reply
bjdluna Premium Plus
What can you even do with a situation like that? Are you certain it is blocked by JavaScript and CSS? There's got to be a way to fix that.
Reply
mijareze Premium
I added filters to my websites to filter out the spam bots in Analytics. It worked great, but now that several months have passed I've noticed they are back. There is a website that offers the service for free; you have to keep it updated by blocking the new URLs belonging to the spam bots.
Ed
Reply
TopAchiever Premium
I would've said this is an incomplete tutorial. Surely you can suggest some solution for the Denied items?
Reply
Quasar_wpg Premium
Now I just need to figure out how to fix the things it doesn't see. I have some scripts and images it says robots.txt denied. Do these things matter if the rendered page looks alright?
Reply
onefineham Premium
That would depend entirely if it is important (to you) that Google sees and/or indexes those items which are missing or blocked.
Reply
Quasar_wpg Premium
I disabled my social plugins and most of the script errors went away, all but two. I guess the extra button and widget scripts don't matter too much to rankings? The two that were left related to jQuery somehow.
Reply
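For anyone wondering whether a particular script or image really is blocked, you can test your robots.txt rules locally before re-running Fetch and Render. Here is a minimal sketch using Python's standard urllib.robotparser; the rules and file paths below are made-up examples (typical WordPress-style paths), not taken from any real site:

```python
from urllib import robotparser

# Hypothetical robots.txt rules -- substitute the contents of your own file.
# Note: Python's parser applies the first matching rule, so an Allow line
# that carves out an exception should come before the broader Disallow.
rules = """\
User-agent: *
Allow: /wp-includes/js/
Disallow: /wp-includes/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A jQuery script under the allowed path is fetchable by Googlebot...
print(rp.can_fetch("Googlebot", "/wp-includes/js/jquery.js"))

# ...but everything else under /wp-includes/ is denied.
print(rp.can_fetch("Googlebot", "/wp-includes/css/admin.css"))
```

If a URL that Fetch and Render flagged comes back blocked here, you know the fix lives in robots.txt rather than in the page itself. (Be aware that Google's own matcher uses longest-match precedence, which can differ from Python's first-match behavior on more complex rule sets.)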
DoubleTap Premium
Thank ya' Barry... now it's just a matter of understanding the codes and knowing what to do with 'em! (-;
Reply
This new tool looks promising. Google is improving day by day.
Reply
bjdluna Premium Plus
I just don't like the fact that you kind of have to guess at what's wrong and try something then that doesn't work, etc. No wonder I'm scared! I'm scaring myself!
Barbara
Reply