Where To Test The File

There are numerous places where you can test your robots.txt file, but Google Webmaster Tools is as good a place as any; just bear in mind that it only reports results for Google's own user-agents.

It's important to run a test to check that the file isn't accidentally blocking Googlebot from any files or directories on your site. This is what to do:

  1. On the Webmaster Tools Home page, click the website you want.
  2. Go to Crawl, then click Blocked URLs.
  3. If it's not already selected, click the Test robots.txt tab.
  4. Copy the content of your robots.txt file, and paste it into the first box.
  5. In the URLs box, enter the URLs on your site that you want to test the file against.
  6. In the User-agents list, select the user-agents you want.
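
The same kind of check can be run locally before you ever open Webmaster Tools. Below is a quick sanity test using Python's standard urllib.robotparser module; the rules and the example.com URLs are placeholders, not rules for any particular site:

```python
# Verify locally that a robots.txt draft isn't blocking Googlebot
# from URLs it should be able to crawl. The rules below are a
# generic illustration only.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /tmp/
"""

parser = RobotFileParser()
# parse() takes the file's lines, so no network fetch is needed.
parser.parse(robots_txt.splitlines())

# Ask whether Googlebot may crawl specific URLs on the site.
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/tmp/cache"))   # False
```

Because a specific `User-agent: Googlebot` group exists, Googlebot follows that group rather than the `*` rules, which mirrors how Google itself interprets the file.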

Note that the tester does not change your live robots.txt file, so any edits you make here will not be saved. If you wish to keep them, copy the edited contents and paste them into the robots.txt file on your server.

You can also generate a robots.txt file here.
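
For reference, a typical generated robots.txt is only a few lines long. The directives below are a generic illustration (with a placeholder example.com sitemap URL), not rules for any particular site:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Whatever a generator produces, it's worth running the result through the tester above before uploading it to your site's root.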

You should now be fully organized with a useful and effective robots.txt file, and whenever the need arises for site updates or changes, you'll know exactly what to do!

Thanks for joining me on this robotic tour!


