Where To Test The File
There are numerous places where you can test your robots.txt file, but Google Webmaster Tools is as good a place as any, although it only reports results for Google's user-agents.
It's important to run a test to make sure the file isn't accidentally blocking Googlebot from any files or directories on your site. Here's what to do:
- On the Webmaster Tools Home page, click the website you want.
- Go to Crawl, then click Blocked URLs.
- If it's not already selected, click the Test robots.txt tab.
- Copy the content of your robots.txt file, and paste it into the first box.
- In the URLs box, enter the URLs of the pages you want to test against the file.
- In the User-agents list, select the user-agents you want.
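The same kind of check can be run locally with Python's standard-library robots.txt parser. This is a minimal sketch, not the Webmaster Tools tool itself; the robots.txt rules and the example.com URLs below are illustrative assumptions:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content to test (replace with your own file's rules)
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Check whether specific URLs may be fetched by a given user-agent
for agent, url in [
    ("Googlebot", "https://example.com/private/page.html"),
    ("Googlebot", "https://example.com/tmp/cache.html"),
    ("Googlebot", "https://example.com/index.html"),
]:
    verdict = "allowed" if parser.can_fetch(agent, url) else "blocked"
    print(f"{agent} -> {url}: {verdict}")
```

Note that a specific `User-agent: Googlebot` group overrides the `User-agent: *` group for Googlebot, so in this sketch Googlebot is blocked only from `/tmp/`, while other crawlers are blocked from `/private/`.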
Note that the tool does not save anything to your actual robots.txt file; any changes you make in the tester are temporary. If you wish to keep them, copy the contents and paste them into the robots.txt file on your server.
You can also generate a robots.txt file here.
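If you are starting from scratch, a minimal robots.txt might look like this; the sitemap URL and disallowed paths are placeholders to adapt to your own site:

```
# Apply to all crawlers
User-agent: *
# Keep crawlers out of admin and temporary areas (example paths)
Disallow: /admin/
Disallow: /tmp/

# Point crawlers at your sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of your domain (e.g. example.com/robots.txt) to be honored by crawlers.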
You should now be fully organized with a useful and effective robots.txt file, and whenever the need arises for site updates or changes, you'll know exactly what to do!
Thanks for joining me on this robotic tour!