When and How To Use the robots.txt File
If you want the search engines to index everything on your site, you don't need this file at all. You only need a robots.txt file if you want to stop search engines from crawling content that you don't want indexed.
A basic understanding of what this file does is all you need at this stage, so we'll cover just a few of the more important scenarios that you may encounter or wish to put in place.
The Job of robots.txt
This important file tells search engine robots which pages and posts on your website they are allowed to crawl.
As automated search engine robots crawl the web, before they access your site they first check whether a robots.txt file exists that asks them not to access certain pages.
If the file is missing, or you leave it empty, the bots will crawl your website without restriction. On most sites this is exactly what you want, but there are times when you may need to keep crawlers out of some areas of the server.
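As an illustration, a minimal robots.txt (placed at the root of the site, so it is reachable at /robots.txt) that asks all bots to stay out of one directory might look like the sketch below. The /private/ path is just a hypothetical example, not something your site necessarily has:

```
# Rules for all crawlers
User-agent: *

# Ask compliant bots not to crawl anything under /private/
Disallow: /private/
```

`User-agent: *` applies the rules that follow to every crawler, and each `Disallow` line names a path prefix that well-behaved bots should skip. An empty `Disallow:` line, or no robots.txt file at all, permits everything. Note that this is a polite request rather than real access control: the file cannot stop a bot that chooses to ignore it.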
If you notice problems with bad links, or blocked URLs that Google hasn't been able to crawl, go to your Google Webmaster Tools account and check Blocked URLs under the Crawl section.