Let’s see some simple examples:
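For instance, a minimal robots.txt that either gives every crawler full access or blocks every crawler site-wide (these are two alternative files, not one):

```
# File 1: allow every crawler full access (an empty Disallow blocks nothing)
User-agent: *
Disallow:

# File 2: block every crawler from the entire site
User-agent: *
Disallow: /
```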

Or, you can block any URL that starts with a given string (path):
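For example, to block every URL whose path begins with /private/ or /tmp (hypothetical paths, shown only for illustration):

```
User-agent: *
# Blocks /private/, /private/photos.html, etc.
Disallow: /private/
# Blocks /tmp, /tmp/, /tmp-old.html, etc.
Disallow: /tmp
```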

The “Allow” directive is not part of the original standard, but it is now supported by all major search engines. You can use it to specify exceptions to a disallow rule:
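For example, to block an entire directory while still allowing crawlers to reach one file inside it (the directory and file names are placeholders):

```
User-agent: *
# Block the whole directory...
Disallow: /images/
# ...except this one file
Allow: /images/logo.png
```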

The aforementioned wildcard operator (*) is supported by all major search engines and is useful for blocking pages when part of the path is unknown or contains a dynamic variable:
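For example, to block session-tracked URLs regardless of which page they appear on (the parameter name sessionid is hypothetical):

```
User-agent: *
# Blocks any URL containing "sessionid=", e.g. /shop/cart?sessionid=123
Disallow: /*sessionid=
# Blocks the /print/ subfolder under any category
Disallow: /*/print/
```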

Also, if your site has any sitemaps, it’s always a good idea to include the appropriate sitemap directive(s) in the robots.txt file to specify the location of your sitemap(s):
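For example (replace the domain and file names with your own; the Sitemap directive takes a full absolute URL and can appear multiple times):

```
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-images.xml
```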

Once you have created or edited your robots.txt file using any plain text editor, you’ll have to upload it to the root directory of your website, so that it is reachable at yourdomain.com/robots.txt.



Join the Discussion
TommyVTE Premium
great training need to sit down for this to see how and what for my site, thanks
smartketeer Premium
Thanks Tommy!
suzzziq Premium
This is totally Greek to me!!! I get the basic premise, but unsure how to implement. I'm flagging it for future reference, in case I ever get brave enough to try this! Thanks so much for the training:)
Blessings:)
Suzi
smartketeer Premium
Thanks for your time and your feedback Suzi!
FKelso Premium
Gee, where did you learn so much stuff?

Guess I have to go first and see if I have a robots.txt file.
smartketeer Premium
You have Fran ...

The question is: what does it contain?

Gee ... That's a LOOOOOOOOOONG story :)
FKelso Premium
You always give me a chuckle. Thanks.
lesabre Premium
Thanks again, got to save this and come back to it. Lot of information that can be very helpful to me. Got to answer all those e-mails first. All the best.
smartketeer Premium
Thanks Michael!

All the best!
dowj01 Premium
Your training certainly helps make a subject which as a newbie seemed beyond me, very clear. Thank you.
Justin
smartketeer Premium
Thanks Justin!