Robots.txt, also known as the Robots Exclusion Protocol (REP), is a text file created to instruct search engine robots how to crawl a website and index the pages they find. It is therefore an important tool in SEO.

For the text file to work effectively, it must be placed in the top-level directory of a server, for example: www.specimen.com/robots.txt
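
For illustration, a minimal robots.txt might look like the sketch below; the /private/ path and sitemap URL are placeholder assumptions, not values from this article:

  # Apply the rules below to all crawlers
  User-agent: *
  # Block crawling of a hypothetical /private/ directory
  Disallow: /private/
  # Point crawlers to the site's XML sitemap
  Sitemap: https://www.specimen.com/sitemap.xml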

Applications of REPs

REP Tags

REP tags attached to a URI control how search engines index and serve it. REP tags can be delivered in the META elements of (X)HTML content or in the X-Robots-Tag HTTP headers of other web objects, and directives in an X-Robots-Tag header can override conflicting directives present in META elements.
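
As an example, the same noindex directive can be delivered either way; the snippets below are a hedged sketch using standard REP tag syntax:

  <!-- In the <head> of an (X)HTML page: do not index the page or follow its links -->
  <meta name="robots" content="noindex, nofollow">

  # In the HTTP response headers of a non-HTML object, such as a PDF:
  X-Robots-Tag: noindex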

Microformats

Microformats can override page-level settings for individual HTML elements. Although robots.txt lacks an index directive, such element-level and header-level REP tags make it possible to set indexing directives for clusters of URIs.
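
For example, the rel="nofollow" microformat on a single link overrides a page-level follow directive for that element alone; the URL below is a placeholder:

  <!-- Page-level directive: index the page and follow its links -->
  <meta name="robots" content="index, follow">

  <!-- Element-level override: do not follow this particular link -->
  <a href="http://www.specimen.com/untrusted" rel="nofollow">Untrusted link</a>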

Image source: www.baonc.com

Pattern Matching

Google and Bing both allow two expressions to be used to identify pages or subfolders that an SEO does not want crawled:

  1. * is a wildcard representing any sequence of characters
  2. $ is used to match the end of a URL
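
A hedged sketch of how these two expressions can be combined in robots.txt; the paths are placeholders:

  User-agent: *
  # * matches any sequence of characters,
  # so this blocks /private-a/, /private-b/, and so on
  Disallow: /private*/
  # $ anchors the match to the end of the URL,
  # so this blocks any URL ending in .pdf
  Disallow: /*.pdf$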

Robots.txt is Public

The robots.txt file is publicly available, and anyone can see which sections of a site have been blocked. Therefore, if an SEO company in Chicago has private information that it does not want shared, it should use a more secure approach, such as password-protecting the relevant directory.
