Domo Arigato, Mr. Roboto

A robots.txt file provides restrictions to search engine robots (known as “bots”) that crawl the web. These bots are automated, and before they access a site’s pages, they check whether a robots.txt file exists that prevents them from accessing certain pages.

If your website has files that you do not want search engines to index, you should list them in a robots.txt file. This file must be located in the root of the domain; a robots.txt file placed in a subdirectory is not valid.

As a Search Engine Optimization Specialist, I create robots.txt files for my clients who are enrolled in DDA’s SureThing Optimization Program. The file can be created with any text editor. The simplest robots.txt file uses two rules:

User-Agent: the robot the following rule applies to
Disallow: the pages you want to block
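
Putting the two rules together, a minimal robots.txt file might look like this (the directory names here are just hypothetical examples; a `*` in the User-Agent line means the rule applies to all bots):

```
User-Agent: *
Disallow: /private/
Disallow: /drafts/
```

This tells every crawler not to access anything under the /private/ or /drafts/ directories, while leaving the rest of the site open to indexing.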