Term: "Robots.txt"

Also known as the robots exclusion protocol, Robots.txt is a plain-text file stored in a site's root directory that tells a search engine Crawler which pages and sub-folders it should not crawl. Compliance is voluntary: a Crawler may ignore the directives, and a disallowed page can still appear in the search engine Index if other sites link to it. Robots.txt is an alternative to a Meta Robots Tag or password protection.
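As a sketch of how these directives are read in practice, the snippet below parses a minimal, hypothetical robots.txt (the `/private/` path is an invented example) using Python's standard-library `urllib.robotparser` and checks which URLs a well-behaved crawler would be allowed to fetch:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block all crawlers from /private/
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler consults these rules before fetching a URL
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
```

Note that this check is advisory only; nothing in the protocol prevents a non-compliant crawler from fetching `/private/` anyway.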