Crawler

A crawler, also known as a spider or robot, is a search engine program that literally “crawls” the web: it collects data, follows links, makes copies of new and updated pages, and stores millions of URLs in the search engine’s index. This allows the search engine to respond with faster, more relevant, and more timely listings.
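The fetch-links-queue cycle described above can be sketched in a few lines. The example below is a simplified illustration, not a production crawler: it walks a small hypothetical in-memory “web” (a dict of made-up example.com URLs) instead of making real HTTP requests, and it skips politeness rules such as robots.txt and rate limiting.

```python
from collections import deque
from html.parser import HTMLParser

# A tiny in-memory "web": URL -> HTML body (hypothetical example data).
PAGES = {
    "http://example.com/": '<a href="http://example.com/a">A</a> <a href="http://example.com/b">B</a>',
    "http://example.com/a": '<a href="http://example.com/b">B</a>',
    "http://example.com/b": '<a href="http://example.com/">home</a>',
}

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed):
    """Breadth-first crawl: fetch each page, store a copy, follow its links."""
    index = {}                    # URL -> page copy (the search engine's index)
    frontier = deque([seed])      # URLs waiting to be visited
    seen = {seed}                 # avoid re-crawling the same URL
    while frontier:
        url = frontier.popleft()
        html = PAGES.get(url)     # a real crawler would do an HTTP GET here
        if html is None:
            continue
        index[url] = html         # store a copy of the page
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:  # follow newly discovered links
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return index

index = crawl("http://example.com/")
```

Starting from the single seed URL, the crawler discovers and stores all three pages by following links, which is the core behavior the definition describes.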