Monday 23 May 2011

What is a Crawler?

A crawler is a program that search engines use to visit websites and blogs, read their pages and other content updated by site owners, and create entries for the search engine's index. Every major search engine on the Web runs such a program, which is also known as a "spider." Crawlers can be directed at entire sites or at specific pages to fetch and index. They apparently got the name because they crawl through a site one page at a time, following links from page to page until the whole site has been read. Each search engine runs its own separate crawler to index websites.
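To make the idea concrete, here is a minimal sketch of that page-by-page crawl in Python, using only the standard library. This is not how any real search engine's crawler works; the names (LinkExtractor, crawl, max_pages) and the example URL are illustrative, and a production crawler would also respect robots.txt, rate-limit its requests, and store page text in a real index.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl from seed_url, staying on the same site."""
    seen = set()
    queue = deque([seed_url])
    domain = urlparse(seed_url).netloc

    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load

        # A real crawler would add the page text to the index here.
        print("indexed:", url)

        # Follow the links on this page, just as the paragraph describes.
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == domain:  # stay on this site
                queue.append(absolute)

    return seen


if __name__ == "__main__":
    crawl("https://example.com")

Running it prints an "indexed:" line for each page it visits, which mirrors the behavior described above: start from one page, read it, and follow its links until the site (or the page budget) is exhausted.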
