What Are Web Robots and robots.txt?

Web Robots (also known as Web Wanderers, Crawlers, or Spiders) are programs that traverse the Web automatically. Search engines such as Google and Bing use them to index web content, spammers use them to scan for email addresses, and they have many other uses.

Web site owners use the robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol.
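As a sketch of how this works in practice, Python's standard-library `urllib.robotparser` can parse a robots.txt file and answer whether a given URL may be crawled. The rules and URLs below are hypothetical examples, not taken from any real site:

```python
import urllib.robotparser

# Parse a hypothetical robots.txt directly, instead of fetching one
# over the network with set_url()/read().
rules = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved robot checks can_fetch() before requesting a URL.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

Note that robots.txt is purely advisory: well-behaved crawlers honor it, but nothing technically prevents a robot from ignoring it.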
