The ethicality of web crawlers

Yang Sun, Isaac G. Councill, C. Lee Giles

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

18 Scopus citations

Abstract

Search engines largely rely on web crawlers to collect information from the web. This has led to an enormous amount of web traffic generated by crawlers alone. To minimize the negative impact of this traffic on websites, crawler behavior may be regulated at individual web servers by implementing the Robots Exclusion Protocol in a file called "robots.txt". Although not an official standard, the Robots Exclusion Protocol has been adopted to a greater or lesser extent by nearly all commercial search engines and popular crawlers. As many website administrators and policy makers have come to rely on the informal contract set forth by the Robots Exclusion Protocol, the degree to which web crawlers respect robots.txt policies has become an important issue of computer ethics. In this research, we investigate and define rules to measure crawler ethics, referring to the extent to which web crawlers respect the regulations set forth in robots.txt configuration files. We test the behavior of web crawlers in terms of ethics by deploying a crawler honeypot: a set of websites where each site is configured with a distinct regulation specification using the Robots Exclusion Protocol in order to capture specific behaviors of web crawlers. We propose a vector space model to represent crawler behavior and a set of models to measure the ethics of web crawlers based on their behavior. The results show that ethicality scores vary significantly among crawlers. Most commercial web crawlers receive fairly low ethicality violation scores, meaning that most of their behavior is ethical; however, many commercial crawlers still consistently violate or misinterpret certain robots.txt rules.
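The robots.txt compliance checks that an ethical crawler should perform can be illustrated with Python's standard-library `urllib.robotparser`. The rules and user-agent names below are hypothetical examples, not the honeypot configurations used in the paper:

```python
from urllib import robotparser

# Hypothetical robots.txt, in the style of the honeypot site configurations;
# the user-agent names and paths are illustrative only.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10

User-agent: BadBot
Disallow: /
"""

# Parse the rules once, then consult them before every fetch.
rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# An ethical crawler only requests URLs the rules permit for its user agent.
print(rp.can_fetch("GoodBot", "http://example.com/public/page.html"))   # True
print(rp.can_fetch("GoodBot", "http://example.com/private/data.html"))  # False

# "BadBot" is disallowed from the entire site by its own rule group.
print(rp.can_fetch("BadBot", "http://example.com/public/page.html"))    # False
```

A crawler that fetches a URL for which `can_fetch` returns `False` would register a violation in the kind of behavioral measurement the paper describes.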

Original language: English (US)
Title of host publication: 2010 IEEE/WIC/ACM International Conference on Web Intelligence, WI 2010
Pages: 668-675
Number of pages: 8
DOIs: 10.1109/WI-IAT.2010.316
State: Published - Dec 13 2010
Event: 2010 IEEE/WIC/ACM International Conference on Web Intelligence, WI 2010 - Toronto, ON, Canada
Duration: Aug 31 2010 - Sep 3 2010

Publication series

Name: Proceedings - 2010 IEEE/WIC/ACM International Conference on Web Intelligence, WI 2010
Volume: 1

Other

Other: 2010 IEEE/WIC/ACM International Conference on Web Intelligence, WI 2010
Country: Canada
City: Toronto, ON
Period: 8/31/10 - 9/3/10

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Networks and Communications
  • Software


Cite this

    Sun, Y., Councill, I. G., & Giles, C. L. (2010). The ethicality of web crawlers. In 2010 IEEE/WIC/ACM International Conference on Web Intelligence, WI 2010 (pp. 668-675). [5616518] (Proceedings - 2010 IEEE/WIC/ACM International Conference on Web Intelligence, WI 2010; Vol. 1). https://doi.org/10.1109/WI-IAT.2010.316