Automated evaluation of search engine performance via implicit user feedback

Himanshu Sharma, Bernard J. Jansen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

19 Citations (Scopus)

Abstract

Measuring the information retrieval effectiveness of Web search engines can be expensive if human relevance judgments are required to evaluate search results. Using implicit user feedback for search engine evaluation provides a cost- and time-effective way of addressing this problem. Web search engines can use human evaluation of search results without the expense of human evaluators. An additional advantage of this approach is the availability of real-time data regarding system performance. We capture user relevance judgment actions, such as print, save, and bookmark, sending these actions and the corresponding document identifiers to a central server via a client application. We use this implicit feedback to calculate performance metrics, such as precision. We can calculate an overall system performance metric based on a collection of weighted metrics.
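The workflow the abstract describes — treating client-side actions such as print, save, and bookmark as implicit relevance judgments, deriving precision from them, and combining weighted metrics into one overall score — can be sketched roughly as below. All names, types, and weights here are illustrative assumptions, not the paper's actual implementation:

```python
# Sketch of implicit-feedback evaluation (illustrative names, not the
# authors' implementation): a client application reports user actions
# and document identifiers; actions like print/save/bookmark are
# treated as implicit relevance judgments.

from dataclasses import dataclass

# Actions taken as implicit relevance judgments.
RELEVANCE_ACTIONS = {"print", "save", "bookmark"}

@dataclass
class Event:
    query_id: str
    doc_id: str
    action: str  # e.g. "view", "print", "save", "bookmark"

def precision(events, results_shown):
    """Fraction of shown results that received an implicit relevance action."""
    if not results_shown:
        return 0.0
    relevant = {e.doc_id for e in events if e.action in RELEVANCE_ACTIONS}
    return len(relevant & set(results_shown)) / len(results_shown)

def overall_score(metrics, weights):
    """Weighted combination of individual metrics into one system score."""
    total = sum(weights[name] for name in metrics)
    return sum(weights[name] * metrics[name] for name in metrics) / total

events = [
    Event("q1", "d1", "print"),
    Event("q1", "d2", "view"),      # a view alone is not counted as relevant
    Event("q1", "d3", "bookmark"),
]
p = precision(events, ["d1", "d2", "d3", "d4"])  # 2 of 4 shown results judged relevant
score = overall_score({"precision": p}, {"precision": 1.0})
print(p, score)  # 0.5 0.5
```

In a deployed system the `Event` records would stream to the central server in real time, which is what makes the continuous performance monitoring the abstract mentions possible.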

Original language: English (US)
Title of host publication: SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
Pages: 649-650
Number of pages: 2
DOIs: https://doi.org/10.1145/1076034.1076172
ISBN (Print): 1595930345, 9781595930347
State: Published - Dec 1 2005
Event: 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2005 - Salvador, Brazil
Duration: Aug 15 2005 - Aug 19 2005

Publication series

Name: SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval

Other

Other: 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2005
Country: Brazil
City: Salvador
Period: 8/15/05 - 8/19/05

Fingerprint

  • Search engines
  • Feedback
  • Information retrieval
  • Servers
  • Availability
  • Costs

All Science Journal Classification (ASJC) codes

  • Information Systems

Cite this

Sharma, H., & Jansen, B. J. (2005). Automated evaluation of search engine performance via implicit user feedback. In SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 649-650). (SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval). https://doi.org/10.1145/1076034.1076172
@inproceedings{71f5d3784dd14b5a9b459cbc313e37d9,
title = "Automated evaluation of search engine performance via implicit user feedback",
abstract = "Measuring the information retrieval effectiveness of Web search engines can be expensive if human relevance judgments are required to evaluate search results. Using implicit user feedback for search engine evaluation provides a cost- and time-effective way of addressing this problem. Web search engines can use human evaluation of search results without the expense of human evaluators. An additional advantage of this approach is the availability of real-time data regarding system performance. We capture user relevance judgment actions, such as print, save, and bookmark, sending these actions and the corresponding document identifiers to a central server via a client application. We use this implicit feedback to calculate performance metrics, such as precision. We can calculate an overall system performance metric based on a collection of weighted metrics.",
author = "Himanshu Sharma and Jansen, {Bernard J.}",
year = "2005",
month = "12",
day = "1",
doi = "10.1145/1076034.1076172",
language = "English (US)",
isbn = "1595930345",
series = "SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval",
pages = "649--650",
booktitle = "SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval",
}
