Rating mechanisms for sustainability of crowdsourcing platforms

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Crowdsourcing leverages the diverse skill sets of large collections of individual contributors to solve problems and execute projects, where contributors may vary significantly in experience, expertise, and interest in completing tasks. Hence, to ensure the satisfaction of their task requesters, most existing crowdsourcing platforms focus primarily on supervising contributors' behavior. This lopsided approach to supervision negatively impacts contributor engagement and platform sustainability. In this paper, we introduce rating mechanisms to evaluate requesters' behavior, so that the health and sustainability of crowdsourcing platforms can be improved. We build a game-theoretic model to systematically account for the different goals of requesters, contributors, and the platform, and for their interactions. On the basis of this model, we focus on a specific application in which we aim to design a rating policy that incentivizes requesters to engage less-experienced contributors. Given the hardness of the problem, we develop a time-efficient heuristic algorithm with a theoretical bound analysis. Finally, we conduct a user study on Amazon Mechanical Turk (MTurk) to validate the central hypothesis of the model, and we provide a simulation based on 3 million task records extracted from MTurk demonstrating that our rating policy can appreciably motivate requesters to hire less-experienced contributors.
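The following minimal sketch illustrates the kind of incentive the abstract describes: a requester-rating update that grows faster when the hired contributor is less experienced. It is an assumption for illustration only; the function name, parameters, and update rule are hypothetical and are not the rating policy or game-theoretic model from the paper.

# Illustrative sketch only (hypothetical, not the paper's mechanism): a requester's
# rating moves toward the observed task outcome, with a larger step when the hired
# contributor has little prior experience.
def update_requester_rating(rating, contributor_experience, task_outcome,
                            base_step=0.05, max_experience=10):
    """Return an updated requester rating in [0, 1].

    rating                 -- current requester rating in [0, 1]
    contributor_experience -- number of tasks the contributor has completed before
    task_outcome           -- 1.0 if the task was accepted and paid fairly, else 0.0
    """
    # Hypothetical bonus: hiring a less-experienced contributor earns a larger step.
    inexperience_bonus = 1.0 - min(contributor_experience, max_experience) / max_experience
    step = base_step * (1.0 + inexperience_bonus)
    # Move the rating toward the observed outcome, clamped to [0, 1].
    new_rating = rating + step * (task_outcome - rating)
    return max(0.0, min(1.0, new_rating))

if __name__ == "__main__":
    # A requester who fairly hires a newcomer gains more than one hiring a veteran.
    print(update_requester_rating(0.6, contributor_experience=1, task_outcome=1.0))   # ~0.638
    print(update_requester_rating(0.6, contributor_experience=10, task_outcome=1.0))  # 0.620

Under these assumed parameters, the same positive outcome raises the requester's rating more when the contributor is a newcomer, which is one simple way a platform could encode the incentive studied in the paper.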

Original language: English (US)
Title of host publication: CIKM 2019 - Proceedings of the 28th ACM International Conference on Information and Knowledge Management
Publisher: Association for Computing Machinery
Pages: 2003-2012
Number of pages: 10
ISBN (Electronic): 9781450369763
DOI: https://doi.org/10.1145/3357384.3357933
Publication status: Published - Nov 3, 2019
Event: 28th ACM International Conference on Information and Knowledge Management, CIKM 2019 - Beijing, China
Duration: Nov 3, 2019 - Nov 7, 2019

Publication series

Name: International Conference on Information and Knowledge Management, Proceedings

Conference

Conference: 28th ACM International Conference on Information and Knowledge Management, CIKM 2019
Country: China
City: Beijing
Period: 11/3/19 - 11/7/19


All Science Journal Classification (ASJC) codes

  • Business, Management and Accounting (all)
  • Decision Sciences (all)

Cite this

Qiu, C., Squicciarini, A., & Rajtmajer, S. (2019). Rating mechanisms for sustainability of crowdsourcing platforms. In CIKM 2019 - Proceedings of the 28th ACM International Conference on Information and Knowledge Management (pp. 2003-2012). (International Conference on Information and Knowledge Management, Proceedings). Association for Computing Machinery. https://doi.org/10.1145/3357384.3357933