Explanations as mechanisms for supporting algorithmic transparency

Emilee Rader, Kelley Cotter, Janghee Cho

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

97 Scopus citations

Abstract

Transparency can empower users to make informed choices about how they use an algorithmic decision-making system and judge its potential consequences. However, transparency is often conceptualized by the outcomes it is intended to bring about, not the specifics of mechanisms to achieve those outcomes. We conducted an online experiment focusing on how different ways of explaining Facebook's News Feed algorithm might affect participants' beliefs and judgments about the News Feed. We found that all explanations caused participants to become more aware of how the system works, and helped them to determine whether the system is biased and if they can control what they see. The explanations were less effective for helping participants evaluate the correctness of the system's output, and form opinions about how sensible and consistent its behavior is. We present implications for the design of transparency mechanisms in algorithmic decision-making systems based on these results.

Original language: English (US)
Title of host publication: CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems
Subtitle of host publication: Engage with CHI
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450356206, 9781450356213
DOIs
State: Published - Apr 20 2018
Event: 2018 CHI Conference on Human Factors in Computing Systems, CHI 2018 - Montreal, Canada
Duration: Apr 21 2018 - Apr 26 2018

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings
Volume: 2018-April

Other

Other: 2018 CHI Conference on Human Factors in Computing Systems, CHI 2018
Country/Territory: Canada
City: Montreal
Period: 4/21/18 - 4/26/18

All Science Journal Classification (ASJC) codes

  • Software
  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
