TY - GEN
T1 - Exploring the Effects of Interactive Dialogue in Improving User Control for Explainable Online Symptom Checkers
AU - Sun, Yuan
AU - Sundar, S. Shyam
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/4/27
Y1 - 2022/4/27
N2 - There has been a major push to improve the transparency of online symptom checkers (OSCs) by providing more explanations to users about their functioning and conclusions. However, not all users will want explanations about all aspects of these systems. A more user-centered approach is necessary for personalizing the user experience of explanations. With this in mind, we designed and tested an interactive dialogue interface that affords users control over receiving only those explanations that they would like to read. We conducted a user study (N = 152) with a text-based chatbot for assessing anxiety levels and presented explanations to participants in one of three forms: an interactive dialogue providing choice for viewing different components of the explanations, a static disclosure of all explanations, and a control condition with no explanations whatsoever. We found that participants varied in the kinds of information they wanted to learn. The interactive delivery of explanations led to higher levels of perceived transparency and affective trust in the system. Furthermore, both subjective and objective understanding of the mechanism used for assessing anxiety were higher among participants in the interactive dialogue condition. We discuss theoretical and practical implications of imbuing interactivity for enhancing the effectiveness of explainable systems.
AB - There has been a major push to improve the transparency of online symptom checkers (OSCs) by providing more explanations to users about their functioning and conclusions. However, not all users will want explanations about all aspects of these systems. A more user-centered approach is necessary for personalizing the user experience of explanations. With this in mind, we designed and tested an interactive dialogue interface that affords users control over receiving only those explanations that they would like to read. We conducted a user study (N = 152) with a text-based chatbot for assessing anxiety levels and presented explanations to participants in one of three forms: an interactive dialogue providing choice for viewing different components of the explanations, a static disclosure of all explanations, and a control condition with no explanations whatsoever. We found that participants varied in the kinds of information they wanted to learn. The interactive delivery of explanations led to higher levels of perceived transparency and affective trust in the system. Furthermore, both subjective and objective understanding of the mechanism used for assessing anxiety were higher among participants in the interactive dialogue condition. We discuss theoretical and practical implications of imbuing interactivity for enhancing the effectiveness of explainable systems.
UR - http://www.scopus.com/inward/record.url?scp=85129768964&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85129768964&partnerID=8YFLogxK
U2 - 10.1145/3491101.3519668
DO - 10.1145/3491101.3519668
M3 - Conference contribution
AN - SCOPUS:85129768964
T3 - Conference on Human Factors in Computing Systems - Proceedings
BT - CHI 2022 - Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems
PB - Association for Computing Machinery
T2 - 2022 CHI Conference on Human Factors in Computing Systems, CHI EA 2022
Y2 - 30 April 2022 through 5 May 2022
ER -