Should Machines Express Sympathy and Empathy? Experiments with a Health Advice Chatbot

Bingjie Liu, S. Shyam Sundar

Research output: Contribution to journal › Article

6 Citations (Scopus)

Abstract

When we ask a chatbot for advice about a personal problem, should it simply provide informational support and refrain from offering emotional support? Or, should it show sympathy and empathize with our situation? Although expression of caring and understanding is valued in supportive human communications, do we want the same from a chatbot, or do we simply reject it due to its artificiality and uncanniness? To answer this question, we conducted two experiments with a chatbot providing online medical information advice about a sensitive personal issue. In Study 1, participants (N = 158) simply read a dialogue between a chatbot and a human user. In Study 2, participants (N = 88) interacted with a real chatbot. We tested the effect of three types of empathic expression - sympathy, cognitive empathy, and affective empathy - on individuals' perceptions of the service and the chatbot. Data reveal that expression of sympathy and empathy is favored over unemotional provision of advice, in support of the Computers are Social Actors (CASA) paradigm. This is particularly true for users who are initially skeptical about machines possessing social cognitive capabilities. Theoretical, methodological, and practical implications are discussed.

Original language: English (US)
Pages (from-to): 625-636
Number of pages: 12
Journal: Cyberpsychology, Behavior, and Social Networking
Volume: 21
Issue number: 10
DOI: 10.1089/cyber.2018.0110
State: Published - Oct 1 2018

All Science Journal Classification (ASJC) codes

  • Social Psychology
  • Communication
  • Applied Psychology
  • Human-Computer Interaction
  • Computer Science Applications

Cite this

@article{272c75d302234827bed6babc04947352,
title = "Should Machines Express Sympathy and Empathy? Experiments with a Health Advice Chatbot",
abstract = "When we ask a chatbot for advice about a personal problem, should it simply provide informational support and refrain from offering emotional support? Or, should it show sympathy and empathize with our situation? Although expression of caring and understanding is valued in supportive human communications, do we want the same from a chatbot, or do we simply reject it due to its artificiality and uncanniness? To answer this question, we conducted two experiments with a chatbot providing online medical information advice about a sensitive personal issue. In Study 1, participants (N = 158) simply read a dialogue between a chatbot and a human user. In Study 2, participants (N = 88) interacted with a real chatbot. We tested the effect of three types of empathic expression - sympathy, cognitive empathy, and affective empathy - on individuals' perceptions of the service and the chatbot. Data reveal that expression of sympathy and empathy is favored over unemotional provision of advice, in support of the Computers are Social Actors (CASA) paradigm. This is particularly true for users who are initially skeptical about machines possessing social cognitive capabilities. Theoretical, methodological, and practical implications are discussed.",
author = "Bingjie Liu and Sundar, {S. Shyam}",
year = "2018",
month = "10",
day = "1",
doi = "10.1089/cyber.2018.0110",
language = "English (US)",
volume = "21",
pages = "625--636",
journal = "Cyberpsychology, Behavior, and Social Networking",
issn = "2152-2715",
publisher = "Mary Ann Liebert Inc.",
number = "10",
}

Should Machines Express Sympathy and Empathy? Experiments with a Health Advice Chatbot. / Liu, Bingjie; Sundar, S. Shyam.

In: Cyberpsychology, Behavior, and Social Networking, Vol. 21, No. 10, 01.10.2018, p. 625-636.


TY - JOUR

T1 - Should Machines Express Sympathy and Empathy? Experiments with a Health Advice Chatbot

AU - Liu, Bingjie

AU - Sundar, S. Shyam

PY - 2018/10/1

Y1 - 2018/10/1

N2 - When we ask a chatbot for advice about a personal problem, should it simply provide informational support and refrain from offering emotional support? Or, should it show sympathy and empathize with our situation? Although expression of caring and understanding is valued in supportive human communications, do we want the same from a chatbot, or do we simply reject it due to its artificiality and uncanniness? To answer this question, we conducted two experiments with a chatbot providing online medical information advice about a sensitive personal issue. In Study 1, participants (N = 158) simply read a dialogue between a chatbot and a human user. In Study 2, participants (N = 88) interacted with a real chatbot. We tested the effect of three types of empathic expression - sympathy, cognitive empathy, and affective empathy - on individuals' perceptions of the service and the chatbot. Data reveal that expression of sympathy and empathy is favored over unemotional provision of advice, in support of the Computers are Social Actors (CASA) paradigm. This is particularly true for users who are initially skeptical about machines possessing social cognitive capabilities. Theoretical, methodological, and practical implications are discussed.

AB - When we ask a chatbot for advice about a personal problem, should it simply provide informational support and refrain from offering emotional support? Or, should it show sympathy and empathize with our situation? Although expression of caring and understanding is valued in supportive human communications, do we want the same from a chatbot, or do we simply reject it due to its artificiality and uncanniness? To answer this question, we conducted two experiments with a chatbot providing online medical information advice about a sensitive personal issue. In Study 1, participants (N = 158) simply read a dialogue between a chatbot and a human user. In Study 2, participants (N = 88) interacted with a real chatbot. We tested the effect of three types of empathic expression - sympathy, cognitive empathy, and affective empathy - on individuals' perceptions of the service and the chatbot. Data reveal that expression of sympathy and empathy is favored over unemotional provision of advice, in support of the Computers are Social Actors (CASA) paradigm. This is particularly true for users who are initially skeptical about machines possessing social cognitive capabilities. Theoretical, methodological, and practical implications are discussed.

UR - http://www.scopus.com/inward/record.url?scp=85055080779&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85055080779&partnerID=8YFLogxK

U2 - 10.1089/cyber.2018.0110

DO - 10.1089/cyber.2018.0110

M3 - Article

C2 - 30334655

AN - SCOPUS:85055080779

VL - 21

SP - 625

EP - 636

JO - Cyberpsychology, Behavior, and Social Networking

JF - Cyberpsychology, Behavior, and Social Networking

SN - 2152-2715

IS - 10

ER -