Lies and deception

Robots that use falsehood as a social strategy

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Dishonesty is found throughout normal interpersonal interaction. A lie is a specific type of dishonesty. A commonly accepted definition of the term “lie” is a false statement made by an individual who knows that the statement is not true (Carson 2006). This chapter explores the computational and social-psychological underpinnings that enable a robot to utter lies, using a framework that we previously applied to non-verbal deception. We use the interdependence framework as the foundation for analyzing various types of lies because it provides conceptual tools for understanding the role of the situation and of the robot’s disposition in determining whether or not to lie. The results of two experiments in which a robot played a card game with a human support our hypotheses that 1) the interdependence framework can be applied to lying; 2) the application of this framework provides a basis for understanding the factors that shape someone’s decision to lie; and 3) an individual’s history influences their decision to lie. Our findings also demonstrate that stereotyped partner models can be used to bootstrap a robot’s evaluation of the costs and benefits of lying, as well as of the likelihood that an individual will challenge the truth of the robot’s statements. We conclude the chapter with suggestions for future work.

Original language: English (US)
Title of host publication: Robots that Talk and Listen
Subtitle of host publication: Technology and Social Impact
Publisher: Walter de Gruyter GmbH
Pages: 203-225
Number of pages: 23
ISBN (Electronic): 9781614514404
ISBN (Print): 9781614516033
DOI: 10.1515/9781614514404.173
State: Published - Jan 1 2015


All Science Journal Classification (ASJC) codes

  • Engineering (all)
  • Computer Science (all)
  • Arts and Humanities (all)
  • Social Sciences (all)

Cite this

Wagner, A. R. (2015). Lies and deception: Robots that use falsehood as a social strategy. In Robots that Talk and Listen: Technology and Social Impact (pp. 203-225). Walter de Gruyter GmbH. https://doi.org/10.1515/9781614514404.173