This article investigates the challenge of developing a robot capable of determining whether a social situation demands trust. Solving this challenge may allow a robot to react when a person over- or under-trusts the system. Prior work in this area has focused on understanding the factors that influence a person's trust in a robot (Hancock et al., 2011). In contrast, by using game-theoretic representations to frame the problem, we are able to develop a set of conditions for determining whether an interactive situation demands trust. In two separate experiments, human subjects were asked to evaluate either written narratives or mazes in terms of whether or not they require trust. The results indicate correlations of φ1 = +0.592 and φ2 = +0.406, respectively, between the subjects' evaluations and the conditions' predictions. This is a strong correlation for a study involving human subjects.