Performance rather than reputation affects humans’ trust towards an artificial agent

To succeed in teamwork with artificial agents, humans must calibrate their trust in agents based on information they receive about an agent before interaction (reputation information) as well as on experiences they have during interaction (agent performance). This study (N = 253) examined the influence of a virtual agent's reputation (high/low) and actual observed performance (high/low) on a human user's behavioral trust (delegation behavior) and self-reported trust (questionnaires) in a cooperative Tetris game. The main findings suggested that agent reputation influences self-reported trust prior to interaction; however, this effect was immediately overridden by the agent's performance during the interaction. The agent's performance during the interactive task influenced delegation behavior as well as self-reported trust measured post-interaction. The pre- to post-interaction change in self-reported trust was significantly larger when reputation and performance were incongruent. We concluded that reputation may have had a smaller than expected influence on behavior in the presence of a novel tool that afforded exploration. Our research contributes to understanding trust and delegation dynamics, which is crucial for the design and adequate use of artificial agent team partners in a world of digital transformation.

Bibliographic Details
Main Authors: Fritz Becker, Celine Ina Spannagl, Jürgen Buder, Markus Huff
Format: Article
Language: English
Published: Elsevier 2025-03-01
Series: Computers in Human Behavior: Artificial Humans
Online Access: http://www.sciencedirect.com/science/article/pii/S2949882125000064
author Fritz Becker
Celine Ina Spannagl
Jürgen Buder
Markus Huff
collection DOAJ
description To succeed in teamwork with artificial agents, humans must calibrate their trust in agents based on information they receive about an agent before interaction (reputation information) as well as on experiences they have during interaction (agent performance). This study (N = 253) examined the influence of a virtual agent's reputation (high/low) and actual observed performance (high/low) on a human user's behavioral trust (delegation behavior) and self-reported trust (questionnaires) in a cooperative Tetris game. The main findings suggested that agent reputation influences self-reported trust prior to interaction; however, this effect was immediately overridden by the agent's performance during the interaction. The agent's performance during the interactive task influenced delegation behavior as well as self-reported trust measured post-interaction. The pre- to post-interaction change in self-reported trust was significantly larger when reputation and performance were incongruent. We concluded that reputation may have had a smaller than expected influence on behavior in the presence of a novel tool that afforded exploration. Our research contributes to understanding trust and delegation dynamics, which is crucial for the design and adequate use of artificial agent team partners in a world of digital transformation.
format Article
id doaj-art-2817ea0e63e746d0aac735e17d997b13
institution Kabale University
issn 2949-8821
language English
publishDate 2025-03-01
publisher Elsevier
record_format Article
series Computers in Human Behavior: Artificial Humans
affiliation Fritz Becker (corresponding author): Leibniz-Institut für Wissensmedien, Tübingen, Germany
Celine Ina Spannagl: Department of Psychology, Universität Tübingen, Germany
Jürgen Buder: Leibniz-Institut für Wissensmedien, Tübingen, Germany
Markus Huff: Leibniz-Institut für Wissensmedien, Tübingen, Germany; Department of Psychology, Universität Tübingen, Germany
container Volume 3 (2025), Article 100122
title Performance rather than reputation affects humans’ trust towards an artificial agent
url http://www.sciencedirect.com/science/article/pii/S2949882125000064
work_keys_str_mv AT fritzbecker performanceratherthanreputationaffectshumanstrusttowardsanartificialagent
AT celineinaspannagl performanceratherthanreputationaffectshumanstrusttowardsanartificialagent
AT jurgenbuder performanceratherthanreputationaffectshumanstrusttowardsanartificialagent
AT markushuff performanceratherthanreputationaffectshumanstrusttowardsanartificialagent