Self-assessment in machines boosts human trust
| Main Authors: | , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2025-05-01 |
| Series: | Frontiers in Robotics and AI |
| Subjects: | |
| Online Access: | https://www.frontiersin.org/articles/10.3389/frobt.2025.1557075/full |
| Summary: | Low trust in autonomous systems remains a significant barrier to adoption and performance. To increase trust in these systems effectively, machines must take actions that calibrate human trust based on a real-time, accurate assessment of both their own capability and the human's trust. Existing efforts demonstrate the value of trust calibration in improving team performance but overlook the importance of machine self-assessment capabilities in the trust calibration process. In our work, we develop a closed-loop trust calibration system for a human-machine image-classification task and demonstrate roughly a 40% improvement in human trust and a 5% improvement in team performance with trained machine self-assessment compared to the baseline, despite identical machine performance in both conditions. Our trust calibration system applies to any semi-autonomous application requiring human-machine collaboration. |
| ISSN: | 2296-9144 |