Socially excluded employees prefer algorithmic evaluation to human assessment: The moderating role of an interdependent culture

Bibliographic Details
Main Authors: Yoko Sugitani, Taku Togawa, Kosuke Motoki
Format: Article
Language: English
Published: Elsevier 2025-05-01
Series: Computers in Human Behavior: Artificial Humans
Online Access: http://www.sciencedirect.com/science/article/pii/S2949882125000362
Description
Summary: Organizations have embraced artificial intelligence (AI) technology for personnel assessments such as document screening, interviews, and evaluations. However, some studies have reported employees' aversive reactions to AI-based assessment, while others have shown their appreciation for AI. This study focused on the effect of workplace social context, specifically social exclusion, on employees' attitudes toward AI-based personnel assessment. Drawing on cognitive dissonance theory, we hypothesized that socially excluded employees perceive human evaluation as unfair, leading them to believe that AI-based assessments are fairer and, in turn, to hold a favorable attitude toward AI evaluation. Across three experiments in which workplace social relationships (social exclusion vs. inclusion) were manipulated, we demonstrated that socially excluded employees showed a more positive attitude toward algorithmic assessment than those who were socially included. Further, this effect was mediated by the perceived fairness of AI assessment and was more evident in an interdependent (but not independent) self-construal culture. These findings offer novel insights into psychological research on computer use in professional practices.
ISSN:2949-8821