Visual Saliency Design for AR-HUD Navigation in Extreme Weather: Reducing Inattentional Blindness

Background: Current research on AR-HUD visual icons focuses primarily on interface, color, and motion-trajectory design. However, when displayed against complex environments, AR-HUD graphics can induce inattentional blindness, particularly during extreme weather, when its occurrence rate increases significantly. Methods: This study used a two-factor within-subjects design to examine how dynamic saliency and contour saliency in AR-HUD navigation graphics affect inattentional blindness and user experience. Eye-tracking metrics (total fixation time and first fixation time), driver reaction time, and user-experience scores were analyzed. Results: Objective measures revealed significant interaction effects between the two saliency types, whereas subjective user-experience ratings showed no significant differences. Both contour and dynamic saliency effectively reduced inattentional blindness and improved information-seeking efficiency. Conclusions: The combination of dynamic and contour saliency produced the lowest inattentional-blindness rates, the fastest reaction times, and the highest user satisfaction. We recommend 1) adding distinct contours to icons to prevent them from visually fusing with the environment, and 2) temporarily enlarging icons at critical moments to enhance attentional capture. These findings provide actionable design principles for improving AR-HUD safety in adverse weather.
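As an illustrative aside (not taken from the article): the two-factor within-subjects design described above is the kind of design typically analyzed with a repeated-measures ANOVA that tests the contour x dynamic interaction. The sketch below shows such an analysis in Python with statsmodels on simulated placeholder data; the variable names (subject, contour, dynamic, reaction_time), the sample size of 20, and the built-in effect sizes are assumptions for illustration only, not the authors' data or code.

# Minimal sketch of a 2x2 within-subjects (repeated-measures) ANOVA on
# driver reaction time, mirroring the design named in the abstract.
# All data below are simulated placeholders, not study results.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
subjects, levels = range(1, 21), ["off", "on"]

rows = []
for s in subjects:
    for contour in levels:
        for dynamic in levels:
            # Hypothetical effect structure: each saliency cue speeds up
            # responses, and combining both cues helps most (interaction).
            rt = 1.60
            rt -= 0.15 * (contour == "on") + 0.20 * (dynamic == "on")
            rt -= 0.10 * (contour == "on" and dynamic == "on")
            rows.append({"subject": s, "contour": contour,
                         "dynamic": dynamic,
                         "reaction_time": rt + rng.normal(0, 0.08)})

df = pd.DataFrame(rows)

# Repeated-measures ANOVA: main effects of each saliency type plus the
# contour x dynamic interaction on reaction time.
res = AnovaRM(df, depvar="reaction_time", subject="subject",
              within=["contour", "dynamic"]).fit()
print(res.anova_table)

Running this prints F statistics for the two main effects and their interaction; a significant interaction term is the statistical counterpart of the "significant interaction effects between saliency types" reported in the Results.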

Bibliographic Details
Main Authors: Qi Zhu, Jiale Li, Yixiang Liu
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects: AR-HUD; inattentional blindness; extreme weather; eye-tracking; driver performance; visual saliency
Online Access: https://ieeexplore.ieee.org/document/11079581/
Collection: DOAJ
Record ID: doaj-art-e1cf2569aafa42e9bcea284ec5eb530e
Institution: Kabale University
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3588576
Published in: IEEE Access, vol. 13, pp. 137613-137622, 2025
ORCID: https://orcid.org/0009-0006-4091-1636 (Yixiang Liu)
Affiliations: College of Design, Hanyang University, Ansan, Gyeonggi-do, South Korea; Department of Media and Communication Studies, Faculty of Arts and Social Sciences, University of Malaya, Kuala Lumpur, Malaysia; College of Arts and Media, Qingdao Binhai University, Qingdao, China