Clinical validation of explainable AI for fetal growth scans through multi-level, cross-institutional prospective end-user evaluation
Abstract: We aimed to develop and evaluate Explainable Artificial Intelligence (XAI) for fetal ultrasound that provides actionable concepts as feedback to end-users, through a prospective, cross-center, multi-level approach. We developed, implemented, and tested a deep-learning model for fetal growth scans usin...
Main Authors: Zahra Bashir, Manxi Lin, Aasa Feragen, Kamil Mikolaj, Caroline Taksøe-Vester, Anders Nymark Christensen, Morten B. S. Svendsen, Mette Hvilshøj Fabricius, Lisbeth Andreasen, Mads Nielsen, Martin Grønnebæk Tolsgaard
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-86536-4
Similar Items
- Integrating Explainable Artificial Intelligence in Extended Reality Environments: A Systematic Survey, by Clara Maathuis, et al. (2025-01-01)
- Chinese Chat Room: AI Hallucinations, Epistemology and Cognition, by Šekrst Kristina (2024-12-01)
- Explainable AI chatbots towards XAI ChatGPT: A review, by Attila Kovari (2025-01-01)
- AI anxiety: Explication and exploration of effect on state anxiety when interacting with AI doctors, by Hyun Yang, et al. (2025-03-01)
- On explaining recommendations with Large Language Models: a review, by Alan Said (2025-01-01)