Enhancing Pneumonia Diagnosis Through AI Interpretability: Comparative Analysis of Pixel-Level Interpretability and Grad-CAM on X-ray Imaging With VGG19

Pneumonia is a leading cause of morbidity and mortality worldwide, necessitating timely and precise diagnosis for effective treatment. Chest X-rays are the primary diagnostic tool, but their interpretation demands substantial expertise. Recent advancements in AI have shown promise in enhancing pneumonia detection from X-ray images, yet the opacity of deep learning models raises concerns about their clinical adoption. Interpretability in AI models is vital for fostering trust among healthcare professionals by providing transparency in decision-making processes. This study conducts a comparative analysis of two interpretability methods, Pixel Level Interpretability (PLI) and Gradient-weighted Class Activation Mapping (Grad-CAM), in the context of pneumonia classification using VGG19 on X-ray datasets. The research includes an experiment involving three distinct X-ray datasets. VGG19 is applied to classify a query image, and both PLI and Grad-CAM are used to interpret the classification decisions. The study evaluates these interpretability methods across multiple dimensions: computational efficiency, diagnostic performance, explanation continuity, calibration accuracy, robustness to training parameters, and feedback from medical experts. Our findings aim to determine which interpretability technique offers a more clinically meaningful explanation, balancing computational feasibility and diagnostic reliability. This study contributes to the development of explainable AI in healthcare, supporting the integration of trustworthy AI systems in clinical environments for enhanced pneumonia diagnosis.

Bibliographic Details
Main Authors: Mohammad Ennab, Hamid Mcheick
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Open Journal of the Computer Society
Subjects:
Online Access: https://ieeexplore.ieee.org/document/11049939/
_version_ 1849421278468898816
author Mohammad Ennab
Hamid Mcheick
author_facet Mohammad Ennab
Hamid Mcheick
author_sort Mohammad Ennab
collection DOAJ
description Pneumonia is a leading cause of morbidity and mortality worldwide, necessitating timely and precise diagnosis for effective treatment. Chest X-rays are the primary diagnostic tool, but their interpretation demands substantial expertise. Recent advancements in AI have shown promise in enhancing pneumonia detection from X-ray images, yet the opacity of deep learning models raises concerns about their clinical adoption. Interpretability in AI models is vital for fostering trust among healthcare professionals by providing transparency in decision-making processes. This study conducts a comparative analysis of two interpretability methods, Pixel Level Interpretability (PLI) and Gradient-weighted Class Activation Mapping (Grad-CAM), in the context of pneumonia classification using VGG19 on X-ray datasets. The research includes an experiment involving three distinct X-ray datasets. VGG19 is applied to classify a query image, and both PLI and Grad-CAM are used to interpret the classification decisions. The study evaluates these interpretability methods across multiple dimensions: computational efficiency, diagnostic performance, explanation continuity, calibration accuracy, robustness to training parameters, and feedback from medical experts. Our findings aim to determine which interpretability technique offers a more clinically meaningful explanation, balancing computational feasibility and diagnostic reliability. This study contributes to the development of explainable AI in healthcare, supporting the integration of trustworthy AI systems in clinical environments for enhanced pneumonia diagnosis.
format Article
id doaj-art-07a00a3132b047cfba5ffaa722fd4086
institution Kabale University
issn 2644-1268
language English
publishDate 2025-01-01
publisher IEEE
record_format Article
series IEEE Open Journal of the Computer Society
spelling doaj-art-07a00a3132b047cfba5ffaa722fd4086 2025-08-20T03:31:30Z eng IEEE IEEE Open Journal of the Computer Society 2644-1268 2025-01-01 6 1155-1165 10.1109/OJCS.2025.3582726 11049939 Enhancing Pneumonia Diagnosis Through AI Interpretability: Comparative Analysis of Pixel-Level Interpretability and Grad-CAM on X-ray Imaging With VGG19 Mohammad Ennab https://orcid.org/0000-0001-7117-6271 Hamid Mcheick Department of Computer Sciences and Mathematics, University of Québec at Chicoutimi, 555 Bd de University, Chicoutimi, QC, Canada; Department of Computer Sciences and Mathematics, University of Québec at Chicoutimi, 555 Bd de University, Chicoutimi, QC, Canada. Pneumonia is a leading cause of morbidity and mortality worldwide, necessitating timely and precise diagnosis for effective treatment. Chest X-rays are the primary diagnostic tool, but their interpretation demands substantial expertise. Recent advancements in AI have shown promise in enhancing pneumonia detection from X-ray images, yet the opacity of deep learning models raises concerns about their clinical adoption. Interpretability in AI models is vital for fostering trust among healthcare professionals by providing transparency in decision-making processes. This study conducts a comparative analysis of two interpretability methods, Pixel Level Interpretability (PLI) and Gradient-weighted Class Activation Mapping (Grad-CAM), in the context of pneumonia classification using VGG19 on X-ray datasets. The research includes an experiment involving three distinct X-ray datasets. VGG19 is applied to classify a query image, and both PLI and Grad-CAM are used to interpret the classification decisions. The study evaluates these interpretability methods across multiple dimensions: computational efficiency, diagnostic performance, explanation continuity, calibration accuracy, robustness to training parameters, and feedback from medical experts. Our findings aim to determine which interpretability technique offers a more clinically meaningful explanation, balancing computational feasibility and diagnostic reliability. This study contributes to the development of explainable AI in healthcare, supporting the integration of trustworthy AI systems in clinical environments for enhanced pneumonia diagnosis. https://ieeexplore.ieee.org/document/11049939/ AI interpretability; deep learning; grad-CAM; machine learning; medical imaging; PLI
spellingShingle Mohammad Ennab
Hamid Mcheick
Enhancing Pneumonia Diagnosis Through AI Interpretability: Comparative Analysis of Pixel-Level Interpretability and Grad-CAM on X-ray Imaging With VGG19
IEEE Open Journal of the Computer Society
AI interpretability
deep learning
grad-CAM
machine learning
medical imaging
PLI
title Enhancing Pneumonia Diagnosis Through AI Interpretability: Comparative Analysis of Pixel-Level Interpretability and Grad-CAM on X-ray Imaging With VGG19
title_full Enhancing Pneumonia Diagnosis Through AI Interpretability: Comparative Analysis of Pixel-Level Interpretability and Grad-CAM on X-ray Imaging With VGG19
title_fullStr Enhancing Pneumonia Diagnosis Through AI Interpretability: Comparative Analysis of Pixel-Level Interpretability and Grad-CAM on X-ray Imaging With VGG19
title_full_unstemmed Enhancing Pneumonia Diagnosis Through AI Interpretability: Comparative Analysis of Pixel-Level Interpretability and Grad-CAM on X-ray Imaging With VGG19
title_short Enhancing Pneumonia Diagnosis Through AI Interpretability: Comparative Analysis of Pixel-Level Interpretability and Grad-CAM on X-ray Imaging With VGG19
title_sort enhancing pneumonia diagnosis through ai interpretability comparative analysis of pixel level interpretability and grad cam on x ray imaging with vgg19
topic AI interpretability
deep learning
grad-CAM
machine learning
medical imaging
PLI
url https://ieeexplore.ieee.org/document/11049939/
work_keys_str_mv AT mohammadennab enhancingpneumoniadiagnosisthroughaiinterpretabilitycomparativeanalysisofpixellevelinterpretabilityandgradcamonxrayimagingwithvgg19
AT hamidmcheick enhancingpneumoniadiagnosisthroughaiinterpretabilitycomparativeanalysisofpixellevelinterpretabilityandgradcamonxrayimagingwithvgg19