High-resolution fundus images for ophthalmomics and early cardiovascular disease prediction

Bibliographic Details
Main Authors: Na Guo, Wanjin Fu, Heng Li, Haoyun Zhang, Tiantian Li, Wei Zhang, Xing Zhong, Tianrong Pan, Fuchun Sun, Ajuan Gong
Format: Article
Language: English
Published: Nature Portfolio 2025-04-01
Series: Scientific Data
Online Access:https://doi.org/10.1038/s41597-025-04930-z
Description
Summary: Cardiovascular diseases (CVDs) remain the foremost cause of mortality globally, emphasizing the imperative for early detection to improve patient outcomes and mitigate healthcare burdens. Carotid intima-media thickness (CIMT) serves as a well-established predictive marker for atherosclerosis and cardiovascular risk assessment. Fundus imaging offers a non-invasive modality to investigate microvascular pathology and systemic vascular health. However, the paucity of high-quality, publicly available datasets linking fundus images with CIMT measurements has hindered the progression of AI-driven predictive models for CVDs. Addressing this gap, we introduce the China-Fundus-CIMT dataset, comprising bilateral high-resolution fundus images, CIMT measurements, and clinical data—including age and gender—from 2,903 patients. Our experiments with multimodal models reveal that integrating clinical information substantially enhances predictive performance, yielding AUC-ROC increases of 3.22% and 7.83% on the validation and test sets, respectively, compared to unimodal models. This dataset constitutes a vital resource for developing and validating AI-based early screening models for CVDs using fundus images and is now accessible to the research community.
ISSN: 2052-4463
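
Note: The abstract describes fusing fundus images with clinical variables (age, gender) for CIMT-based risk prediction, but this record does not specify the model architecture. The sketch below is a minimal, hypothetical illustration of that late-fusion idea in PyTorch; the class name FundusCIMTModel, the ResNet-18 backbone, the layer sizes, and the placeholder tensors are all assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18


class FundusCIMTModel(nn.Module):
    """Toy multimodal model: fundus image encoder fused with clinical features."""

    def __init__(self, num_clinical_features: int = 2):
        super().__init__()
        # Image branch: ResNet-18 backbone with its classifier head removed,
        # so it outputs a 512-dimensional embedding per image.
        backbone = resnet18(weights=None)
        backbone.fc = nn.Identity()
        self.image_encoder = backbone
        # Clinical branch: small MLP over the clinical features (e.g. age, gender).
        self.clinical_encoder = nn.Sequential(
            nn.Linear(num_clinical_features, 32),
            nn.ReLU(),
        )
        # Fusion head: late fusion by concatenation, single risk logit.
        self.head = nn.Linear(512 + 32, 1)

    def forward(self, image: torch.Tensor, clinical: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_encoder(image)          # (B, 512)
        clin_feat = self.clinical_encoder(clinical)   # (B, 32)
        return self.head(torch.cat([img_feat, clin_feat], dim=1))  # (B, 1) logit


# Usage with placeholder data (the dataset provides bilateral images; a single
# image per sample keeps this sketch minimal):
model = FundusCIMTModel()
images = torch.randn(4, 3, 224, 224)   # placeholder fundus image batch
clinical = torch.randn(4, 2)           # placeholder [age, gender] features
logits = model(images, clinical)
print(logits.shape)                    # torch.Size([4, 1])
```

Dropping the clinical branch and feeding only `img_feat` to the head yields the unimodal baseline; comparing AUC-ROC between the two variants mirrors the multimodal-vs-unimodal comparison reported in the abstract.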