Multimodal multi-instance evidence fusion neural networks for cancer survival prediction
Abstract Accurate cancer survival prediction plays a crucial role in assisting clinicians in formulating treatment plans. Multimodal data, such as histopathological images, genomic data, and clinical information, provide complementary and comprehensive information, significantly enhancing the accuracy of this task.
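The subjective-logic step described in the abstract turns per-class evidence into belief masses plus an explicit uncertainty mass. The following is a minimal sketch of that standard conversion (the function name and the example evidence values are illustrative, not taken from the paper):

```python
import numpy as np

def subjective_opinion(evidence):
    """Convert non-negative per-class evidence e_k into a subjective-logic opinion.

    Belief masses b_k = e_k / S and uncertainty u = K / S,
    where S = K + sum(e_k) is the Dirichlet strength for K classes.
    Zero evidence everywhere yields maximal uncertainty u = 1.
    """
    evidence = np.asarray(evidence, dtype=float)
    K = evidence.size
    S = K + evidence.sum()
    belief = evidence / S
    uncertainty = K / S
    return belief, uncertainty

# Strong evidence for the first class gives low uncertainty;
# b.sum() + u equals 1 by construction.
b, u = subjective_opinion([18.0, 1.0, 1.0])
```

In this formulation a modality that produces little evidence is automatically down-weighted through its large uncertainty mass, which is what allows the fused prediction to be "trusted" in the sense the abstract describes.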
| Main Authors: | Hui Luo, Jiashuang Huang, Hengrong Ju, Tianyi Zhou, Weiping Ding |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-03-01 |
| Series: | Scientific Reports |
| Subjects: | Survival prediction, Multimodal fusion, Vision transformer, Dempster–Shafer evidence theory |
| Online Access: | https://doi.org/10.1038/s41598-025-93770-3 |
| _version_ | 1850208140584288256 |
|---|---|
| author | Hui Luo; Jiashuang Huang; Hengrong Ju; Tianyi Zhou; Weiping Ding |
| author_facet | Hui Luo; Jiashuang Huang; Hengrong Ju; Tianyi Zhou; Weiping Ding |
| author_sort | Hui Luo |
| collection | DOAJ |
| description | Abstract Accurate cancer survival prediction plays a crucial role in assisting clinicians in formulating treatment plans. Multimodal data, such as histopathological images, genomic data, and clinical information, provide complementary and comprehensive information, significantly enhancing the accuracy of this task. However, existing methods, despite achieving some promising results, still exhibit two significant limitations: they fail to effectively utilize global context and overlook the uncertainty of different modalities, which may lead to unreliable predictions. In this study, we propose a multimodal multi-instance evidence fusion neural network for cancer survival prediction, called M2EF-NNs. Specifically, to better capture global information from images, we employ a pre-trained vision transformer model to extract patch feature embeddings from histopathological images. Additionally, we are the first to apply the Dempster–Shafer evidence theory to the cancer survival prediction task and introduce subjective logic to estimate the uncertainty of different modalities. We then dynamically adjust the weights of the class probability distribution after multimodal fusion based on the estimated evidence from the fused multimodal data to achieve trusted survival prediction. Finally, the experimental results on three cancer datasets demonstrate that our method significantly improves cancer survival prediction regarding overall C-index and AUC, thereby validating the model’s reliability. |
| format | Article |
| id | doaj-art-bcf5fc4ecaaa4b948852b2d07ee8dcae |
| institution | OA Journals |
| issn | 2045-2322 |
| language | English |
| publishDate | 2025-03-01 |
| publisher | Nature Portfolio |
| record_format | Article |
| series | Scientific Reports |
| spelling | doaj-art-bcf5fc4ecaaa4b948852b2d07ee8dcae; indexed 2025-08-20T02:10:17Z; English; Nature Portfolio; Scientific Reports; ISSN 2045-2322; published 2025-03-01; vol. 15, no. 1, pp. 1–15; doi:10.1038/s41598-025-93770-3; "Multimodal multi-instance evidence fusion neural networks for cancer survival prediction"; Hui Luo (Faculty of Data Science, City University of Macau); Jiashuang Huang (School of Artificial Intelligence and Computer Science, Nantong University); Hengrong Ju (School of Artificial Intelligence and Computer Science, Nantong University); Tianyi Zhou (School of Artificial Intelligence and Computer Science, Nantong University); Weiping Ding (Faculty of Data Science, City University of Macau); abstract as above; https://doi.org/10.1038/s41598-025-93770-3; Subjects: Survival prediction; Multimodal fusion; Vision transformer; Dempster–Shafer evidence theory |
| spellingShingle | Hui Luo; Jiashuang Huang; Hengrong Ju; Tianyi Zhou; Weiping Ding; Multimodal multi-instance evidence fusion neural networks for cancer survival prediction; Scientific Reports; Survival prediction; Multimodal fusion; Vision transformer; Dempster–Shafer evidence theory |
| title | Multimodal multi-instance evidence fusion neural networks for cancer survival prediction |
| title_full | Multimodal multi-instance evidence fusion neural networks for cancer survival prediction |
| title_fullStr | Multimodal multi-instance evidence fusion neural networks for cancer survival prediction |
| title_full_unstemmed | Multimodal multi-instance evidence fusion neural networks for cancer survival prediction |
| title_short | Multimodal multi-instance evidence fusion neural networks for cancer survival prediction |
| title_sort | multimodal multi instance evidence fusion neural networks for cancer survival prediction |
| topic | Survival prediction; Multimodal fusion; Vision transformer; Dempster–Shafer evidence theory |
| url | https://doi.org/10.1038/s41598-025-93770-3 |
| work_keys_str_mv | AT huiluo multimodalmultiinstanceevidencefusionneuralnetworksforcancersurvivalprediction AT jiashuanghuang multimodalmultiinstanceevidencefusionneuralnetworksforcancersurvivalprediction AT hengrongju multimodalmultiinstanceevidencefusionneuralnetworksforcancersurvivalprediction AT tianyizhou multimodalmultiinstanceevidencefusionneuralnetworksforcancersurvivalprediction AT weipingding multimodalmultiinstanceevidencefusionneuralnetworksforcancersurvivalprediction |
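The multimodal fusion the record's abstract refers to combines per-modality subjective-logic opinions (belief masses `b` plus an uncertainty mass `u`). A common way to do this is the reduced Dempster combination rule used in evidential multi-view classification; the sketch below assumes that rule and is not necessarily the exact formulation in the paper:

```python
import numpy as np

def combine_opinions(b1, u1, b2, u2):
    """Fuse two subjective-logic opinions with a reduced Dempster rule.

    C is the conflict: belief mass the two modalities assign to
    different classes. The fused uncertainty u1*u2/(1-C) shrinks when
    both modalities are confident, so agreement yields a more certain
    fused prediction, while a highly uncertain modality barely moves it.
    """
    b1, b2 = np.asarray(b1, dtype=float), np.asarray(b2, dtype=float)
    # conflict between the two belief assignments
    C = b1.sum() * b2.sum() - (b1 * b2).sum()
    scale = 1.0 / (1.0 - C)
    b = scale * (b1 * b2 + b1 * u2 + b2 * u1)
    u = scale * (u1 * u2)
    return b, u

# Two agreeing, moderately confident modalities (illustrative numbers):
# the fused uncertainty drops below either input's uncertainty.
b, u = combine_opinions([0.6, 0.2], 0.2, [0.6, 0.2], 0.2)
```

One can verify algebraically that `b.sum() + u == 1` is preserved by this rule, so the fused opinion is again a valid subjective-logic opinion and can be converted back to a class probability distribution for survival prediction.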