Compensating CBCT Motion Artifacts with Any 2D Generative Model
This paper presents a novel approach to mitigate motion artifacts in industrial Cone-Beam Computed Tomography (CBCT) caused by detector or X-ray source jitter due to mechanical vibration. Leveraging two-dimensional (2D) generative models while ensuring consistency between adjacent CBCT slices, our method addresses the limitations of traditional deep learning approaches that process each 2D slice of a three-dimensional (3D) volume independently.
Saved in:
Main Authors: | Yipeng Sun, Linda-Sophie Schneider, Mingxuan Gu, Siyuan Mei, Siming Bayer, Andreas K. Maier |
---|---|
Format: | Article |
Language: | deu |
Published: | NDT.net, 2025-02-01 |
Series: | e-Journal of Nondestructive Testing |
Online Access: | https://www.ndt.net/search/docs.php3?id=30726 |
_version_ | 1832086570829611008 |
---|---|
author | Yipeng Sun, Linda-Sophie Schneider, Mingxuan Gu, Siyuan Mei, Siming Bayer, Andreas K. Maier |
author_facet | Yipeng Sun, Linda-Sophie Schneider, Mingxuan Gu, Siyuan Mei, Siming Bayer, Andreas K. Maier |
author_sort | Yipeng Sun |
collection | DOAJ |
description |
This paper presents a novel approach to mitigate motion artifacts in industrial Cone-Beam Computed Tomography (CBCT) caused by detector or X-ray source jitter due to mechanical vibration. Leveraging two-dimensional (2D) generative models while ensuring consistency between adjacent CBCT slices, our method addresses the limitations of traditional deep learning approaches that process each 2D slice of a three-dimensional (3D) volume independently. While traditional deep learning approaches may adequately handle artifacts in the axial view, they often struggle with consistency problems in sagittal and coronal views, resulting in insufficient 3D coherence across the entire volume. Our approach integrates a 2D generative model for artifact reduction with a structure-aware regularization strategy, specifically employing a gradient-based smoothness constraint along the through-plane direction to maintain smooth transitions between slices and preserve the overall 3D structure. This model-independent framework accommodates various 2D architectures while effectively addressing the limitations of treating 3D volumes as independent 2D slices. Experimental results on simulated CBCT datasets demonstrate significant improvements in image quality metrics and artifact reduction, offering a promising solution for enhancing 3D CT reconstruction in industrial applications.
|
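The structure-aware regularizer described in the abstract penalizes intensity gradients along the through-plane direction so that adjacent slices transition smoothly. A minimal sketch of such a constraint is below; the function name and the squared finite-difference form are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def through_plane_smoothness(volume):
    """Gradient-based smoothness penalty along the slice (through-plane) axis.

    volume: 3D array shaped (slices, height, width). Returns the mean
    squared finite difference between adjacent slices; a term like this
    can be added to a 2D model's loss to discourage slice-to-slice
    inconsistency in the restored volume.
    """
    diff = np.diff(volume, axis=0)  # differences between adjacent slices
    return float(np.mean(diff ** 2))

# Toy usage: a constant volume incurs no penalty, while slice-wise
# jitter (mimicking inconsistent per-slice restorations) raises it.
smooth = np.ones((4, 8, 8))
jitter = smooth + np.random.default_rng(0).normal(0.0, 0.1, smooth.shape)
assert through_plane_smoothness(smooth) == 0.0
assert through_plane_smoothness(jitter) > 0.0
```

Because the penalty only couples neighboring slices, it can be attached to any 2D artifact-reduction architecture, which matches the model-independent framing of the paper.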
format | Article |
id | doaj-art-5039669bf1ac4cc3a8f3c81c5fe5b9f6 |
institution | Kabale University |
issn | 1435-4934 |
language | deu |
publishDate | 2025-02-01 |
publisher | NDT.net |
record_format | Article |
series | e-Journal of Nondestructive Testing |
spelling | doaj-art-5039669bf1ac4cc3a8f3c81c5fe5b9f6 2025-02-06T10:48:19Z deu NDT.net e-Journal of Nondestructive Testing 1435-4934 2025-02-01 vol. 30 no. 2 10.58286/30726 Compensating CBCT Motion Artifacts with Any 2D Generative Model Yipeng Sun, Linda-Sophie Schneider, Mingxuan Gu, Siyuan Mei, Siming Bayer, Andreas K. Maier https://www.ndt.net/search/docs.php3?id=30726 |
spellingShingle | Yipeng Sun, Linda-Sophie Schneider, Mingxuan Gu, Siyuan Mei, Siming Bayer, Andreas K. Maier Compensating CBCT Motion Artifacts with Any 2D Generative Model e-Journal of Nondestructive Testing |
title | Compensating CBCT Motion Artifacts with Any 2D Generative Model |
title_full | Compensating CBCT Motion Artifacts with Any 2D Generative Model |
title_fullStr | Compensating CBCT Motion Artifacts with Any 2D Generative Model |
title_full_unstemmed | Compensating CBCT Motion Artifacts with Any 2D Generative Model |
title_short | Compensating CBCT Motion Artifacts with Any 2D Generative Model |
title_sort | compensating cbct motion artifacts with any 2d generative model |
url | https://www.ndt.net/search/docs.php3?id=30726 |
work_keys_str_mv | AT yipengsun compensatingcbctmotionartifactswithany2dgenerativemodel AT lindasophieschneider compensatingcbctmotionartifactswithany2dgenerativemodel AT mingxuangu compensatingcbctmotionartifactswithany2dgenerativemodel AT siyuanmei compensatingcbctmotionartifactswithany2dgenerativemodel AT simingbayer compensatingcbctmotionartifactswithany2dgenerativemodel AT andreaskmaier compensatingcbctmotionartifactswithany2dgenerativemodel |