Differentiable Few-view CT-Reconstruction for Arbitrary CT-Trajectories including Prior Knowledge

Bibliographic Details
Main Authors: Linda-Sophie Schneider, Adrian Waldyra, Yipeng Sun, Andreas K. Maier
Format: Article
Language: German
Published: NDT.net 2025-02-01
Series: e-Journal of Nondestructive Testing
Online Access: https://www.ndt.net/search/docs.php3?id=30724
Description
Summary: Computed tomography (CT) is widely used in non-destructive testing (NDT), but the increasing flexibility of robot-based CT systems often results in sparser and more unevenly distributed projection data. This sparsity introduces significant challenges in reconstructing high-quality images. This paper presents a novel two-step pipeline for few-view CT reconstruction that combines discrete prior generation with differentiable optimization. First, the Discrete Algebraic Reconstruction Technique generates a binary volume that provides robust prior information about the object’s structure. This prior is then integrated into a fully differentiable reconstruction framework through two distinct strategies: gradient update cropping, which focuses optimization on regions identified by the prior, and prior-informed initialization, which uses the binary volume to create an informed starting point. Together, these approaches guide the iterative refinement of the reconstruction using known operator learning. Experiments on real-world datasets demonstrate the efficacy of the approach. Compared to conventional methods, the proposed framework achieves significant improvements in reconstruction quality. The results highlight the method’s ability to leverage sparse projection data, providing high-quality reconstructions even in challenging industrial scenarios.
ISSN: 1435-4934
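
The following is a minimal PyTorch sketch, not the authors' implementation, of the two prior-integration strategies named in the summary: prior-informed initialization and gradient update cropping. The forward projector A, the few-view measurements y, and the binary DART prior used here are stand-ins (a random matrix and a thresholded phantom); in the described pipeline a differentiable CT projector for the arbitrary trajectory, embedded in a known-operator learning network, would take their place.

    import torch

    # Stand-ins for illustration only: a random matrix replaces the differentiable
    # CT forward projector, and a thresholded phantom replaces the DART binary prior.
    n_voxels, n_rays = 256, 64
    A = torch.randn(n_rays, n_voxels)          # placeholder few-view forward projector
    x_true = torch.rand(n_voxels)              # placeholder object
    y = A @ x_true                             # simulated sparse projection data
    prior = (x_true > 0.5).float()             # placeholder binary DART volume

    # Strategy 2: prior-informed initialization -- start from the binary prior
    # scaled to an assumed attenuation value instead of starting from zeros.
    x = (0.8 * prior).clone().requires_grad_(True)

    optimizer = torch.optim.Adam([x], lr=1e-2)
    for _ in range(500):
        optimizer.zero_grad()
        loss = torch.mean((A @ x - y) ** 2)    # differentiable data-fidelity term
        loss.backward()
        # Strategy 1: gradient update cropping -- zero out gradients outside the
        # region the binary prior marks as object, so updates stay focused there.
        x.grad *= prior
        optimizer.step()

The plain Adam loop above only illustrates where the prior enters the differentiable refinement; the paper's framework performs this refinement within a known-operator learning architecture rather than as an unrolled least-squares fit.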