Neural-field-based image reconstruction for bioluminescence tomography

Bibliographic Details
Main Authors: Xuanxuan Zhang, Xu Cao, Jiulou Zhang, Lin Zhang, Guanglei Zhang
Format: Article
Language: English
Published: World Scientific Publishing 2025-01-01
Series: Journal of Innovative Optical Health Sciences
Subjects:
Online Access: https://www.worldscientific.com/doi/10.1142/S1793545825500026
Description
Summary: Deep learning (DL)-based image reconstruction methods have garnered increasing interest in recent years. Numerous studies demonstrate that DL-based reconstruction methods perform well in optical tomographic imaging techniques such as bioluminescence tomography (BLT). Nevertheless, nearly every existing DL-based method uses an explicit neural representation of the reconstruction problem, which either consumes a large amount of memory or requires various complicated computations. In this paper, we present a neural field (NF)-based image reconstruction scheme for BLT that uses an implicit neural representation. The proposed NF-based method establishes a mapping from the coordinate of an arbitrary spatial point to the source value at that point with a relatively lightweight multilayer perceptron, which offers remarkable computational efficiency. Another simple neural network, composed of two fully connected layers and a 1D convolutional layer, is used to generate the neural features. Results of simulations and experiments show that the proposed NF-based method achieves performance similar to the photon density complement network and the two-stage network while requiring fewer floating-point operations and fewer model parameters.
ISSN: 1793-5458
1793-7205
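
To make the architecture described in the summary more concrete, the following is a minimal, hypothetical PyTorch sketch of an implicit (neural-field) reconstruction: a small feature network with two fully connected layers and a 1D convolutional layer encodes the boundary measurements, and a lightweight MLP maps a spatial coordinate, conditioned on those features, to a source value. All layer widths, the measurement size n_meas, the feature dimension feat_dim, and the way features are combined with coordinates are assumptions made for illustration; they are not taken from the paper.

# Hypothetical sketch of the implicit-representation idea described in the
# abstract; dimensions and the conditioning scheme are illustrative only.
import torch
import torch.nn as nn

class FeatureNet(nn.Module):
    """Two fully connected layers followed by a 1D convolution that map a
    boundary-measurement vector to a compact feature code (sizes assumed)."""
    def __init__(self, n_meas=512, feat_dim=64):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(n_meas, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
        )
        self.conv = nn.Conv1d(1, feat_dim, kernel_size=128)  # -> (B, feat_dim, 1)

    def forward(self, y):                  # y: (B, n_meas)
        h = self.fc(y).unsqueeze(1)        # (B, 1, 128)
        return self.conv(h).squeeze(-1)    # (B, feat_dim)

class NeuralField(nn.Module):
    """Lightweight MLP mapping a 3D coordinate, conditioned on the feature
    code, to the bioluminescent source value at that point."""
    def __init__(self, feat_dim=64, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3 + feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords, feat):       # coords: (B, N, 3), feat: (B, feat_dim)
        feat = feat.unsqueeze(1).expand(-1, coords.shape[1], -1)
        return self.mlp(torch.cat([coords, feat], dim=-1)).squeeze(-1)  # (B, N)

# Usage: query the field at 1000 mesh-node coordinates for one measurement set.
y = torch.randn(1, 512)                    # simulated boundary measurements
coords = torch.rand(1, 1000, 3)            # normalized node coordinates
source = NeuralField()(coords, FeatureNet()(y))
print(source.shape)                        # torch.Size([1, 1000])

Because the field is queried per coordinate, memory does not grow with the size of an explicit voxel grid, which is the efficiency argument the abstract makes for the implicit representation.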