An Automated Marker-Less Registration Approach Using Neural Radiance Fields for Potential Use in Mixed Reality-Based Computer-Aided Surgical Navigation of Paranasal Sinus


Bibliographic Details
Main Authors: Suhyeon Kim, Hyeonji Kim, Younhyun Jung
Format: Article
Language: English
Published: MDPI AG 2024-12-01
Series: Computers
Subjects:
Online Access: https://www.mdpi.com/2073-431X/14/1/5
Description
Summary: Paranasal sinus surgery, a common treatment for chronic rhinosinusitis, requires exceptional precision due to the proximity of critical anatomical structures. To ensure accurate instrument control and clear visualization of the surgical site, surgeons utilize computer-aided surgical navigation (CSN). A key component of CSN is the registration process, which traditionally relies on manual or marker-based techniques. However, there is a growing shift toward marker-less registration methods. In previous work, we investigated a mesh-based registration approach using a Mixed Reality Head-Mounted Display (MR-HMD), specifically the Microsoft HoloLens 2. However, this method faced limitations, including depth holes and invalid values. These issues stemmed from the device’s low-resolution camera specifications and the 3D projection steps required to upscale RGB camera spaces. In this study, we propose a novel automated marker-less registration method leveraging Neural Radiance Field (NeRF) technology with an MR-HMD. To address the insufficient depth information of the previous approach, we utilize rendered-depth images generated by the trained NeRF model. We evaluated our method against two other techniques, including the prior mesh-based registration, using a facial phantom and three participants. The results demonstrate that our proposed method achieves at least a 0.873 mm (12%) improvement in registration accuracy over the other methods.
ISSN: 2073-431X