MRF-Mixer: A Simulation-Based Deep Learning Framework for Accelerated and Accurate Magnetic Resonance Fingerprinting Reconstruction

Bibliographic Details
Main Authors: Tianyi Ding, Yang Gao, Zhuang Xiong, Feng Liu, Martijn A. Cloos, Hongfu Sun
Format: Article
Language: English
Published: MDPI AG 2025-03-01
Series: Information
Subjects:
Online Access: https://www.mdpi.com/2078-2489/16/3/218
Description
Summary: MRF-Mixer is a novel deep learning method for magnetic resonance fingerprinting (MRF) reconstruction that offers 200× faster processing (0.35 s on CPU and 0.3 ms on GPU) and 40% higher accuracy (lower MAE) than dictionary matching. It uses a simulation-driven approach built on complex-valued multi-layer perceptrons (MLPs) and convolutional neural networks (CNNs) to process MRF data efficiently, enabling generalization across sequence and acquisition parameters and eliminating the need for extensive in vivo training data. Evaluation on simulated and in vivo data showed that MRF-Mixer outperforms dictionary matching and existing deep learning methods for T1 and T2 mapping. In six-shot simulations, it achieved the highest PSNR (T1: 33.48, T2: 35.9) and SSIM (T1: 0.98, T2: 0.98) and the lowest MAE (T1: 28.8, T2: 4.97) and RMSE (T1: 72.9, T2: 13.67). In vivo results further showed that single-shot reconstructions using MRF-Mixer matched the quality of multi-shot acquisitions, highlighting its potential to reduce scan times. These findings suggest that MRF-Mixer enables faster, more accurate multiparametric tissue mapping, substantially improving quantitative MRI for clinical applications by reducing acquisition time while maintaining image quality.
ISSN: 2078-2489
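
The summary above contrasts MRF-Mixer with conventional dictionary matching. As a rough illustration of that baseline (not code from the paper), the sketch below matches each voxel's measured signal evolution against a pre-computed dictionary by maximum correlation; the dictionary here is filled with random placeholder entries and illustrative T1/T2 ranges, whereas in practice it would contain Bloch-simulated fingerprints.

```python
# Hedged sketch of conventional MRF dictionary matching (the baseline).
# The dictionary entries, T1/T2 grids, and ranges below are placeholders,
# not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_entries, n_timepoints = 10_000, 500

# Placeholder dictionary: one complex fingerprint per (T1, T2) pair.
dictionary = (rng.standard_normal((n_entries, n_timepoints))
              + 1j * rng.standard_normal((n_entries, n_timepoints)))
t1_grid = rng.uniform(100, 3000, n_entries)   # ms, illustrative range
t2_grid = rng.uniform(10, 300, n_entries)     # ms, illustrative range

# Normalize entries so matching reduces to a maximum inner product.
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

def match(voxel_signal):
    """Return the (T1, T2) of the dictionary entry best correlated with the voxel."""
    voxel_signal = voxel_signal / np.linalg.norm(voxel_signal)
    scores = np.abs(dictionary.conj() @ voxel_signal)  # |<d_i, s>| for every entry
    best = int(np.argmax(scores))
    return t1_grid[best], t2_grid[best]

print(match(dictionary[42] * np.exp(1j * 0.3)))  # recovers entry 42's (T1, T2)
```

Because this exhaustive search is repeated for every voxel, its cost grows with dictionary size; this is the overhead the learned reconstruction is reported to avoid.

The summary also states that MRF-Mixer combines complex-valued multi-layer perceptrons with convolutional neural networks. The exact architecture is not given in this record, so the following PyTorch sketch is only a plausible arrangement under stated assumptions: a complex-valued MLP (implemented with paired real linear layers) acts per voxel along the fingerprint time dimension, and a small 2D CNN then mixes spatial information to regress T1 and T2 maps. Class names, layer sizes, and the 500-timepoint input length are assumptions made for illustration.

```python
# Hedged sketch of a complex-valued MLP followed by a CNN for T1/T2 regression.
# This is not the authors' implementation; sizes and names are illustrative.
import torch
import torch.nn as nn

class ComplexLinear(nn.Module):
    """Complex affine map via two real layers:
    (W_r + iW_i)(x_r + ix_i) = (W_r x_r - W_i x_i) + i(W_r x_i + W_i x_r)."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.real = nn.Linear(in_features, out_features)
        self.imag = nn.Linear(in_features, out_features)

    def forward(self, x_r, x_i):
        return (self.real(x_r) - self.imag(x_i),
                self.real(x_i) + self.imag(x_r))

class ToyMRFMixer(nn.Module):
    """Per-voxel complex MLP over time, then a 2D CNN over the image plane."""
    def __init__(self, n_timepoints=500, hidden=64):
        super().__init__()
        self.fc1 = ComplexLinear(n_timepoints, hidden)
        self.fc2 = ComplexLinear(hidden, hidden)
        self.cnn = nn.Sequential(
            nn.Conv2d(2 * hidden, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, kernel_size=3, padding=1),  # output channels: T1, T2
        )

    def forward(self, x):  # x: complex tensor of shape (batch, H, W, n_timepoints)
        x_r, x_i = self.fc1(x.real, x.imag)
        x_r, x_i = torch.relu(x_r), torch.relu(x_i)
        x_r, x_i = self.fc2(x_r, x_i)
        feats = torch.cat([x_r, x_i], dim=-1).permute(0, 3, 1, 2)  # (batch, 2*hidden, H, W)
        return self.cnn(feats)                                     # (batch, 2, H, W)

# Example: one 64x64 slice with 500-timepoint complex fingerprints.
maps = ToyMRFMixer()(torch.randn(1, 64, 64, 500, dtype=torch.cfloat))
print(maps.shape)  # torch.Size([1, 2, 64, 64])
```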