IRSnet: An Implicit Residual Solver and Its Unfolding Neural Network With 0.003M Parameters for Total Variation Models
Main Author: | |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2025-01-01 |
Series: | IEEE Access |
Subjects: | |
Online Access: | https://ieeexplore.ieee.org/document/10838572/ |
Summary: | Solving total variation problems is fundamentally important for many computer vision tasks, such as image smoothing, optical flow estimation, and 3D surface reconstruction. However, traditional iterative solvers require a large number of iterations to converge, while deep learning solvers have a huge number of parameters, hampering their practical deployment. To address these issues, this paper first introduces a novel iterative algorithm that is 6 to 75 times faster than previous iterative methods. The proposed iterative method converges, and it converges to the optimal solution; the former is theoretically guaranteed and the latter is numerically confirmed. Then, we generalize this algorithm to a compact implicit neural network that has only 0.003M parameters. The network is shown to be more effective and efficient. Thanks to its small number of parameters, the proposed network can be applied in a wide range of applications where total variation is imposed. The source code for the iterative solver and the neural network is publicly available at https://github.com/gyh8/IRS. |
ISSN: | 2169-3536 |
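
For context, the "total variation problems" referenced in the abstract are typically instances of the classical ROF-type objective. The sketch below shows only that generic formulation, not the paper's implicit residual solver, and the symbols $u$ (solution), $f$ (observation), and $\lambda$ (regularization weight) are illustrative assumptions rather than notation taken from the article:

$$
\min_{u}\ \frac{1}{2}\,\|u - f\|_2^2 \;+\; \lambda\,\mathrm{TV}(u),
\qquad
\mathrm{TV}(u) \;=\; \sum_{i,j} \sqrt{(\nabla_x u)_{i,j}^2 + (\nabla_y u)_{i,j}^2}.
$$

Iterative solvers minimize such an objective through repeated updates, and the cost of those iterations is what the paper's faster solver and its unfolded 0.003M-parameter network aim to reduce.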