VCAFusion: A Framework for Infrared and Low Light Visible Image Fusion Based on Visual Characteristics Adjustment


Bibliographic Details
Main Authors: Jiawen Li, Zhengzhong Huang, Jiapin Peng, Xiaochuan Zhang, Rongzhu Zhang
Format: Article
Language: English
Published: MDPI AG, 2025-06-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/11/6295
Description
Summary: Infrared (IR) and visible (VIS) image fusion enhances vision tasks by combining complementary data. However, most existing methods assume normal lighting conditions and thus perform poorly in low-light environments, where VIS images often lose critical texture details. To address this limitation, we propose VCAFusion, a novel approach for robust infrared and visible image fusion in low-light scenarios. Our framework incorporates an adaptive brightness adjustment model based on light reflection theory to mitigate illumination-induced degradation in nocturnal images. Additionally, we design an adaptive enhancement function inspired by human visual perception to recover weak texture details. To further improve fusion quality, we develop an edge-preserving multi-scale decomposition model and a saliency-preserving strategy, ensuring seamless integration of perceptual features. By effectively balancing low-light enhancement and fusion, our framework preserves both the intensity distribution and the fine texture details of salient objects. Extensive experiments on public datasets demonstrate that VCAFusion achieves superior fusion quality, closely aligning with human visual perception and outperforming state-of-the-art methods in both qualitative and quantitative evaluations.
ISSN: 2076-3417
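The abstract names two ingredients whose general shape is standard in the fusion literature: a brightness adjustment grounded in light reflection (Retinex) theory, and a multi-scale base/detail decomposition with a saliency-driven combination rule. The sketch below is NOT the paper's method; it is a minimal NumPy illustration of those two generic ideas, with a box filter standing in for the paper's edge-preserving decomposition and a per-pixel max standing in for its saliency-preserving strategy. All function and parameter names (`local_mean`, `retinex_adjust`, `fuse`, `gamma`, `w`) are hypothetical.

```python
import numpy as np

def local_mean(img, k=15):
    # Crude illumination / base-layer estimate via a separable box filter.
    # (A stand-in for the paper's edge-preserving multi-scale decomposition.)
    kernel = np.ones(k) / k
    smoothed = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, smoothed)

def retinex_adjust(vis, eps=1e-6, gamma=0.6):
    # Retinex-style adjustment: image = reflectance * illumination.
    # Applying a gamma < 1 to the estimated illumination brightens dark
    # regions while leaving the reflectance (texture) untouched.
    L = np.clip(local_mean(vis), eps, None)   # illumination estimate
    R = vis / L                               # reflectance estimate
    return np.clip(R * L ** gamma, 0.0, 1.0)

def fuse(ir, vis_enhanced, w=0.5):
    # Toy base/detail fusion: weighted average of base layers, per-pixel
    # magnitude-max of detail layers (a stand-in for saliency preservation).
    base_ir, base_v = local_mean(ir), local_mean(vis_enhanced)
    det_ir, det_v = ir - base_ir, vis_enhanced - base_v
    base = w * base_ir + (1 - w) * base_v
    detail = np.where(np.abs(det_ir) >= np.abs(det_v), det_ir, det_v)
    return np.clip(base + detail, 0.0, 1.0)
```

Under these assumptions, enhancing a dark VIS frame with `retinex_adjust` before fusing it with the IR frame raises its mean intensity without flattening local contrast, which is the qualitative behavior the abstract attributes to its adaptive brightness model.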