A quality assessment algorithm for no-reference images based on transfer learning

Bibliographic Details
Main Authors: Yang Yang, Chang Liu, Hui Wu, Dingguo Yu
Format: Article
Language: English
Published: PeerJ Inc. 2025-01-01
Series: PeerJ Computer Science
Online Access: https://peerj.com/articles/cs-2654.pdf
Description
Summary: Image quality assessment (IQA) plays a critical role in automatically detecting and correcting defects in images, thereby enhancing the overall performance of image processing and transmission systems. While research on reference-based IQA is well established, studies on no-reference IQA remain underdeveloped. In this article, we propose a novel no-reference IQA algorithm based on transfer learning (IQA-NRTL). The algorithm leverages a deep convolutional neural network (CNN) for its ability to capture the multi-scale semantic features that are essential for representing complex visual perception in images. These features are extracted by a visual perception module; an adaptive fusion network then integrates them, and a fully connected regression network correlates the fused semantic information with global semantic information to produce the final quality score. Experimental results on authentically distorted datasets (KonIQ-10k, BIQ2021), synthetically distorted datasets (LIVE, TID2013), and an artificial intelligence (AI)-generated content dataset (AGIQA-1K) show that IQA-NRTL significantly outperforms mainstream no-reference IQA algorithms, with gains that vary with image content and complexity.
ISSN: 2376-5992
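
The summary above describes the IQA-NRTL pipeline in three stages: a pretrained CNN extracts multi-scale semantic features, an adaptive fusion network combines them, and a fully connected regressor maps the fused representation to a quality score. The PyTorch sketch below is one plausible reading of that pipeline, not the authors' implementation: the ResNet-50 backbone, the common 256-dimensional projection width, and the learned softmax weights standing in for the "adaptive fusion network" are all illustrative assumptions.

import torch
import torch.nn as nn
from torchvision import models

class IQANRTLSketch(nn.Module):
    """Hypothetical sketch of the IQA-NRTL pipeline described in the
    abstract: a pretrained backbone (transfer learning) yields multi-scale
    features, an adaptive fusion stage combines them, and a fully
    connected head regresses a scalar quality score."""

    def __init__(self):
        super().__init__()
        # Transfer learning: ImageNet-pretrained ResNet-50 as the visual
        # perception module (the record does not state the actual backbone).
        backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1,
                                  backbone.relu, backbone.maxpool)
        self.stages = nn.ModuleList([backbone.layer1, backbone.layer2,
                                     backbone.layer3, backbone.layer4])
        dims = [256, 512, 1024, 2048]  # channel widths of the four stages
        # Project each scale to a common width so the scales can be fused.
        self.proj = nn.ModuleList([nn.Linear(d, 256) for d in dims])
        # Adaptive fusion, here assumed to be learned softmax weights over
        # the four scales; the paper's fusion network may differ.
        self.scale_logits = nn.Parameter(torch.zeros(4))
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Fully connected regression head producing one quality score.
        self.regressor = nn.Sequential(nn.Linear(256, 128), nn.ReLU(),
                                       nn.Linear(128, 1))

    def forward(self, x):
        feats = []
        x = self.stem(x)
        for stage, proj in zip(self.stages, self.proj):
            x = stage(x)
            # Global-pool each stage, then project to the shared width.
            feats.append(proj(self.pool(x).flatten(1)))  # (B, 256) per scale
        w = torch.softmax(self.scale_logits, dim=0)
        fused = sum(wi * f for wi, f in zip(w, feats))   # weighted fusion
        return self.regressor(fused).squeeze(1)          # (B,) scores

# Usage: score a batch of RGB images normalized for the ImageNet backbone.
model = IQANRTLSketch().eval()
with torch.no_grad():
    scores = model(torch.randn(2, 3, 224, 224))
print(scores.shape)  # torch.Size([2])

In practice such a model would be fine-tuned end to end against subjective quality labels (e.g., mean opinion scores from KonIQ-10k), which is what makes the transfer-learning setup attractive for no-reference assessment, where no pristine reference image is available.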