Multilevel Attention and Multiscale Feature Fusion Network for Author Classification of Chinese Ink-Wash Paintings


Bibliographic Details
Main Authors: Wei Jiang, Xianglian Meng, Ji Xi
Format: Article
Language: English
Published: Wiley 2022-01-01
Series: Discrete Dynamics in Nature and Society
Online Access: http://dx.doi.org/10.1155/2022/9188356
collection DOAJ
description How to effectively extract features with high representation ability has long been a research challenge for classification tasks. Most existing methods address the problem by using deep convolutional neural networks as feature extractors. Although a series of excellent network structures have been successful in Chinese ink-wash painting classification, most of them rely on simple augmentation of the network structure and direct fusion of features at different scales, which limits the network's ability to extract semantically rich and scale-invariant feature information and thus hinders further improvement in classification performance. In this paper, a novel model based on multilevel attention and multiscale feature fusion is proposed. The model first extracts three types of feature maps from the low-level, middle-level, and high-level layers of a pretrained deep neural network. The low-level and middle-level feature maps are then processed by a spatial attention module, while the high-level feature maps are processed by a scale invariance module to strengthen their scale-invariance properties. A conditional random field module fuses the three optimized feature maps, and a channel attention module further refines the fused features. Finally, a multilevel deep supervision strategy is used to optimize the model for better performance. Extensive experiments on the Chinese ink-wash painting dataset created in this work show that the model outperforms other mainstream methods.
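The pipeline described in the abstract (spatial attention on the shallower feature maps, fusion of the three scales, then channel attention on the result) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the learned convolution and fully connected layers of real attention modules are replaced by fixed pooling operations, the conditional random field fusion is replaced by simple summation, and all function names are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def spatial_attention(fmap):
    # fmap: (C, H, W). Build a (1, H, W) mask from channel-wise
    # average and max pooling; a learned conv would normally follow.
    avg = fmap.mean(axis=0, keepdims=True)
    mx = fmap.max(axis=0, keepdims=True)
    return fmap * sigmoid(avg + mx)

def channel_attention(fmap):
    # Squeeze-and-excitation-style gate: global average pool per
    # channel, then a fixed nonlinearity standing in for learned FCs.
    squeeze = fmap.mean(axis=(1, 2))              # (C,)
    gate = sigmoid(squeeze - squeeze.mean())      # (C,)
    return fmap * gate[:, None, None]

def fuse(low, mid, high):
    # Toy stand-in for the CRF-based fusion: spatially attended
    # low/mid maps plus the high-level map, summed, then refined
    # per channel.
    fused = spatial_attention(low) + spatial_attention(mid) + high
    return channel_attention(fused)

rng = np.random.default_rng(0)
low, mid, high = (rng.standard_normal((8, 4, 4)) for _ in range(3))
out = fuse(low, mid, high)
print(out.shape)  # (8, 4, 4)
```

In the paper's actual model the attention masks and channel gates are learned end to end, and the three feature maps come from different depths of a pretrained backbone (so they would need resizing before fusion); the sketch only shows how the attention and fusion stages compose.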
id doaj-art-df7273ed830a4001843caeb4c4682bbc
institution Kabale University
issn 1607-887X
Author affiliations: School of Computer Information and Engineering (all three authors).