Chrysanthemum classification method integrating deep visual features from both the front and back sides

Bibliographic Details
Main Authors: Yifan Chen, Xichen Yang, Hui Yan, Jia Liu, Jian Jiang, Zhongyuan Mao, Tianshu Wang
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-01-01
Series: Frontiers in Plant Science
Subjects:
Online Access: https://www.frontiersin.org/articles/10.3389/fpls.2024.1463113/full
_version_ 1832592429456293888
author Yifan Chen
Xichen Yang
Hui Yan
Jia Liu
Jian Jiang
Zhongyuan Mao
Tianshu Wang
author_facet Yifan Chen
Xichen Yang
Hui Yan
Jia Liu
Jian Jiang
Zhongyuan Mao
Tianshu Wang
author_sort Yifan Chen
collection DOAJ
description Introduction: Chrysanthemum morifolium Ramat (hereinafter referred to as Chrysanthemum) is one of the most beloved and economically valuable Chinese herbal crops; it contains abundant medicinal ingredients and has wide application prospects. Therefore, identifying the classification and origin of Chrysanthemum is important for producers, consumers, and market regulators. Existing Chrysanthemum classification methods mostly rely on subjective visual identification, are time-consuming, and often require expensive equipment. Methods: A novel method is proposed to accurately identify Chrysanthemum classification in a swift, non-invasive, and non-contact way. The proposed method is based on the fusion of deep visual features from both the front and back sides. First, images of different Chrysanthemums are collected and labeled with their origins and classifications. Second, the background area, which carries little useful information, is removed by image preprocessing. Third, a two-stream feature extraction network is designed whose two inputs are the preprocessed front and back Chrysanthemum images. Single-stream residual connections and cross-stream residual connections are incorporated to extend the receptive field of the network and fully fuse the features from both the front and back sides. Results: Experimental results demonstrate that the proposed method achieves an accuracy of 93.8%, outperforming existing methods and exhibiting superior stability. Discussion: The proposed method provides an effective and dependable solution for identifying Chrysanthemum classification and origin while offering practical benefits for quality assurance in production, consumer markets, and regulatory processes. Code and data are available at https://github.com/dart-into/CCMIFB.
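The two-stream design summarized in the description (two inputs, single-stream and cross-stream residual connections, fused front/back features) can be illustrated with a minimal sketch. The sketch below assumes PyTorch; the class names (CrossStreamBlock, TwoStreamNet), layer widths, additive cross-stream residuals, and the concatenation-based fusion head are illustrative assumptions, not the authors' released implementation (see https://github.com/dart-into/CCMIFB for the actual code).

# Minimal two-stream sketch with single-stream and cross-stream residual
# connections. Assumes PyTorch; all names and sizes are illustrative.
import torch
import torch.nn as nn


class CrossStreamBlock(nn.Module):
    """One stage: each stream keeps a single-stream residual connection
    and additionally receives a residual connection from the other stream."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv_front = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.conv_back = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, front, back):
        f = self.conv_front(front)
        b = self.conv_back(back)
        # single-stream residual (front + f) plus cross-stream residual (+ back)
        front_out = front + f + back
        back_out = back + b + front
        return front_out, back_out


class TwoStreamNet(nn.Module):
    """Two inputs (preprocessed front and back images), fused features, one classifier."""

    def __init__(self, num_classes: int, channels: int = 32, num_blocks: int = 3):
        super().__init__()
        self.stem_front = nn.Conv2d(3, channels, 7, stride=2, padding=3)
        self.stem_back = nn.Conv2d(3, channels, 7, stride=2, padding=3)
        self.blocks = nn.ModuleList([CrossStreamBlock(channels) for _ in range(num_blocks)])
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(2 * channels, num_classes)

    def forward(self, front_img, back_img):
        f = self.stem_front(front_img)
        b = self.stem_back(back_img)
        for block in self.blocks:
            f, b = block(f, b)
        # fuse both streams by concatenating their pooled features
        fused = torch.cat([self.pool(f).flatten(1), self.pool(b).flatten(1)], dim=1)
        return self.classifier(fused)


if __name__ == "__main__":
    model = TwoStreamNet(num_classes=10)
    front = torch.randn(2, 3, 224, 224)
    back = torch.randn(2, 3, 224, 224)
    print(model(front, back).shape)  # torch.Size([2, 10])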
format Article
id doaj-art-b311b1f9a81e42dd97393b47883814e2
institution Kabale University
issn 1664-462X
language English
publishDate 2025-01-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Plant Science
spelling doaj-art-b311b1f9a81e42dd97393b47883814e2 (indexed 2025-01-21T08:36:48Z)
Language: English
Published: Frontiers Media S.A., Frontiers in Plant Science (ISSN 1664-462X), 2025-01-01, vol. 15, article 1463113, DOI 10.3389/fpls.2024.1463113
Title: Chrysanthemum classification method integrating deep visual features from both the front and back sides
Authors and affiliations:
Yifan Chen: School of Computer and Electronic Information/School of Artificial Intelligence, Nanjing Normal University, Nanjing, Jiangsu, China
Xichen Yang: School of Computer and Electronic Information/School of Artificial Intelligence, Nanjing Normal University, Nanjing, Jiangsu, China
Hui Yan: Nanjing University of Chinese Medicine, National and Local Collaborative Engineering Center of Chinese Medicinal Resources Industrialization and Formulae Innovative Medicine, Nanjing, China; Jiangsu Collaborative Innovation Center of Chinese Medicinal Resources Industrialization, Nanjing University of Chinese Medicine, Nanjing, Jiangsu, China
Jia Liu: College of Artificial Intelligence and Information Technology, Nanjing University of Chinese Medicine, Nanjing, Jiangsu, China
Jian Jiang: School of Computer and Electronic Information/School of Artificial Intelligence, Nanjing Normal University, Nanjing, Jiangsu, China
Zhongyuan Mao: School of Computer and Electronic Information/School of Artificial Intelligence, Nanjing Normal University, Nanjing, Jiangsu, China
Tianshu Wang: College of Artificial Intelligence and Information Technology, Nanjing University of Chinese Medicine, Nanjing, Jiangsu, China; Jiangsu Province Engineering Research Center of Traditional Chinese Medicine (TCM) Intelligence Health Service, Nanjing University of Chinese Medicine, Nanjing, Jiangsu, China
Abstract: as given in the description field above. Code and data are available at https://github.com/dart-into/CCMIFB.
Full text: https://www.frontiersin.org/articles/10.3389/fpls.2024.1463113/full
Keywords: Chrysanthemum classification; two-stream network; visual information; feature fusion; deep learning
spellingShingle Yifan Chen
Xichen Yang
Hui Yan
Jia Liu
Jian Jiang
Zhongyuan Mao
Tianshu Wang
Chrysanthemum classification method integrating deep visual features from both the front and back sides
Frontiers in Plant Science
Chrysanthemum classification
two-stream network
visual information
feature fusion
deep learning
title Chrysanthemum classification method integrating deep visual features from both the front and back sides
title_full Chrysanthemum classification method integrating deep visual features from both the front and back sides
title_fullStr Chrysanthemum classification method integrating deep visual features from both the front and back sides
title_full_unstemmed Chrysanthemum classification method integrating deep visual features from both the front and back sides
title_short Chrysanthemum classification method integrating deep visual features from both the front and back sides
title_sort chrysanthemum classification method integrating deep visual features from both the front and back sides
topic Chrysanthemum classification
two-stream network
visual information
feature fusion
deep learning
url https://www.frontiersin.org/articles/10.3389/fpls.2024.1463113/full
work_keys_str_mv AT yifanchen chrysanthemumclassificationmethodintegratingdeepvisualfeaturesfromboththefrontandbacksides
AT xichenyang chrysanthemumclassificationmethodintegratingdeepvisualfeaturesfromboththefrontandbacksides
AT huiyan chrysanthemumclassificationmethodintegratingdeepvisualfeaturesfromboththefrontandbacksides
AT jialiu chrysanthemumclassificationmethodintegratingdeepvisualfeaturesfromboththefrontandbacksides
AT jianjiang chrysanthemumclassificationmethodintegratingdeepvisualfeaturesfromboththefrontandbacksides
AT zhongyuanmao chrysanthemumclassificationmethodintegratingdeepvisualfeaturesfromboththefrontandbacksides
AT tianshuwang chrysanthemumclassificationmethodintegratingdeepvisualfeaturesfromboththefrontandbacksides