Manual Acupuncture Manipulation Recognition Method via Interactive Fusion of Spatial Multiscale Motion Features

Bibliographic Details
Main Authors: Jiyu He, Chong Su, Jie Chen, Jinniu Li, Jingwen Yang, Cunzhi Liu
Format: Article
Language: English
Published: Wiley, 2024-01-01
Series: IET Signal Processing
ISSN: 1751-9683
Online Access: http://dx.doi.org/10.1049/2024/2124139
Author Affiliations: Jiyu He, Chong Su (College of Information Science and Technology); Jie Chen, Jinniu Li (Department of Traditional Chinese Medicine); Jingwen Yang, Cunzhi Liu (College of Acupuncture-Moxibustion and Tuina)
Description: Manual acupuncture manipulation (MAM) is essential in traditional Chinese medicine treatment. MAM action recognition is important for the training and education of junior acupuncturists; however, expert acupuncturists show marked individual differences in hand gesture for the same type of MAM. In addition, during MAM operations, changes in an expert acupuncturist’s hand shape and in the relative positions of the needle-holding fingers are small in magnitude and rapid, which makes the details of MAM actions difficult to observe. We therefore propose a Spatial Multiscale Interactive Fusion MAM Recognition Network to address these difficulties. First, this paper presents an optical-flow-based method for extracting global hand-motion contour features of the acupuncture hand shape. Second, to capture the motion pattern of the needle-holding fingers during MAM operations, we design a quantitative description of their relative motion together with an “interactive attention module,” which fuses features and mines the correlations between different scales of MAM action features. Finally, the proposed MAM recognition method was validated on MAM video signals collected from 20 acupuncturists at the Beijing University of Traditional Chinese Medicine and 10 at Beijing Zhongguancun Hospital. The method achieves the highest average validation accuracy of 95.3% and the highest test accuracy of 96.0% on four typical MAMs, demonstrating its feasibility and effectiveness.
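
The description outlines two technical ingredients: dense optical flow as a global descriptor of hand-motion contours, and an interactive attention module that fuses motion features across spatial scales. The sketch below is only an illustration of those two ideas, not the authors' implementation: it pairs OpenCV's Farneback optical flow with a small PyTorch cross-attention fusion module, and every name and hyperparameter in it (InteractiveAttentionFusion, embed_dim, the head count, and so on) is an assumption made for this example.

```python
# Minimal sketch, assuming a generic two-scale setup; not the paper's released code.
import cv2
import numpy as np
import torch
import torch.nn as nn


def optical_flow_features(prev_gray: np.ndarray, curr_gray: np.ndarray) -> np.ndarray:
    """Dense Farneback flow between two grayscale frames, as a (H, W, 2) motion map."""
    # Positional args: flow=None, pyr_scale=0.5, levels=3, winsize=15,
    # iterations=3, poly_n=5, poly_sigma=1.2, flags=0
    return cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)


class InteractiveAttentionFusion(nn.Module):
    """Cross-attention between a coarse (hand-shape) and a fine (fingertip) feature map.

    Each scale attends to the other, and the two attended streams are concatenated,
    so correlations between scales are mined rather than simply summed.
    """

    def __init__(self, channels: int, embed_dim: int = 128, heads: int = 4):
        super().__init__()
        self.proj_coarse = nn.Conv2d(channels, embed_dim, kernel_size=1)
        self.proj_fine = nn.Conv2d(channels, embed_dim, kernel_size=1)
        self.attn_c2f = nn.MultiheadAttention(embed_dim, heads, batch_first=True)
        self.attn_f2c = nn.MultiheadAttention(embed_dim, heads, batch_first=True)
        self.out = nn.Linear(2 * embed_dim, embed_dim)

    @staticmethod
    def _tokens(x: torch.Tensor) -> torch.Tensor:
        # (B, C, H, W) -> (B, H*W, C) token sequence for attention
        return x.flatten(2).transpose(1, 2)

    def forward(self, coarse: torch.Tensor, fine: torch.Tensor) -> torch.Tensor:
        qc = self._tokens(self.proj_coarse(coarse))
        qf = self._tokens(self.proj_fine(fine))
        c_att, _ = self.attn_c2f(qc, qf, qf)  # coarse queries attend to fine keys/values
        f_att, _ = self.attn_f2c(qf, qc, qc)  # fine queries attend to coarse keys/values
        # Pool each stream to a clip-level vector and fuse for a downstream classifier head.
        fused = torch.cat([c_att.mean(dim=1), f_att.mean(dim=1)], dim=-1)
        return self.out(fused)  # (B, embed_dim)


if __name__ == "__main__":
    # Toy usage: random feature maps standing in for CNN outputs at two spatial scales.
    fusion = InteractiveAttentionFusion(channels=64)
    coarse = torch.randn(2, 64, 14, 14)
    fine = torch.randn(2, 64, 28, 28)
    print(fusion(coarse, fine).shape)  # torch.Size([2, 128])
```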