Double Attention: An Optimization Method for the Self-Attention Mechanism Based on Human Attention
Artificial intelligence, with its remarkable adaptability, has gradually integrated into daily life. The emergence of the self-attention mechanism has propelled the Transformer architecture into diverse fields, including a role as an efficient and precise diagnostic and predictive tool in medicine. To enhance accuracy, we propose the Double-Attention (DA) method, which improves the neural network's biomimetic modeling of human attention. By incorporating matrices generated from shifted images into the self-attention mechanism, the network gains the ability to preemptively acquire information from surrounding regions. Experimental results demonstrate the superior performance of our approach across various benchmark datasets, validating its effectiveness. Furthermore, the method was applied to patient kidney datasets collected from hospitals for diabetes diagnosis, where it achieved high accuracy with significantly reduced computational demands. This advancement showcases the potential of our method in the field of biomimetics, aligning well with the goals of developing innovative bioinspired diagnostic tools.
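The abstract only outlines the mechanism, but its core idea, adding an attention matrix computed from a shifted copy of the input image to the ordinary self-attention matrix so each patch "pre-attends" to its surrounding region, can be sketched in a few lines of PyTorch. The module name, layer names, patch size, and shift amount below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the idea described in the abstract: fuse standard
# self-attention over image patches with an attention map computed from a
# spatially shifted copy of the same image. Names and hyperparameters are
# assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DoubleAttentionSketch(nn.Module):
    def __init__(self, dim: int, patch: int = 4, shift: int = 2):
        super().__init__()
        self.shift, self.scale = shift, dim ** -0.5
        self.embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)  # patchify
        self.qkv = nn.Linear(dim, 3 * dim)       # Q, K, V from the original image
        self.qk_shift = nn.Linear(dim, 2 * dim)  # Q, K from the shifted image
        self.proj = nn.Linear(dim, dim)

    def tokens(self, img: torch.Tensor) -> torch.Tensor:
        # (B, 3, H, W) -> (B, N, dim) patch tokens
        return self.embed(img).flatten(2).transpose(1, 2)

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        # Attention scores from the original image.
        x = self.tokens(img)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = (q @ k.transpose(-2, -1)) * self.scale

        # Attention scores from a cyclically shifted copy (cf. the "shifted
        # window" keyword): the same scene slightly displaced, giving each
        # patch a preview of its neighbourhood.
        shifted = torch.roll(img, shifts=(self.shift, self.shift), dims=(2, 3))
        qs, ks = self.qk_shift(self.tokens(shifted)).chunk(2, dim=-1)
        attn_shift = (qs @ ks.transpose(-2, -1)) * self.scale

        # "Double attention": combine the two attention matrices before softmax.
        weights = F.softmax(attn + attn_shift, dim=-1)
        return self.proj(weights @ v)


if __name__ == "__main__":
    x = torch.randn(2, 3, 32, 32)          # toy batch of RGB images
    out = DoubleAttentionSketch(dim=64)(x)
    print(out.shape)                       # torch.Size([2, 64, 64])
```

How the two attention matrices are actually fused (summation, gating, separate heads) is a design choice the abstract does not specify; the additive fusion above is just one plausible reading.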
| Field | Value |
|---|---|
| Main Authors | Zeyu Zhang, Bin Li, Chenyang Yan, Kengo Furuichi, Yuki Todo |
| Format | Article |
| Language | English |
| Published | MDPI AG, 2025-01-01 |
| Series | Biomimetics, Vol. 10, No. 1, Article 34 |
| ISSN | 2313-7673 |
| DOI | 10.3390/biomimetics10010034 |
| Subjects | self-attention; human attention; deep learning; shifted window; medical image |
| Online Access | https://www.mdpi.com/2313-7673/10/1/34 |
Author affiliations:

- Zeyu Zhang, Bin Li, Chenyang Yan: Division of Electrical Engineering and Computer Science, Kanazawa University, Kanazawa 9201192, Japan
- Kengo Furuichi: Department of Nephrology, Kanazawa Medical University, Kahoku 9200293, Japan
- Yuki Todo: Faculty of Electrical, Information and Communication Engineering, Kanazawa University, Kanazawa 9201192, Japan