Showing 221 - 240 results of 9,539 for search '"attention"', query time: 0.06s
222. Caption Generation Based on Emotions Using CSPDenseNet and BiLSTM with Self-Attention
    by Kavi Priya S, Pon Karthika K, Jayakumar Kaliappan, Senthil Kumaran Selvaraj, Nagalakshmi R, Baye Molla
    Published 2022-01-01
    “…The decoding unit employs a self-attention mechanism encompassed with BiLSTM to create more descriptive and relevant captions in natural language. …”
    Article (full text available)
224. MultiChem: predicting chemical properties using multi-view graph attention network
    by Heesang Moon, Mina Rho
    Published 2025-01-01
    “…In our model, graph attention layers are employed to effectively capture essential local structures by jointly considering atom and bond features, while multi-head attention layers extract important global features. …”
    Article (full text available)
227. A listening advantage for native speech is reflected by attention-related activity in auditory cortex
    by Meng Liang, Johannes Gerwien, Alexander Gutschalk
    Published 2025-02-01
    “…Here we test the hypothesis that attentional enhancement in auditory cortex is stronger for native speech, using magnetoencephalography. …”
    Article (full text available)
228. CNFA: ConvNeXt Fusion Attention Module for Age Recognition of the Tangerine Peel
    by Fuqin Deng, Junwei Li, Lanhui Fu, Chuanbo Qin, Yikui Zhai, Hongmin Wang, Ningbo Yi, Nannan Li, TinLun Lam
    Published 2024-01-01
    “…This work investigates the automatic age recognition of the tangerine peel based on deep learning and attention mechanisms. We proposed an effective ConvNeXt fusion attention module (CNFA), which consists of three parts, a ConvNeXt block for extracting low-level features’ information and aggregating hierarchical features, a channel squeeze-and-excitation (cSE) block and a spatial squeeze-and-excitation (sSE) block for generating sufficient high-level feature information from both channel and spatial dimensions. …”
    Article (full text available)
230. Multi-channel spatio-temporal graph attention contrastive network for brain disease diagnosis
    by Chaojun Li, Kai Ma, Shengrong Li, Xiangshui Meng, Ran Wang, Daoqiang Zhang, Qi Zhu
    Published 2025-02-01
    “…Second, we develop a multi-channel spatial attention contrastive network to extract topological features from the brain network within each time window. …”
    Article (full text available)
231. An Effective Self-Attention-Based Hybrid Model for Short-Term Traffic Flow Prediction
    by Zhihong Li, Xiaoyu Wang, Kairan Yang
    Published 2023-01-01
    “…Our new hybrid model gives a higher accuracy than the support vector regression (SVR) model, LSTM neural network-attention (LSTM-attention) model, and temporal convolutional network (TCN) model. …”
    Article (full text available)
233. DMSS: An Attention-Based Deep Learning Model for High-Quality Mass Spectrometry Prediction
    by Yihui Ren, Yu Wang, Wenkai Han, Yikang Huang, Xiaoyang Hou, Chunming Zhang, Dongbo Bu, Xin Gao, Shiwei Sun
    Published 2024-09-01
    “…In this study, we introduce Deep MS Simulator (DMSS), a novel attention-based model tailored for forecasting theoretical spectra in mass spectrometry. …”
    Article (full text available)
235. Articulatory-to-Acoustic Conversion Using BiLSTM-CNN Word-Attention-Based Method
    by Guofeng Ren, Guicheng Shao, Jianmei Fu
    Published 2020-01-01
    “…By considering the graphical representation of the articulators’ motion, this study combined Bidirectional Long Short-Term Memory (BiLSTM) with convolution neural network (CNN) and adopted the idea of word attention in Mandarin to extract semantic features. …”
    Article (full text available)