Showing 321 - 340 results of 13,183 for search '"attention"', query time: 0.08s
  1. 321

    Seeing without the Occipito-Parietal Cortex: Simultagnosia as a Shrinkage of the Attentional Visual Field by François Michel, Marie-Anne Henaff

    Published 2004-01-01
    “…An experimental approach with tasks testing visuo-spatial attention demonstrated a shrinkage of A.T.’s attentional visual field. …”
    Article
  5. 325

    Caption Generation Based on Emotions Using CSPDenseNet and BiLSTM with Self-Attention by Kavi Priya S, Pon Karthika K, Jayakumar Kaliappan, Senthil Kumaran Selvaraj, Nagalakshmi R, Baye Molla

    Published 2022-01-01
    “…The decoding unit employs a self-attention mechanism encompassed with BiLSTM to create more descriptive and relevant captions in natural language. …”
    Article
  7. 327

    MultiChem: predicting chemical properties using multi-view graph attention network by Heesang Moon, Mina Rho

    Published 2025-01-01
    “…In our model, graph attention layers are employed to effectively capture essential local structures by jointly considering atom and bond features, while multi-head attention layers extract important global features. …”
    Article
  11. 331

    Energy consumption prediction using modified deep CNN-Bi LSTM with attention mechanism by Adel Binbusayyis, Mohemmed Sha

    Published 2025-01-01
    “…Deep CNN extracts features impacting energy consumption whereas Bi-LSTM with attention layer finds suitability for regression as it is capable of modelling irregular trends in the time-series components, where the attention mechanism is implemented to enhance the decoder's ability to selectively focus on the most relevant segments of the input sequence. …”
    Article
  13. 333

    A listening advantage for native speech is reflected by attention-related activity in auditory cortex by Meng Liang, Johannes Gerwien, Alexander Gutschalk

    Published 2025-02-01
    “…Here we test the hypothesis that attentional enhancement in auditory cortex is stronger for native speech, using magnetoencephalography. …”
    Article
  14. 334

    CNFA: ConvNeXt Fusion Attention Module for Age Recognition of the Tangerine Peel by Fuqin Deng, Junwei Li, Lanhui Fu, Chuanbo Qin, Yikui Zhai, Hongmin Wang, Ningbo Yi, Nannan Li, TinLun Lam

    Published 2024-01-01
    “…This work investigates the automatic age recognition of the tangerine peel based on deep learning and attention mechanisms. We proposed an effective ConvNeXt fusion attention module (CNFA), which consists of three parts, a ConvNeXt block for extracting low-level features’ information and aggregating hierarchical features, a channel squeeze-and-excitation (cSE) block and a spatial squeeze-and-excitation (sSE) block for generating sufficient high-level feature information from both channel and spatial dimensions. …”
    Article
  16. 336

    Multi-channel spatio-temporal graph attention contrastive network for brain disease diagnosis by Chaojun Li, Kai Ma, Shengrong Li, Xiangshui Meng, Ran Wang, Daoqiang Zhang, Qi Zhu

    Published 2025-02-01
    “…Second, we develop a multi-channel spatial attention contrastive network to extract topological features from the brain network within each time window. …”
    Article
  17. 337

    Multiscale attention network via topology learning for cerebral vessel segmentation in angiography images by Tao Han, Junchen Xiong, Tingyi Lin, Tao An, Cheng Wang, Jianjun Zhu, Zhongliang Li, Ligong Lu, Yi Zhang, Gao-Jun Teng

    Published 2024-06-01
    “…This method employs a Multiscale Squeeze Attention (MSA) module for channel-wise attention learning, extracting multiscale attention feature maps from angiographic images. …”
    Article
  18. 338

    An Effective Self-Attention-Based Hybrid Model for Short-Term Traffic Flow Prediction by Zhihong Li, Xiaoyu Wang, Kairan Yang

    Published 2023-01-01
    “…Our new hybrid model gives a higher accuracy than the support vector regression (SVR) model, LSTM neural network-attention (LSTM-attention) model, and temporal convolutional network (TCN) model. …”
    Article
  20. 340

    DMSS: An Attention-Based Deep Learning Model for High-Quality Mass Spectrometry Prediction by Yihui Ren, Yu Wang, Wenkai Han, Yikang Huang, Xiaoyang Hou, Chunming Zhang, Dongbo Bu, Xin Gao, Shiwei Sun

    Published 2024-09-01
    “…In this study, we introduce Deep MS Simulator (DMSS), a novel attention-based model tailored for forecasting theoretical spectra in mass spectrometry. …”
    Article