Showing 241 - 260 results of 13,183 for search '"attention"', query time: 0.07s
  1. 241

    Key n-Gram Extractions and Analyses of Different Registers Based on Attention Network by Haiyan Wu, Ying Liu, Shaoyun Shi, Qingfeng Wu, Yunlong Huang

    Published 2021-01-01
    “…By summarizing the advantages and disadvantages of existing models, we propose a novel key n-gram extraction model “attentive n-gram network” (ANN) based on the attention mechanism and multilayer perceptron, in which the attention mechanism scores each n-gram in a sentence by mining the internal semantic relationship between words, and their importance is given by the scores. …”
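    As a rough illustration of the kind of mechanism this snippet describes, and not the ANN model from the cited paper, the sketch below scores a sentence's bigrams with a toy additive-attention layer and ranks them by weight. The embeddings, weight matrices, and scoring form are all assumptions made for illustration.

```python
# Hypothetical sketch only (not the cited ANN model): score the bigrams of a
# sentence with a toy additive-attention layer, then rank them. Embeddings and
# weights are random, so only the mechanics carry over.
import numpy as np

rng = np.random.default_rng(0)

def ngrams(tokens, n):
    """Return all contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def attention_weights(vectors, W, v):
    """Additive attention: logit_i = v . tanh(W x_i), softmax-normalised."""
    logits = np.array([v @ np.tanh(W @ x) for x in vectors])
    logits -= logits.max()                       # numerical stability
    return np.exp(logits) / np.exp(logits).sum()

tokens = "attention guides the network to the informative spans".split()
embed = {t: rng.normal(size=8) for t in tokens}  # toy 8-d word embeddings

grams = ngrams(tokens, 2)
gram_vecs = [np.mean([embed[t] for t in g], axis=0) for g in grams]  # mean-pool each bigram

W = rng.normal(size=(8, 8))
v = rng.normal(size=8)
weights = attention_weights(gram_vecs, W, v)

# The highest-weighted n-grams would be kept as the "key" n-grams.
for g, s in sorted(zip(grams, weights), key=lambda p: -p[1])[:3]:
    print(" ".join(g), round(float(s), 3))
```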
  2. 242

    Attention and sentiment of Chinese public toward rural landscape based on Sina Weibo by Jinji Zhang, Guanghu Jin, Yang Liu, Xiyue Xue

    Published 2024-06-01
    “…The research reveals that the Chinese public’s attention to rural landscapes has significantly increased with the evolution of government governance concepts. …”
  3. 243
  4. 244
  5. 245

    An Attention-Based Multidimensional Fault Information Sharing Framework for Bearing Fault Diagnosis by Yunjin Hu, Qingsheng Xie, Xudong Yang, Hai Yang, Yizong Zhang

    Published 2025-01-01
    “…Aiming at the above problems, this paper proposes an Attention-based Multidimensional Fault Information Sharing (AMFIS) framework, which aims to overcome the difficulties of multidimensional bearing fault diagnosis in a small sample environment. …”
  6. 246

    Enhancing bowel sound recognition with self-attention and self-supervised pre-training. by Yansuo Yu, Mingwu Zhang, Zhennian Xie, Qiang Liu

    Published 2024-01-01
    “…Our approach integrates the Branchformer architecture, which leverages the power of self-attention and convolutional gating for robust feature extraction, with a self-supervised pre-training strategy. …”
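    For readers unfamiliar with the architecture named in the snippet, below is a heavily simplified, hypothetical two-branch block in the spirit of Branchformer: a global self-attention branch and a local convolution-gated branch whose outputs are merged. It is not the cited paper's implementation; the layer sizes, merge strategy, and kernel width are assumptions.

```python
# Hypothetical, heavily simplified two-branch block inspired by the Branchformer
# idea (self-attention branch + convolution-gated branch); not the cited paper's
# code, and every size/choice here is an assumption.
import torch
import torch.nn as nn

class TwoBranchBlock(nn.Module):
    def __init__(self, dim=64, heads=4, kernel=3):
        super().__init__()
        self.norm_a = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm_c = nn.LayerNorm(dim)
        self.expand = nn.Linear(dim, 2 * dim)            # pointwise expansion
        self.dwconv = nn.Conv1d(dim, dim, kernel, padding=kernel // 2, groups=dim)
        self.project = nn.Linear(dim, dim)
        self.merge = nn.Linear(2 * dim, dim)

    def forward(self, x):                                # x: (batch, time, dim)
        a = self.norm_a(x)
        a, _ = self.attn(a, a, a)                        # global self-attention branch
        u, g = self.expand(self.norm_c(x)).chunk(2, dim=-1)
        g = self.dwconv(g.transpose(1, 2)).transpose(1, 2)   # local depthwise conv
        c = self.project(u * g)                          # convolutional gating
        return x + self.merge(torch.cat([a, c], dim=-1))     # merge branches, residual

x = torch.randn(2, 50, 64)                               # e.g. 2 clips, 50 frames of features
print(TwoBranchBlock()(x).shape)                         # torch.Size([2, 50, 64])
```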
  7. 247

    Efficient Lane Detection Technique Based on Lightweight Attention Deep Neural Network by Zhiting Yao, Xiyuan Chen

    Published 2022-01-01
    “…Based on the attributes of disparate feature resolution characteristics, different attention mechanisms are adopted to guide the network to effectively exploit the model parameters. …”
  8. 248
  9. 249

    Multi scale multi attention network for blood vessel segmentation in fundus images by Giri Babu Kande, Madhusudana Rao Nalluri, R. Manikandan, Jaehyuk Cho, Sathishkumar Veerappampalayam Easwaramoorthy

    Published 2025-01-01
    “…In this paper, we introduce a novel approach, the MSMA Net model, which overcomes these challenges by replacing traditional convolution blocks and skip connections with an improved multi-scale squeeze and excitation block (MSSE Block) and Bottleneck residual paths (B-Res paths) with spatial attention blocks (SAB). Our experimental findings on publicly available datasets of fundus images, specifically DRIVE, STARE, CHASE_DB1, HRF and DR HAGIS consistently demonstrate that our approach outperforms other segmentation techniques, achieving higher accuracy, sensitivity, Dice score, and area under the receiver operator characteristic (AUC) in the segmentation of blood vessels with different thicknesses, even in situations involving diverse contextual information, the presence of coexisting lesions, and intricate vessel morphologies.…”
  10. 250

    Medical image segmentation based on frequency domain decomposition SVD linear attention by Liu Qiong, Li Chaofan, Teng Jinnan, Chen Liping, Song Jianxiang

    Published 2025-01-01
    “…During attention feature computation, we introduce Singular Value Decomposition (SVD) to extract an effective representation matrix from the original image, which is then applied in the attention computation process for linear projection. …”
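    The following numpy sketch illustrates, under assumptions, the general idea the snippet describes: a truncated SVD of the feature matrix supplies the linear projection for queries, keys, and values, and attention is then evaluated in a linearised form that avoids the N x N matrix. The rank k, the elu+1 feature map, and the shared projection are illustrative choices, not the cited paper's formulation.

```python
# Hypothetical sketch, not the cited formulation: a truncated SVD of the feature
# matrix provides the linear projection for Q/K/V, and attention is computed in a
# linearised form (no N x N matrix). Rank, feature map, and shapes are assumptions.
import numpy as np

rng = np.random.default_rng(1)

N, d, k = 64, 32, 8                          # pixels/tokens, feature dim, kept rank
X = rng.normal(size=(N, d))                  # flattened image features

# SVD-derived projection: top-k right singular vectors of the feature matrix.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:k].T                                 # (d, k) linear projection

Q, K, V = X @ P, X @ P, X @ P                # low-rank projections (shared here for brevity)

phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))   # positive feature map (elu + 1)
Qf, Kf = phi(Q), phi(K)

kv = Kf.T @ V                                # (k, k) summary, built without an N x N matrix
z = Qf @ Kf.sum(axis=0)                      # (N,) normaliser
out = (Qf @ kv) / z[:, None]                 # (N, k) attended features in O(N * k^2)
print(out.shape)
```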
  11. 251
  12. 252
  13. 253

    An Optimized Deep-Learning-Based Network with an Attention Module for Efficient Fire Detection by Muhammad Altaf, Muhammad Yasir, Naqqash Dilshad, Wooseong Kim

    Published 2025-01-01
    “…In the subsequent phase, the proposed network utilizes an attention-based deep neural network (DNN) named Xception for detailed feature selection while reducing the computational cost, followed by adaptive spatial attention (ASA) to further enhance the model’s focus on a relevant spatial feature in the training data. …”
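    As a generic illustration of spatial attention over backbone features (not the cited ASA module; random tensors stand in for Xception output), the sketch below pools across channels, derives a per-pixel gate, and reweights the feature map. The pooling statistics and kernel size are assumptions.

```python
# Hypothetical sketch of spatial attention over a feature map, not the cited ASA
# module: channel-pooled statistics produce a per-pixel gate that reweights the
# features. Kernel size and pooling choices are assumptions.
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    def __init__(self, kernel=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel, padding=kernel // 2)

    def forward(self, feat):                          # feat: (batch, channels, H, W)
        avg = feat.mean(dim=1, keepdim=True)          # average over channels
        mx, _ = feat.max(dim=1, keepdim=True)         # max over channels
        gate = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return feat * gate                            # emphasise informative positions

feat = torch.randn(2, 128, 19, 19)                    # stand-in for backbone feature maps
print(SpatialAttention()(feat).shape)                 # torch.Size([2, 128, 19, 19])
```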
  14. 254

    The Dynamics of the Relationship between Attention Deficit Hyperactivity Disorder and Alcohol Use Disorder by Sevda Acar, Yeliz Aktaş

    Published 2024-12-01
    “…Attention-deficit hyperactivity disorder is defined as a neurodevelopmental disorder that begins in childhood and persists into adulthood. …”
  15. 255

    Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features by Chaoyue Ma, Ying Wang, Feng Li, Huiyan Zhang, Yong Zhang, Haiyan Zhang

    Published 2024-01-01
    “…This paper proposes a variational autoencoder (VAE) long short-term memory (LSTM) load model based on the attention mechanism (Attention). First, the Prophet data decomposition method is used to decompose long sequences of load data at multiple time scales. …”
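    A much-reduced illustration of attention over LSTM hidden states for sequence forecasting follows; it omits the VAE and the Prophet decomposition mentioned in the snippet and is not the cited paper's model. Layer sizes and the scoring layer are arbitrary choices for the sketch.

```python
# Hypothetical sketch of attention over LSTM states for one-step load forecasting;
# the VAE and Prophet decomposition from the snippet are omitted, and all sizes
# are arbitrary.
import torch
import torch.nn as nn

class AttnLSTM(nn.Module):
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)             # scores each time step
        self.head = nn.Linear(hidden, 1)              # next-step load prediction

    def forward(self, x):                             # x: (batch, time, n_features)
        h, _ = self.lstm(x)                           # (batch, time, hidden)
        w = torch.softmax(self.score(h), dim=1)       # attention weights over time
        ctx = (w * h).sum(dim=1)                      # weighted context vector
        return self.head(ctx)                         # (batch, 1)

x = torch.randn(4, 24, 3)                             # 4 windows, 24 steps, 3 features
print(AttnLSTM(n_features=3)(x).shape)                # torch.Size([4, 1])
```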
  16. 256
  17. 257

    Spatiotemporal dynamic and regional differences of public attention to vaccination: An empirical study in China. by Yaming Zhang, Xiaoyu Guo, Yanyuan Su

    Published 2024-01-01
    “…First, there are significant seasonal fluctuations and unbalanced monthly distributions of vaccination-related public attention in China. Second, the public attention in Chinese cities shows the spatial characteristics of "leading in the east, followed by the central, western and northeastern regions". …”
  18. 258
  19. 259

    Long-term mindfulness meditation increases occurrence of sensory and attention brain states by Daniel Yochai Panitz, Avi Mendelsohn, Joana Cabral, Aviva Berkovich-Ohana

    Published 2025-01-01
    “…These findings suggest that, by shifting attention toward enhanced sensory and embodied processing, MM effectively modulates the expression of functional network states at rest. …”
  20. 260