Showing 101 - 120 results of 9,539 for search '"attention"'
  1. 101

    Attention-assisted dual-branch interactive face super-resolution network by Xujie Wan, Siyu Xu, Guangwei Gao

    Published 2025-01-01
    “…Additionally, the Attention Feature Fusion Unit (AFFM) optimizes multi-scale feature integration. …”
    Get full text
    Article
  2. 102

    SQL Injection Detection Based on Lightweight Multi-Head Self-Attention by Rui-Teng Lo, Wen-Jyi Hwang, Tsung-Ming Tai

    Published 2025-01-01
    “…For the exploration of correlation among the tokens, a lightweight multi-head self-attention scheme with a position encoder is employed. …”
    Get full text
    Article
  3. 103
  4. 104
  5. 105
  6. 106

    Investor Attention: Can Google Search Volumes Predict Stock Returns? by Claudia Yoshinaga, Fabio Rocco

    Published 2020-01-01
    “…This paper investigates the role of investor attention in predicting future stock market returns for Brazilian stocks using Google Search Volume (GSV). …”
    Get full text
    Article
  7. 107

    Factors associated with attention-deficit/hyperactivity disorder among Tunisian children by Asma Guedria, Mohamed Guedria, Manel Ben Fredj, Randaline Ayoub, Hela Ben Abid, Ahmed Mhalla, Hela Slama

    Published 2025-02-01
    “…Attention-deficit/hyperactivity disorder (ADHD) is a chronic neurodevelopmental condition that affects millions of children and adolescents worldwide. …”
    Get full text
    Article
  8. 108
  9. 109

    Exploring the operator experience in automated shuttles: Fatigue, attention, and gaze behaviour by Christer Ahlström, My Weidel, Anna Sjörs Dahlman, Ashleigh Filtness, Anna Anund

    Published 2025-01-01
    “…Operators paid less attention to their surroundings than would be expected (21% not looking left, 38% not looking right, 58% not looking to the rear of the vehicle, in situations where this would have been appropriate). The results are important for safety operators and their employers, highlighting the shared responsibility of having well-prepared and well-rested operators who are fit to effectively monitor the automated shuttle for an entire driving period. …”
    Get full text
    Article
  10. 110
  11. 111

    Text to Realistic Image Generation with Attentional Concatenation Generative Adversarial Networks by Linyan Li, Yu Sun, Fuyuan Hu, Tao Zhou, Xuefeng Xi, Jinchang Ren

    Published 2020-01-01
    “…In this paper, we propose an Attentional Concatenation Generative Adversarial Network (ACGAN) aiming at generating 1024 × 1024 high-resolution images. …”
    Get full text
    Article
  12. 112
  13. 113
  14. 114

    Enhancing Low-Light Images with Kolmogorov–Arnold Networks in Transformer Attention by Alexandru Brateanu, Raul Balmez, Ciprian Orhei, Cosmin Ancuti, Codruta Ancuti

    Published 2025-01-01
    “…This work presents a novel Transformer attention mechanism inspired by the Kolmogorov–Arnold representation theorem, incorporating learnable non-linearity and multivariate function decomposition. …”
    Get full text
    Article
  15. 115
  16. 116
  17. 117

    Voluntary Spatial Attention has Different Effects on Voluntary and Reflexive Saccades by Stephanie K. Seidlits, Tammie Reza, Kevin A. Briand, Anne B. Sereno

    Published 2003-01-01
    “…Some studies have suggested that spatial attention facilitates saccades, whereas others have claimed that eye movements are actually inhibited when spatial attention is engaged. …”
    Get full text
    Article
  18. 118

    The UnconTrust Database for Studies of Unconscious Semantic Processing and Attentional Allocation by Maor Schreiber, Francois Stockart, Liad Mudrik

    Published 2025-01-01
    “…Here, we present the UnconTrust database for studies of unconscious processing, focusing on two major domains – semantic and attentional processing. The database allows researchers to explore potential influences and obtain a bird’s-eye view of the field with respect to these domains. …”
    Get full text
    Article
  19. 119

    A Spatial-Temporal Self-Attention Network (STSAN) for Location Prediction by Shuang Wang, AnLiang Li, Shuai Xie, WenZhu Li, BoWei Wang, Shuai Yao, Muhammad Asif

    Published 2021-01-01
    “…In STSAN, we design a trajectory attention module to learn users’ dynamic trajectory representation, which includes three modules: location attention, which captures the sequential location transitions with self-attention; spatial attention, which captures the user’s preference for geographic location; and temporal attention, which captures the user’s temporal activity preference. …”
    Get full text
    Article
  20. 120