Showing 1 - 20 results of 1,873 for search '(transformer OR transformed) encoder', query time: 0.15s
  1.

    FPE–Transformer: A Feature Positional Encoding-Based Transformer Model for Attack Detection by Hande Çavşi Zaim, Esra Nergis Yolaçan

    Published 2025-01-01
    “…FPE–Transformer incorporates an innovative feature positional encoding mechanism that encodes the positional information of each feature separately, enabling a deeper understanding of feature relationships and more precise attack detection. …”
    Article
  2.

    Arabic Speech Recognition Based on Encoder-Decoder Architecture of Transformer by Mohanad Sameer, Ahmed Talib, Alla Hussein, Husniza Husni

    Published 2023-03-01
    “…This research presents an Arabic speech recognition system based on a transformer encoder-decoder architecture with self-attention to transcribe Arabic audio speech segments into text, which can be trained faster and more efficiently. …”
    Article
  10.

    AnoViT: Unsupervised Anomaly Detection and Localization With Vision Transformer-Based Encoder-Decoder by Yunseung Lee, Pilsung Kang

    Published 2022-01-01
    “…Therefore, we propose a vision transformer-based encoder-decoder model, named AnoViT, designed to reflect normal information by additionally learning the global relationship between image patches; the model is capable of both image anomaly detection and localization. …”
    Article
  12.

    An enhanced network for extracting tunnel lining defects using transformer encoder and aggregate decoder by Bo Guo, Zhihai Huang, Haitao Luo, Perpetual Hope Akwensi, Ruisheng Wang, Bo Huang, Tsz Nam Chan

    Published 2025-02-01
    “…We propose a deep network model utilizing an encoder–decoder framework that integrates Transformer and convolution for comprehensive defect extraction. …”
    Article
  13.

    Noise robust aircraft trajectory prediction via autoregressive transformers with hybrid positional encoding by Youyou Li, Yuxiang Fang, Teng Long

    Published 2025-04-01
    “…This study introduces the Noise-Robust Autoregressive Transformer, a novel model that enhances prediction reliability by integrating noise-regularized embeddings within a multi-head attention mechanism equipped with hybrid positional encoding. …”
    Article
  15.

    The vestibular system implements a linear-nonlinear transformation in order to encode self-motion. by Corentin Massot, Adam D Schneider, Maurice J Chacron, Kathleen E Cullen

    Published 2012-01-01
    “…Although it is well established that the neural code representing the world changes at each stage of a sensory pathway, the transformations that mediate these changes are not well understood. …”
    Article
  16.

    Research on EEG signal classification of motor imagery based on AE and Transformer by Rui JIANG, Liuting SUN, Xiaoming WANG, Dapeng LI, Youyun XU

    Published 2023-03-01
    Subjects: “…motor imagery; deep learning; auto-encoder; attention module; Transformer model…”
    Article
  18.

    Rolling Bearing Life Prediction Based on Improved Transformer Encoding Layer and Multi-Scale Convolution by Zhuopeng Luo, Zhihai Wang, Xiaoqin Liu, Yingming Yang

    Published 2025-06-01
    “…To accurately and reliably characterize the degradation trend of rolling bearings and predict their life cycle, this paper proposes a bearing life prediction model based on an improved transformer encoder layer and multi-scale convolution. …”
    Article
  19.

    An Encoder-Only Transformer Model for Depression Detection from Social Network Data: The DEENT Approach by Robinson Narvaez Burbano, Oscar Mauricio Caicedo Rendon, Carlos. A. Astudillo

    Published 2025-03-01
    “…This paper aims to introduce a model based on encoder-only Transformer architecture to detect depression using a large Twitter dataset collected during the COVID-19 pandemic. …”
    Article