Showing 21 - 40 results of 59 for search 'Bidirectional encoder presentation from transformed model', query time: 0.16s
  21.

    Information extraction from green channel textual records on expressways using hybrid deep learning by Jiaona Chen, Jing Zhang, Weijun Tao, Yinli Jin, Heng Fan

    Published 2024-12-01
    “…Eight entities are designed and proposed in the NER processing for the expressway green channel. Three typical pre-trained natural language processing models are utilized and compared to recognize entities and obtain feature vectors, including bidirectional encoder representations from transformers (BERT), ALBERT, and RoBERTa. …”
    Get full text
    Article
  22.

    Detecting sarcasm in user-generated content integrating transformers and gated graph neural networks by Zhenkai Qin, Qining Luo, Zhidong Zang, Hongpeng Fu

    Published 2025-04-01
    “…To address this issue, the present study proposes a novel sarcasm detection model that combines bidirectional encoder representations from transformers (BERT) with gated graph neural networks (GGNN), further enhanced by a self-attention mechanism to more effectively capture ironic cues. …”
    Get full text
    Article
  23.

    Leveraging Multilingual Transformer for Multiclass Sentiment Analysis in Code-Mixed Data of Low-Resource Languages by Muhammad Kashif Nazir, Cm Nadeem Faisal, Muhammad Asif Habib, Haseeb Ahmad

    Published 2025-01-01
    “…Subsequently, the Multilingual Bidirectional Encoder Representations from Transformers (mBERT) model was optimized and trained for multiclass sentiment analysis on the code-mixed data. …”
    Get full text
    Article
  24.

    TSB-Forecast: A Short-Term Load Forecasting Model in Smart Cities for Integrating Time Series Embeddings and Large Language Models by Mohamed Mahmoud Hasan, Neamat El-Tazi, Ramadan Moawad, Amany H. B. Eissa

    Published 2025-01-01
    “…The model uses Sentence Bidirectional Encoder Representations from Transformers (SBERT) to extract semantic characteristics from textual news and Time to Vector (Time2Vec) to capture temporal patterns, acquiring cyclical behavior and context-sensitive impacts. …”
    Get full text
    Article
  25.

    EYE-Llama, an in-domain large language model for ophthalmology by Tania Haghighi, Sina Gholami, Jared Todd Sokol, Enaika Kishnani, Adnan Ahsaniyan, Holakou Rahmanian, Fares Hedayati, Theodore Leng, Minhaj Nur Alam

    Published 2025-07-01
    “…We evaluated EYE-Llama against Llama 2, Llama 3, Meditron, ChatDoctor, ChatGPT, and several other LLMs. Using BERT (Bidirectional Encoder Representations from Transformers) score, BART (Bidirectional and Auto-Regressive Transformer) score, and BLEU (Bilingual Evaluation Understudy) metrics, EYE-Llama achieved superior scores. …”
    Get full text
    Article
  26.

    VitroBERT: modeling DILI by pretraining BERT on in vitro data by Muhammad Arslan Masood, Anamya Ajjolli Nagaraja, Katia Belaid, Natalie Mesens, Hugo Ceulemans, Samuel Kaski, Dorota Herman, Markus Heinonen

    Published 2025-08-01
    “…We therefore introduce VitroBERT, a bidirectional encoder representations from transformers (BERT) model pretrained on large-scale in vitro assay profiles to generate biologically informed molecular embeddings. …”
    Get full text
    Article
  27.

    Electric Vehicle Sentiment Analysis Using Large Language Models by Hemlata Sharma, Faiz Ud Din, Bayode Ogunleye

    Published 2024-11-01
    “…EV companies are becoming significant competitors in the automotive industry and are projected to cover up to 30% of the United States light vehicle market by 2030. In this study, we present a comparative study of large language models (LLMs) including bidirectional encoder representations from transformers (BERT), robustly optimised BERT (RoBERTa), and a generalised autoregressive pre-training method (XLNet) using Lucid Motors and Tesla Motors YouTube datasets. …”
    Get full text
    Article
  28.

    Acoustic Event Detection in Vehicles: A Multi-Label Classification Approach by Anaswara Antony, Wolfgang Theimer, Giovanni Grossetti, Christoph M. Friedrich

    Published 2025-04-01
    “…The proposed detection methodology uses the pre-trained network Bidirectional Encoder representation from Audio Transformers (BEATs) and a single-layer neural network trained on the database of real audio recordings collected from different cars. …”
    Get full text
    Article
  29.

    From Extractive to Generative: An Analysis of Automatic Text Summarization Techniques by Liu Zixu

    Published 2025-01-01
    “…The review highlights significant milestones in the development of summarization algorithms, including the emergence of Transformer-based models like Bidirectional Encoder Representations from Transformers (BERT) and Generative Pre-trained Transformer (GPT), which have significantly improved the quality and coherence of generated summaries. …”
    Get full text
    Article
  30.

    Ransomware detection and family classification using fine-tuned BERT and RoBERTa models by Amjad Hussain, Ayesha Saadia, Faeiz M. Alserhani

    Published 2025-06-01
    “…This research explores these challenges and proposes a novel approach using hyperparameter-optimized transfer learning-based models, Bidirectional Encoder Representations from Transformers (BERT), and a Robustly Optimized BERT Approach (RoBERTa), to not only detect but also classify ransomware targeting IoT devices by analyzing dynamically executed API call sequences in a sandbox environment. …”
    Get full text
    Article
  31.

    A fake news detection model using the integration of multimodal attention mechanism and residual convolutional network by Ying Lu, Naiwei Yao

    Published 2025-07-01
    “…Baseline models used for comparison include Bidirectional Encoder Representations from Transformers (BERT), Robustly Optimized Bidirectional Encoder Representations from Transformers Approach (RoBERTa), Generalized Autoregressive Pretraining for Language Understanding (XLNet), Enhanced Representation through Knowledge Integration (ERNIE), and Generative Pre-trained Transformer 3.5 (GPT-3.5). …”
    Get full text
    Article
  32.

    Does the Choice of Topic Modeling Technique Impact the Interpretation of Aviation Incident Reports? A Methodological Assessment by Aziida Nanyonga, Keith Joiner, Ugur Turhan, Graham Wild

    Published 2025-05-01
    “…This study presents a comparative analysis of four topic modeling techniques —Latent Dirichlet Allocation (LDA), Bidirectional Encoder Representations from Transformers (BERT), Probabilistic Latent Semantic Analysis (pLSA), and Non-negative Matrix Factorization (NMF)—applied to aviation safety reports from the ATSB dataset spanning 2013–2023. …”
    Get full text
    Article
  33.

    A Cross-Product Analysis of Earphone Reviews Using Contextual Topic Modeling and Association Rule Mining by Ugbold Maidar, Minyoung Ra, Donghee Yoo

    Published 2024-12-01
    “…It employs Bidirectional Encoder Representations from Transformers for Topic Modeling (BERTopic), a technique that generates coherent topics by effectively capturing contextual information, and Frequent Pattern Growth (FPGrowth), an efficient association rule mining algorithm used for discovering patterns and relationships in a dataset without candidate generation. …”
    Get full text
    Article
  34.

    IndoGovBERT: A Domain-Specific Language Model for Processing Indonesian Government SDG Documents by Agus Riyadi, Mate Kovacs, Uwe Serdült, Victor Kryssanov

    Published 2024-11-01
    “…The presented study introduces IndoGovBERT, a Bidirectional Encoder Representations from Transformers (BERT)-based PTLM built with domain-specific corpora, leveraging the Indonesian government’s public and internal documents. …”
    Get full text
    Article
  35.

    Obfuscated Malware Detection and Classification in Network Traffic Leveraging Hybrid Large Language Models and Synthetic Data by Mehwish Naseer, Farhan Ullah, Samia Ijaz, Hamad Naeem, Amjad Alsirhani, Ghadah Naif Alwakid, Abdullah Alomari

    Published 2025-01-01
    “…This phase leverages a fine-tuned LLM, Bidirectional Encoder Representations from Transformers (BERT), with classification layers. …”
    Get full text
    Article
  36.

    Multi-Head Graph Attention Adversarial Autoencoder Network for Unsupervised Change Detection Using Heterogeneous Remote Sensing Images by Meng Jia, Xiangyu Lou, Zhiqiang Zhao, Xiaofeng Lu, Zhenghao Shi

    Published 2025-07-01
    “…The MHGAN employs a bidirectional adversarial convolutional autoencoder network to reconstruct and perform style transformation of heterogeneous images. …”
    Get full text
    Article
  37.

    A novel ViT-BILSTM model for physical activity intensity classification in adults using gravity-based acceleration by Lin Wang, Zizhang Luo, Tianle Zhang

    Published 2025-02-01
    “…Abstract Aim The aim of this study is to apply a novel hybrid framework incorporating a Vision Transformer (ViT) and bidirectional long short-term memory (Bi-LSTM) model for classifying physical activity intensity (PAI) in adults using gravity-based acceleration. …”
    Get full text
    Article
  38.

    A Deep Learning Model for Automatic Citation Document Recommendation in Non-Obviousness Judgment: Using BERT-for-patents and Contrastive Learning by Dongkun Yoo, Jiheon Han

    Published 2025-03-01
    “…Patent data from the United States Patent and Trademark Office (USPTO) that had been rejected for lack of non-obviousness were preprocessed. Six models were trained based on bidirectional encoder representations from transformers (BERT), and their performances were compared. …”
    Get full text
    Article
  39.

    Identifying Non-Functional Requirements From Unconstrained Documents Using Natural Language Processing and Machine Learning Approaches by Qais A. Shreda, Abualsoud A. Hanani

    Published 2025-01-01
    “…In our approach, features were extracted from the requirement sentences using four different natural language processing methods including statistical and state-of-the-art semantic analysis presented by Google word2vec and bidirectional encoder representations from transformers models. …”
    Get full text
    Article
  40.

    Enhancing Pulmonary Disease Prediction Using Large Language Models With Feature Summarization and Hybrid Retrieval-Augmented Generation: Multicenter Methodological Study Based on R... by Ronghao Li, Shuai Mao, Congmin Zhu, Yingliang Yang, Chunting Tan, Li Li, Xiangdong Mu, Honglei Liu, Yuqing Yang

    Published 2025-06-01
    “…The traditional deep learning model, BERT (Bidirectional Encoder Representations from Transformers), was also compared to assess the superiority of LLMs. …”
    Get full text
    Article