21
Information extraction from green channel textual records on expressways using hybrid deep learning
Published 2024-12-01 “…Eight entities are designed and proposed in the NER processing for the expressway green channel. Three typical pre-trained natural language processing models are utilized and compared to recognize entities and obtain feature vectors, including bidirectional encoder representations from transformers (BERT), ALBERT, and RoBERTa. …”
Get full text
Article
22
Detecting sarcasm in user-generated content integrating transformers and gated graph neural networks
Published 2025-04-01 “…To address this issue, the present study proposes a novel sarcasm detection model that combines bidirectional encoder representations from transformers (BERT) with gated graph neural networks (GGNN), further enhanced by a self-attention mechanism to more effectively capture ironic cues. …”
Get full text
Article
23
Leveraging Multilingual Transformer for Multiclass Sentiment Analysis in Code-Mixed Data of Low-Resource Languages
Published 2025-01-01 “…Subsequently, the Multilingual Bidirectional Encoder Representations from Transformers (mBERT) model was optimized and trained for multiclass sentiment analysis on the code-mixed data. …”
Get full text
Article
24
TSB-Forecast: A Short-Term Load Forecasting Model in Smart Cities for Integrating Time Series Embeddings and Large Language Models
Published 2025-01-01 “…The model uses Sentence Bidirectional Encoder Representations from Transformers (SBERT) to extract semantic characteristics from textual news and Time to Vector (Time2Vec) to capture temporal patterns, acquiring cyclical behavior and context-sensitive impacts. …”
Get full text
Article
25
EYE-Llama, an in-domain large language model for ophthalmology
Published 2025-07-01 “…We evaluated EYE-Llama against Llama 2, Llama 3, Meditron, ChatDoctor, ChatGPT, and several other LLMs. Using BERT (Bidirectional Encoder Representations from Transformers) score, BART (Bidirectional and Auto-Regressive Transformers) score, and BLEU (Bilingual Evaluation Understudy) metrics, EYE-Llama achieved superior scores. …”
Get full text
Article
26
VitroBert: modeling DILI by pretraining BERT on in vitro data
Published 2025-08-01 “…We therefore introduce VitroBERT, a bidirectional encoder representations from transformers (BERT) model pretrained on large-scale in vitro assay profiles to generate biologically informed molecular embeddings. …”
Get full text
Article
27
Electric Vehicle Sentiment Analysis Using Large Language Models
Published 2024-11-01 “…EV companies are becoming significant competitors in the automotive industry and are projected to cover up to 30% of the United States light vehicle market by 2030. In this study, we present a comparative study of large language models (LLMs) including bidirectional encoder representations from transformers (BERT), robustly optimised BERT (RoBERTa), and a generalised autoregressive pre-training method (XLNet) using Lucid Motors and Tesla Motors YouTube datasets. …”
Get full text
Article
28
Acoustic Event Detection in Vehicles: A Multi-Label Classification Approach
Published 2025-04-01 “…The proposed detection methodology uses the pre-trained network Bidirectional Encoder representation from Audio Transformers (BEATs) and a single-layer neural network trained on the database of real audio recordings collected from different cars. …”
Get full text
Article
29
From Extractive to Generative: An Analysis of Automatic Text Summarization Techniques
Published 2025-01-01 “…The review highlights significant milestones in the development of summarization algorithms, including the emergence of Transformer-based models like Bidirectional Encoder Representations from Transformers (BERT) and Generative Pre-trained Transformer (GPT), which have significantly improved the quality and coherence of generated summaries. …”
Get full text
Article
30
Ransomware detection and family classification using fine-tuned BERT and RoBERTa models
Published 2025-06-01 “…This research explores these challenges and proposes a novel approach using hyperparameter-optimized transfer learning-based models, Bidirectional Encoder Representations from Transformers (BERT), and a Robustly Optimized BERT Approach (RoBERTa), to not only detect but also classify ransomware targeting IoT devices by analyzing dynamically executed API call sequences in a sandbox environment. …”
Get full text
Article
31
A fake news detection model using the integration of multimodal attention mechanism and residual convolutional network
Published 2025-07-01 “…Baseline models used for comparison include Bidirectional Encoder Representations from Transformers (BERT), Robustly Optimized Bidirectional Encoder Representations from Transformers Approach (RoBERTa), Generalized Autoregressive Pretraining for Language Understanding (XLNet), Enhanced Representation through Knowledge Integration (ERNIE), and Generative Pre-trained Transformer 3.5 (GPT-3.5). …”
Get full text
Article
32
Does the Choice of Topic Modeling Technique Impact the Interpretation of Aviation Incident Reports? A Methodological Assessment
Published 2025-05-01 “…This study presents a comparative analysis of four topic modeling techniques, namely Latent Dirichlet Allocation (LDA), Bidirectional Encoder Representations from Transformers (BERT), Probabilistic Latent Semantic Analysis (pLSA), and Non-negative Matrix Factorization (NMF), applied to aviation safety reports from the ATSB dataset spanning 2013–2023. …”
Get full text
Article
33
A Cross-Product Analysis of Earphone Reviews Using Contextual Topic Modeling and Association Rule Mining
Published 2024-12-01 “…It employs Bidirectional Encoder Representations from Transformers for Topic Modeling (BERTopic), a technique that generates coherent topics by effectively capturing contextual information, and Frequent Pattern Growth (FPGrowth), an efficient association rule mining algorithm used for discovering patterns and relationships in a dataset without candidate generation. …”
Get full text
Article
34
IndoGovBERT: A Domain-Specific Language Model for Processing Indonesian Government SDG Documents
Published 2024-11-01 “…The presented study introduces IndoGovBERT, a Bidirectional Encoder Representations from Transformers (BERT)-based PTLM built with domain-specific corpora, leveraging the Indonesian government’s public and internal documents. …”
Get full text
Article
35
Obfuscated Malware Detection and Classification in Network Traffic Leveraging Hybrid Large Language Models and Synthetic Data
Published 2025-01-01 “…This phase leverages a fine-tuned LLM, Bidirectional Encoder Representations from Transformers (BERT), with classification layers. …”
Get full text
Article
36
Multi-Head Graph Attention Adversarial Autoencoder Network for Unsupervised Change Detection Using Heterogeneous Remote Sensing Images
Published 2025-07-01 “…The MHGAN employs a bidirectional adversarial convolutional autoencoder network to reconstruct and perform style transformation of heterogeneous images. …”
Get full text
Article
37
A novel ViT-BILSTM model for physical activity intensity classification in adults using gravity-based acceleration
Published 2025-02-01 “…The aim of this study is to apply a novel hybrid framework incorporating a Vision Transformer (ViT) and bidirectional long short-term memory (Bi-LSTM) model for classifying physical activity intensity (PAI) in adults using gravity-based acceleration. …”
Get full text
Article
38
A Deep Learning Model for Automatic Citation Document Recommendation in Non-Obviousness Judgment: Using BERT-for-patents and Contrastive Learning
Published 2025-03-01 “…The United States Patent and Trademark Office (USPTO) patent data rejected because of a lack of non-obviousness were preprocessed. Six models were trained based on the bidirectional encoder representations from transformers (BERT), and the performances were compared. …”
Get full text
Article
39
Identifying Non-Functional Requirements From Unconstrained Documents Using Natural Language Processing and Machine Learning Approaches
Published 2025-01-01 “…In our approach, features were extracted from the requirement sentences using four different natural language processing methods, including statistical methods and state-of-the-art semantic analysis represented by the Google word2vec and bidirectional encoder representations from transformers models. …”
Get full text
Article
40
Enhancing Pulmonary Disease Prediction Using Large Language Models With Feature Summarization and Hybrid Retrieval-Augmented Generation: Multicenter Methodological Study Based on R...
Published 2025-06-01 “…The traditional deep learning model, BERT (Bidirectional Encoder Representations from Transformers), was also compared to assess the superiority of LLMs. …”
Get full text
Article