Showing 41 - 59 results of 59 for search 'Bidirectional encoder presentation from transformed model', query time: 0.12s
  1. 41

    Innovative Sentiment Analysis and Prediction of Stock Price Using FinBERT, GPT-4 and Logistic Regression: A Data-Driven Approach by Olamilekan Shobayo, Sidikat Adeyemi-Longe, Olusogo Popoola, Bayode Ogunleye

    Published 2024-10-01
    “…This study explores the comparative performance of cutting-edge AI models, i.e., Finance Bidirectional Encoder Representations from Transformers (FinBERT), Generative Pre-trained Transformer (GPT-4), and Logistic Regression, for sentiment analysis and stock index prediction using financial news and the NGX All-Share Index data label. …”
    Get full text
    Article
  2. 42

    Evaluating Handwritten Answers Using DeepSeek: A Comparative Analysis of Deep Learning-Based Assessment by Sanskar Bansal, Vinay Gupta, Eshita Gupta, Peeyush Garg

    Published 2025-08-01
    “…The state-of-the-art technique known as Bidirectional Encoder Representations from Transformers (BERT) has overcome the drawbacks of previous NLP techniques like Bag of Words, TF-IDF, and Word2Vec. …”
    Get full text
    Article
  3. 43

    Sporting a virtual future: exploring sports and virtual reality patents using deep learning-based analysis by Jea Woog Lee, Sangmin Song, JungMin Yun, Doug Hyun Han, YoungBin Kim

    Published 2025-06-01
    “…Using patent big data, we introduce SportsBERT, a bidirectional encoder representations from transformers (BERT)-based algorithm tailored for enhanced natural language processing in sports-related knowledge-based documents. …”
    Get full text
    Article
  4. 44

    Rethinking Technological Investment and Cost-Benefit: A Software Requirements Dependency Extraction Case Study by Gouri Ginde, Guenther Ruhe, Chad Saunders

    Published 2025-01-01
    “…Specifically, we extract dependencies from textual descriptions of software requirements and analyze the performance of two state-of-the-art ML techniques: Random Forest and Bidirectional Encoder Representations from Transformers (BERT), an encoder-only large language model. …”
    Get full text
    Article
  5. 45

    Comparison of Deep Learning Sentiment Analysis Methods, Including LSTM and Machine Learning by Jean Max T. Habib, A. A. Poguda

    Published 2023-11-01
    “…In this case, it is crucial for researchers to explore the possibilities of updating certain tools, either combining them or developing them to adapt them to modern tasks, in order to provide a clearer understanding of the results of their processing. We present a comparison of several deep learning models, including convolutional neural networks, recurrent neural networks, and bidirectional long short-term memory, evaluated using different approaches to word embedding, including Bidirectional Encoder Representations from Transformers (BERT) and its variants, FastText and Word2Vec. …”
    Get full text
    Article
  6. 46

    Tackling misinformation in mobile social networks: a BERT-LSTM approach for enhancing digital literacy by Jun Wang, Xiulai Wang, Airong Yu

    Published 2025-01-01
    “…Early detection of misinformation is essential yet challenging, particularly in contexts where initial content propagation lacks user feedback and engagement data. This study presents a novel hybrid model that combines Bidirectional Encoder Representations from Transformers (BERT) with Long Short-Term Memory (LSTM) networks to enhance the detection of misinformation using only textual content. …”
    Get full text
    Article
  7. 47

    Rumor detection using dual embeddings and text-based graph convolutional network by Barsha Pattanaik, Sourav Mandal, Rudra M. Tripathy, Arif Ahmed Sekh

    Published 2024-11-01
    “…This model uses dual embedding from two pre-trained transformer models: generative pre-trained transformers (GPT) and bidirectional encoder representations from transformers (BERT). …”
    Get full text
    Article
  8. 48

    ENHANCING NAMED ENTITY RECOGNITION ON HINER DATASET USING ADVANCED NLP TECHNIQUES by Harshvardhan Pardeshi, Prof. Piyush Pratap Singh

    Published 2025-05-01
    “…To solve this issue, some researchers have concentrated on NER models; however, these lack speed and accuracy. Therefore, the present research uses advanced NLP models such as bidirectional encoder representations from transformers (BERT), DistilBERT, and the robustly optimized BERT approach (RoBERTa) for effective entity prediction performance. …”
    Get full text
    Article
  9. 49

    Optimizing an LSTM Self-Attention Architecture for Portuguese Sentiment Analysis Using a Genetic Algorithm by Daniel Parada, Alexandre Branco, Marcos Silva, Fábio Mendonça, Sheikh Mostafa, Fernando Morgado-Dias

    Published 2025-06-01
    “…To address this complexity, a discrete genetic algorithm was used to find an optimal configuration, selecting the layer types, placement of self-attention, dropout rate, and model dimensions and shape. A key outcome of this study was that the optimization process produced a model that is competitive with a Bidirectional Encoder Representations from Transformers (BERT) model retrained for Portuguese, which was used as the baseline. …”
    Get full text
    Article
  10. 50

    Graph neural networks embedded with domain knowledge for cyber threat intelligence entity and relationship mining by Gan Liu, Kai Lu, Saiqi Pi

    Published 2025-04-01
    “…Specifically, first, domain knowledge is collected to build a domain knowledge graph, which is then embedded using graph convolutional networks (GCN) to enhance the feature representation of threat intelligence text. Next, the features from domain knowledge graph embedding and those generated by the bidirectional encoder representations from transformers (BERT) model are fused using the Layernorm algorithm. …”
    Get full text
    Article
  11. 51

    Vietnamese Sentence Fact Checking Using the Incremental Knowledge Graph, Deep Learning, and Inference Rules on Online Platforms by Huong To Duong, Van Hai Ho, Phuc Do

    Published 2025-01-01
    “…To address these challenges and advance fact-checking research both broadly and within the Vietnamese context, this paper introduces a fact-checking model tailored for Vietnamese, named ViKGFC. ViKGFC integrates a Knowledge Graph (KG), inference rules, and the Knowledge graph - Bidirectional Encoder Representations from Transformers (KG-BERT) deep learning model. …”
    Get full text
    Article
  12. 52

    Detecting Chinese Disinformation with Fine-Tuned BERT and Contextual Techniques by Lixin Yun, Sheng Yun, Haoran Xue

    Published 2025-12-01
    “…Building on large language models (LLMs) like BERT (Bidirectional Encoder Representations from Transformers) provides a promising avenue for addressing this challenge. …”
    Get full text
    Article
  13. 53

    Automated and efficient Bangla signboard detection, text extraction, and novel categorization method for underrepresented languages in smart cities by Tanmoy Mazumder, Fariha Nusrat, Abu Bakar Siddique Mahi, Jolekha Begum Brishty, Rashik Rahman, Tanjina Helaly

    Published 2025-06-01
    “…Finally, fine-tuning of the pre-trained multilingual Bidirectional Encoder Representations from Transformers (BERT) model is implemented to mimic human perception to achieve Named Entity Recognition (NER) capabilities. …”
    Get full text
    Article
  14. 54

    A Novel HDF-Based Data Compression and Integration Approach to Support BIM-GIS Practical Applications by Zeyu Pan, Jianyong Shi, Liu Jiang

    Published 2020-01-01
    “…Next, bidirectional transformation methods for BIM and GIS modeling data, images, and analytical data into HDF are proposed. …”
    Get full text
    Article
  15. 55

    Analysis of Short Texts Using Intelligent Clustering Methods by Jamalbek Tussupov, Akmaral Kassymova, Ayagoz Mukhanova, Assyl Bissengaliyeva, Zhanar Azhibekova, Moldir Yessenova, Zhanargul Abuova

    Published 2025-05-01
    “…This article presents a comprehensive review of short text clustering using state-of-the-art methods: Bidirectional Encoder Representations from Transformers (BERT), Term Frequency-Inverse Document Frequency (TF-IDF), and the novel hybrid method Latent Dirichlet Allocation + BERT + Autoencoder (LDA + BERT + AE). …”
    Get full text
    Article
  16. 56

    Empowering geoportals HCI with task-oriented chatbots through NLP and deep transfer learning by Mohammad H. Vahidnia

    Published 2024-10-01
    “…The notion of deep transfer learning (DTL) was then put into practice by customizing a pre-trained BERT (Bidirectional Encoder Representations from Transformers) model for our particular aim and creating a task-oriented conversational agent. …”
    Get full text
    Article
  17. 57
  18. 58

    Intelligent integration of AI and IoT for advancing ecological health, medical services, and community prosperity by Abdulrahman Alzahrani, Patty Kostkova, Hamoud Alshammari, Safa Habibullah, Ahmed Alzahrani

    Published 2025-08-01
    “…CNN (convolutional neural networks) with transfer learning enabled by ResNet provides high-accuracy image recognition, which can be used for waste classification. Bidirectional Encoder Representations from Transformers (BERT) allows multilingual users to interact and communicate properly in any linguistic environment. …”
    Get full text
    Article
  19. 59

    M.I.N.I.-KID interviews with adolescents: a corpus-based language analysis of adolescents with depressive disorders and the possibilities of continuation using Chat GPT by Irina Jarvers, Angelika Ecker, Pia Donabauer, Katharina Kampa, Maximilian Weißenbacher, Daniel Schleicher, Stephanie Kandsperger, Romuald Brunner, Bernd Ludwig

    Published 2024-12-01
    “…The transcribed interviews comprised 4,077 question-answer-pairs, with which we predicted the clinical rating (depressive/non-depressive) with use of a feedforward neural network that received BERT (Bidirectional Encoder Representations from Transformers) vectors of interviewer questions and patient answers as input. …”
    Get full text
    Article