-
361
MAF-CNER: A Chinese Named Entity Recognition Model Based on Multifeature Adaptive Fusion
Published 2021-01-01“…Named entity recognition (NER) is a subtask in natural language processing, and its accuracy greatly affects the effectiveness of downstream tasks. …”
Get full text
Article -
362
Enhancing Essay Scoring: An Analytical and Holistic Approach With Few-Shot Transformer-Based Models
Published 2025-01-01“…Despite the impressive capabilities of generalized transformer models in various natural language processing (NLP) domains, their application to essay scoring has often fallen short of expectations. …”
Get full text
Article -
363
Crowdsourcing geographic information for terrorism-related disaster awareness and mitigation: perspectives and challenges
Published 2024-12-01“…Despite the prevalence of natural language processing for data mining, the majority of studies did not incorporate ML algorithms in their analyses. …”
Get full text
Article -
364
Research on the robustness of neural machine translation systems in word order perturbation
Published 2023-10-01“…Pre-trained language models are among the most important models in the natural language processing field, as pretrain-finetune has become the paradigm for various NLP downstream tasks. Previous studies have shown that integrating pre-trained language models (e.g., BERT) into neural machine translation (NMT) models can improve translation performance. However, it remains unclear whether these improvements stem from enhanced semantic or syntactic modeling capabilities, or how pre-trained knowledge affects the robustness of the models. To address these questions, a systematic study was conducted to examine the syntactic ability of BERT-enhanced NMT models using probing tasks. The study revealed that the enhanced models were proficient at modeling word order, highlighting their syntactic modeling capabilities. In addition, an attack method was proposed to evaluate the robustness of NMT models in handling word order. BERT-enhanced NMT models yielded better translation performance on most of the tasks, indicating that BERT can improve the robustness of NMT models. However, in the English-German translation task the BERT-enhanced NMT model generated poorer translations than the vanilla NMT model after the attack, meaning that English BERT worsened model robustness in that scenario. Further analyses revealed that English BERT failed to bridge the semantic gap between the original and perturbed sources, leading to more copying errors and more errors in translating low-frequency words. These findings suggest that the benefits of pre-training may not always carry over to downstream tasks, and careful consideration should be given to its usage.…”
Get full text
Article -
365
Vision Transformers for Image Classification: A Comparative Survey
Published 2025-01-01“…Transformers were initially introduced for natural language processing, leveraging the self-attention mechanism. …”
Get full text
Article -
366
Enhancing zero-shot relation extraction with a dual contrastive learning framework and a cross-attention module
Published 2024-11-01“…Abstract Zero-shot relation extraction (ZSRE) is essential for improving the understanding of natural language relations and enhancing the accuracy and efficiency of natural language processing methods in practical applications. However, the existing ZSRE models ignore the importance of semantic information fusion and possess limitations when used for zero-shot relation extraction tasks. …”
Get full text
Article -
367
AzSLD: Azerbaijani sign language dataset for fingerspelling, word, and sentence translation with baseline software (Zenodo)
Published 2025-02-01“…Advancements in sign language processing technology hinge on the availability of extensive, reliable datasets, comprehensive instructions, and adherence to ethical guidelines. …”
Get full text
Article -
368
Hyperbolic Graph Convolutional Network Relation Extraction Model Combining Dependency Syntax and Contrastive Learning
Published 2025-02-01“…However, most studies are affected by noise in the syntactic information automatically extracted by natural language processing toolkits. Additionally, traditional pre-trained encoders have issues such as an overly centralized word-embedding representation for high-frequency words, which adversely affects the model's ability to learn contextual semantic information. …”
Get full text
Article -
369
Metagraph Theory as a Basis for Modeling Relevant Media Discourse
Published 2024-11-01
Get full text
Article -
370
Cross-modality fusion with EEG and text for enhanced emotion detection in English writing
Published 2025-01-01“…Traditional approaches to emotion detection primarily leverage textual features, using natural language processing techniques such as sentiment analysis, which, while effective, may miss subtle nuances of emotions. …”
Get full text
Article -
371
DNA promoter task-oriented dictionary mining and prediction model based on natural language technology
Published 2025-01-01“…Recent advancements in bioinformatics have leveraged deep learning and natural language processing (NLP) to enhance promoter prediction accuracy. …”
Get full text
Article -
372
Low-Resource Active Learning of Morphological Segmentation
Published 2016-03-01“… Many Uralic languages have a rich morphological structure, but lack morphological analysis tools needed for efficient language processing. While creating a high-quality morphological analyzer requires a significant amount of expert labor, data-driven approaches may provide sufficient quality for many applications. …”
Get full text
Article -
373
The Challenges of Gender Diversity in Boards of Directors: An Australian Study with Global Implications
Published 2025-02-01“…In‐depth interviews are conducted with those who have first‐hand experience of board appointments, followed by thematic analysis and the application of natural language processing techniques to identify the emotions and sentiment associated with these themes. …”
Get full text
Article -
374
Transforming dental diagnostics with artificial intelligence: advanced integration of ChatGPT and large language models for patient care
Published 2025-01-01“…Artificial intelligence has dramatically reshaped our interaction with digital technologies, ushering in an era where advancements in AI algorithms and Large Language Models (LLMs) have given rise to natural language processing (NLP) systems like ChatGPT. This study delves into the impact of cutting-edge LLMs, notably OpenAI's ChatGPT, on medical diagnostics, with a keen focus on the dental sector. …”
Get full text
Article -
375
Neuroanatomy, episodic memory and inhibitory control of Persian-Kurdish simultaneous bilinguals
Published 2024-11-01“…Abstract We assessed simultaneous bilinguals and monolinguals on inhibitory control and episodic memory, and assessed their grey matter volumes in brain regions known to be involved in language processing, executive control and memory. Bilinguals outperformed monolinguals on episodic memory, and performance on the memory and inhibition tasks was correlated only in the bilingual group. …”
Get full text
Article -
376
Data Augmentation For Sorani Kurdish News Headline Classification Using Back-Translation And Deep Learning Model
Published 2023-06-01“…The findings suggest that the combination of back-translation and a proposed BiLSTM model is a promising approach for data augmentation in low-resource languages, contributing to the advancement of natural language processing in under-resourced languages. Moreover, having a Kurdish news headline classification model can improve access to news and information for Kurdish speakers. …”
Get full text
Article -
377
Presenting a Novel Hybrid Approach of Text Mining Sentiment Analysis in Twitter Using CART Decision Tree
Published 2020-03-01“…Text mining, as a special strategy, drives the knowledge discovery process, using natural language processing to uncover non-trivial and interesting patterns. In this paper, a new hybrid approach combining machine learning and a lexicon-based method is presented for text-mining sentiment analysis on Twitter. …”
Get full text
Article -
378
Interpreting Metaphorical Language: A Challenge to Artificial Intelligence
Published 2024-11-01
Get full text
Article -
379
An End-to-End Rumor Detection Model Based on Feature Aggregation
Published 2021-01-01“…Furthermore, the features used by deep learning methods based on natural language processing are heavily limited. Therefore, it is of great significance and practical value to study rumor detection methods that are independent of feature engineering and that effectively aggregate heterogeneous features to adapt to complex and changing social networks. …”
Get full text
Article -
380
LAMARS: Large Language Model-Based Anticipation Mechanism Acceleration in Real-Time Robotic Systems
Published 2025-01-01“…LAMARS leverages the predictive power and zero-shot capabilities of LLMs combined with an anticipation mechanism and vision-language processing to position a robot in advance for upcoming tasks. …”
Get full text
Article