Fine-Tuning BERT Models for Multiclass Amharic News Document Categorization
Bidirectional Encoder Representations from Transformers (BERT) models are increasingly being employed in the development of natural language processing (NLP) systems, predominantly for English and other European languages. However, because of the complexity of the language’s morphology and the scarcit...
Main Author: Demeke Endalie
Format: Article
Language: English
Published: Wiley, 2025-01-01
Series: Complexity
Online Access: http://dx.doi.org/10.1155/cplx/1884264
Similar Items
- The BERT Uncased and LSTM Multiclass Classification Model for Traffic Violation Text Classification
  by: Komang Ayu Triana Indah, et al.
  Published: (2025-01-01)
- Perceived MOOC satisfaction: A review mining approach using machine learning and fine-tuned BERTs
  by: Xieling Chen, et al.
  Published: (2025-06-01)
- Enhancing Abstractive Multi-Document Summarization with Bert2Bert Model for Indonesian Language
  by: Aldi Fahluzi Muharam, et al.
  Published: (2025-01-01)
- Leveraging an Enhanced CodeBERT-Based Model for Multiclass Software Defect Prediction via Defect Classification
  by: Rida Ghafoor Hussain, et al.
  Published: (2025-01-01)
- Improving Text Recognition Accuracy for Serbian Legal Documents Using BERT
  by: Miloš Bogdanović, et al.
  Published: (2025-01-01)