Fine-Tuning BERT Models for Multiclass Amharic News Document Categorization
Bidirectional Encoder Representations from Transformers (BERT) models are increasingly employed in the development of natural language processing (NLP) systems, predominantly for English and other European languages. However, because of the complexity of the language's morphology and the scarcit...
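As context for the technique the article's title and abstract describe, the following is a minimal sketch of fine-tuning a BERT checkpoint for multiclass document classification with the Hugging Face transformers library. The checkpoint (`bert-base-multilingual-cased`), the label set, and the sample texts are illustrative assumptions; the article's own models, Amharic dataset, and hyperparameters may differ.

```python
# Minimal sketch: fine-tune a BERT checkpoint for multiclass news
# classification. Checkpoint, labels, and texts are assumptions for
# illustration, not the article's actual setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["sport", "business", "politics", "entertainment"]  # hypothetical label set

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=len(LABELS)
)

# Tiny illustrative corpus; a real run would load an Amharic news dataset.
texts = ["Example news text about a football match.", "Example text about an election."]
labels = torch.tensor([0, 2])

enc = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):  # a few epochs is typical when fine-tuning BERT
    optimizer.zero_grad()
    out = model(**enc, labels=labels)  # cross-entropy loss over the label set
    out.loss.backward()
    optimizer.step()

# Inference: argmax over the class logits gives the predicted category.
model.eval()
with torch.no_grad():
    pred = model(**enc).logits.argmax(dim=-1)
print([LABELS[i] for i in pred])
```

In this setup only the classification head is new; all encoder weights are updated during fine-tuning, which is the standard recipe for sequence classification with BERT.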
| Field | Value |
|---|---|
| Main Author | Demeke Endalie |
| Format | Article |
| Language | English |
| Published | Wiley, 2025-01-01 |
| Series | Complexity |
| Online Access | http://dx.doi.org/10.1155/cplx/1884264 |
Similar Items
- Leveraging an Enhanced CodeBERT-Based Model for Multiclass Software Defect Prediction via Defect Classification
  by: Rida Ghafoor Hussain, et al.
  Published: (2025-01-01)
- Perceived MOOC satisfaction: A review mining approach using machine learning and fine-tuned BERTs
  by: Xieling Chen, et al.
  Published: (2025-06-01)
- Ontology-based prompt tuning for news article summarization
  by: A. R. S. Silva, et al.
  Published: (2025-02-01)
- Frozen Weights as Prior for Parameter-Efficient Fine-Tuning
  by: Xiaolong Ma, et al.
  Published: (2025-01-01)
- Amharic Folkloric Oral Traditions: Collections for Insiders and for Outsiders
  by: Peter Unseth, et al.
  Published: (2023-03-01)