Improving Word Embedding Using Variational Dropout
Pre-trained word embeddings are essential in natural language processing (NLP). In recent years, many post-processing algorithms have been proposed to improve pre-trained word embeddings. We present a novel method, Orthogonal Auto Encoder with Variational Dropout (OAEVD), for improving word embe...
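The abstract names the two ingredients of OAEVD: an autoencoder with an orthogonality constraint, and dropout applied to the embeddings. As a hedged illustration only (the paper's exact architecture, loss weights, and dropout scheme are not given in this record), a minimal sketch of such an objective might combine a reconstruction loss, a tied-weight linear autoencoder, and a soft orthogonality penalty on the encoder weights; the function names and hyperparameters below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(X, W, drop_p=0.2):
    # Inverted dropout on the input embeddings (a common formulation;
    # the paper's variational-dropout variant may differ).
    mask = rng.random(X.shape) > drop_p
    X_drop = X * mask / (1.0 - drop_p)
    Z = X_drop @ W        # linear encoder
    X_hat = Z @ W.T       # decoder with tied weights (an assumption)
    return X_hat, Z

def oaevd_loss(X, W, lam=0.1):
    # Reconstruction error plus a soft orthogonality penalty ||W^T W - I||^2.
    X_hat, _ = forward(X, W)
    recon = np.mean((X - X_hat) ** 2)
    d = W.shape[1]
    ortho = np.sum((W.T @ W - np.eye(d)) ** 2)
    return recon + lam * ortho

X = rng.normal(size=(100, 50))                   # toy stand-in for pre-trained embeddings
W = np.linalg.qr(rng.normal(size=(50, 20)))[0]   # near-orthogonal initialization
print(oaevd_loss(X, W))
```

With the QR-based initialization the orthogonality penalty starts near zero, so the loss is dominated by the reconstruction term; training would then update `W` by gradient descent on this objective.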
| Main Authors: | Zainab Albujasim, Diana Inkpen, Xuejun Han, Yuhong Guo |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | LibraryPress@UF, 2023-05-01 |
| Series: | Proceedings of the International Florida Artificial Intelligence Research Society Conference |
| Online Access: | https://journals.flvc.org/FLAIRS/article/view/133326 |
Similar Items
- A Collection of Swedish Diachronic Word Embedding Models Trained on Historical Newspaper Data
  by: Simon Hengchen, et al.
  Published: (2021-01-01)
- Enhancing Word Embeddings for Improved Semantic Alignment
  by: Julian Szymański, et al.
  Published: (2024-12-01)
- Slovene and Croatian word embeddings in terms of gender occupational analogies
  by: Matej Ulčar, et al.
  Published: (2021-07-01)
- Word Embedding for Semantically Relative Words: an Experimental Study
  by: Maria S. Karyaeva, et al.
  Published: (2018-12-01)
- Advancing Arabic Word Embeddings: A Multi-Corpora Approach with Optimized Hyperparameters and Custom Evaluation
  by: Azzah Allahim, et al.
  Published: (2024-11-01)