Extending Embedding Representation by Incorporating Latent Relations
The semantic representation of words is a fundamental task in natural language processing and text mining. Learning word embeddings has shown its power on various tasks. Most studies aim to generate an embedding representation of a word by encoding its context information. However, many la...
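The abstract's premise, that most methods learn a word's embedding by encoding its distributional context, can be illustrated with a minimal skip-gram sketch. The example below is not the paper's method; it uses gensim's Word2Vec (assuming gensim 4.x, where the dimensionality parameter is `vector_size`), and the toy corpus and hyperparameters are illustrative only.

```python
# Minimal sketch of context-based word embedding (skip-gram).
# Assumes gensim 4.x; toy corpus and hyperparameters are illustrative.
from gensim.models import Word2Vec

# Hypothetical toy corpus: each sentence is a list of tokens.
corpus = [
    ["semantic", "representation", "of", "words"],
    ["word", "embedding", "encodes", "context", "information"],
    ["context", "windows", "define", "word", "cooccurrence"],
]

# sg=1 selects skip-gram: the model predicts surrounding context words
# from the center word, so words that share contexts get similar vectors.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # embedding dimensionality
    window=2,         # context window size on each side
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
    seed=42,
)

# Each word now maps to a dense vector learned from its contexts.
vec = model.wv["context"]
print(vec.shape)                              # (50,)
print(model.wv.most_similar("word", topn=3))  # nearest neighbors by context
```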
| Main Authors: | Gao Yang, Wang Wenbo, Liu Qian, Huang Heyan, Yuefeng Li |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2018-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/8444048/ |
Similar Items

- Slovene and Croatian word embeddings in terms of gender occupational analogies
  by: Matej Ulčar, et al.
  Published: (2021-07-01)
- Enhancing Word Embeddings for Improved Semantic Alignment
  by: Julian Szymański, et al.
  Published: (2024-12-01)
- Natural language processing-based approach for automatically coding ship sensor data
  by: Yunhui Kim, et al.
  Published: (2024-01-01)
- Combining computational linguistics with sentence embedding to create a zero-shot NLIDB
  by: Yuriy Perezhohin, et al.
  Published: (2024-12-01)
- A Hybrid Semantic Representation Method Based on Fusion Conceptual Knowledge and Weighted Word Embeddings for English Texts
  by: Zan Qiu, et al.
  Published: (2024-11-01)