Sentence Embedding Generation Framework Based on Kullback–Leibler Divergence Optimization and RoBERTa Knowledge Distillation
In natural language processing (NLP) tasks, computing semantic textual similarity (STS) is crucial for capturing nuanced semantic differences in text. Traditional word vector methods, such as Word2Vec and GloVe, as well as deep learning models like BERT, face limitations in handling context dependencies…
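The abstract pairs Kullback–Leibler divergence optimization with RoBERTa knowledge distillation; the sketch below illustrates one common way those two pieces fit together for sentence embeddings, not the paper's exact method. The model names (`roberta-base` teacher, `distilroberta-base` student), the temperature, and the use of in-batch cosine-similarity distributions as the distillation targets are all illustrative assumptions.

```python
# Minimal sketch (assumptions flagged above): distill a RoBERTa teacher's
# sentence-similarity structure into a smaller student by minimizing the
# KL divergence between their in-batch similarity distributions.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

teacher = AutoModel.from_pretrained("roberta-base")        # assumed teacher
student = AutoModel.from_pretrained("distilroberta-base")  # assumed student
tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # shared vocabulary
teacher.eval()

def embed(model, texts):
    """Mean-pool token states into one sentence embedding per text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = model(**batch).last_hidden_state                # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()     # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)              # (B, H)

def kl_distillation_loss(texts, temperature=2.0):
    """KL(teacher || student) over row-wise softmaxed similarity matrices."""
    with torch.no_grad():                                    # teacher is frozen
        t = F.normalize(embed(teacher, texts), dim=-1)
    s = F.normalize(embed(student, texts), dim=-1)
    p = F.softmax(t @ t.T / temperature, dim=-1)             # teacher targets
    log_q = F.log_softmax(s @ s.T / temperature, dim=-1)     # student estimates
    return F.kl_div(log_q, p, reduction="batchmean")

loss = kl_distillation_loss(["A cat sits on a mat.", "A kitten rests on a rug."])
loss.backward()  # gradients flow only into the student's parameters
```

In this setup only the student is trained; the temperature softens both distributions so the student also learns the teacher's relative similarity rankings rather than only its top match.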
| Main Authors: | Jin Han, Liang Yang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2024-12-01 |
| Series: | Mathematics |
| Online Access: | https://www.mdpi.com/2227-7390/12/24/3990 |
Similar Items
- Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence
  by: Giovanni Pistone
  Published: (2025-03-01)
- Kullback–Leibler Divergence-Based Fault Detection Scheme for 100% Inverter Interfaced Autonomous Microgrids
  by: Ali Mallahi, et al.
  Published: (2025-06-01)
- Domain Alignment Dynamic Spectral and Spatial Feature Fusion for Hyperspectral Change Detection
  by: Xuexiang Qin, et al.
  Published: (2025-01-01)
- Assessing the Impact of Physical Activity on Dementia Progression Using Clustering and the MRI-Based Kullback–Leibler Divergence
  by: Agnieszka Wosiak, et al.
  Published: (2025-01-01)
- Enhanced Kullback–Leibler divergence based pilot protection for lines connecting battery energy storage stations
  by: Yingyu Liang, et al.
  Published: (2024-12-01)