Safe Semi-Supervised Contrastive Learning Using In-Distribution Data as Positive Examples
Semi-supervised learning (SSL) methods have shown promising results on many practical problems when only a few labels are available. Existing methods assume that the class distributions of labeled and unlabeled data are equal; however, their performance degrades significantly in cla...
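The title and abstract suggest a contrastive objective in which only unlabeled samples judged to be in-distribution are pulled toward the anchor as positives, while the remainder serve as negatives. The sketch below is a minimal illustration of that general idea, not the authors' method: the OOD score, threshold, and exact loss shape are all assumptions for exposition.

```python
import numpy as np

def cosine_sim(a, b):
    """Pairwise cosine similarity between rows of a and rows of b."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def safe_contrastive_loss(anchors, candidates, ood_scores, tau=0.5, threshold=0.5):
    """InfoNCE-style loss where only candidates with an out-of-distribution
    score below `threshold` count as positives; all candidates appear in the
    denominator as in standard contrastive learning.

    anchors:    (A, d) anchor embeddings
    candidates: (C, d) unlabeled embeddings
    ood_scores: (C,)   hypothetical OOD scores, lower = more in-distribution
    """
    sim = np.exp(cosine_sim(anchors, candidates) / tau)  # (A, C)
    pos_mask = ood_scores < threshold                    # (C,) boolean
    if not pos_mask.any():
        return 0.0  # no in-distribution positives: loss contributes nothing
    denom = sim.sum(axis=1, keepdims=True)
    # average negative log-probability over in-distribution positives
    log_probs = np.log(sim[:, pos_mask] / denom)
    return float(-log_probs.mean())
```

Under this sketch, raising `threshold` admits more unlabeled samples as positives, trading safety against the amount of unlabeled signal used.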
| Main Authors: | Min Gu Kwak, Hyungu Kahng, Seoung Bum Kim |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/11016683/ |
Similar Items
- PC-Match: Semi-Supervised Learning With Progressive Contrastive and Consistency Regularization
  by: Mikyung Kang, et al.
  Published: (2025-01-01)
- Separated and Independent Contrastive Semi-Supervised Learning for Imbalanced Datasets
  by: Dongyoung Kim, et al.
  Published: (2025-01-01)
- Semi-Supervised Burn Depth Segmentation Network with Contrast Learning and Uncertainty Correction
  by: Dongxue Zhang, et al.
  Published: (2025-02-01)
- A deep semi-supervised learning approach to the detection of glaucoma on out-of-distribution retinal fundus image datasets
  by: Lei Wang, et al.
  Published: (2025-05-01)
- Proposal of self and semi-supervised learning for imbalanced classification of coronary heart disease tabular data
  by: Danny Xie-Li, et al.
  Published: (2024-09-01)