Decoupled Classifier Knowledge Distillation.
Mainstream knowledge distillation methods include self-distillation, offline distillation, online distillation, output-based distillation, and feature-based distillation. While each approach has its own advantages, these techniques are typically employed in isolation. Simply combining two dist...
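As background for the output-based distillation the abstract mentions, below is a minimal sketch of the classic soft-target logit loss in the style of Hinton et al. (2015). This is a generic illustration and not the decoupled method proposed in this paper; the temperature `T` and mixing weight `alpha` are illustrative assumptions.

```python
# Minimal sketch of output-based (logit) knowledge distillation.
# Assumptions: T (temperature) and alpha (soft/hard mixing weight)
# are illustrative hyperparameters, not values from this paper.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of soft-target KL divergence and hard-label cross-entropy."""
    # Soft targets: match the student's temperature-scaled distribution
    # to the teacher's temperature-scaled distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients are comparable to the hard term
    # Hard targets: standard cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```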
| Main Authors: | Hairui Wang, Mengjie Dong, Guifu Zhu, Ya Li |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Public Library of Science (PLoS), 2025-01-01 |
| Series: | PLoS ONE |
| Online Access: | https://doi.org/10.1371/journal.pone.0314267 |
Similar Items
- Decoupled Time-Dimensional Progressive Self-Distillation With Knowledge Calibration for Edge Computing-Enabled AIoT
  by: Yingchao Wang, et al.
  Published: (2024-01-01)
- A Lightweight and Small Sample Bearing Fault Diagnosis Algorithm Based on Probabilistic Decoupling Knowledge Distillation and Meta-Learning
  by: Hao Luo, et al.
  Published: (2024-12-01)
- Enhance Social Network Bullying Detection Using Multi-Teacher Knowledge Distillation With XGBoost Classifier
  by: Sathit Prasomphan
  Published: (2025-01-01)
- Timestamp-Guided Knowledge Distillation for Robust Sensor-Based Time-Series Forecasting
  by: Jiahe Yan, et al.
  Published: (2025-07-01)
- Aeroengine Remaining Life Prediction Using Feature Selection and Improved SE Blocks
  by: Hairui Wang, et al.
  Published: (2024-01-01)