Decoupled Classifier Knowledge Distillation.

Mainstream knowledge distillation methods primarily include self-distillation, offline distillation, online distillation, output-based distillation, and feature-based distillation. While each approach has its own advantages, these methods are typically employed independently. Simply combining two distillation...
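The abstract in this record breaks off before the paper's own method is described. For orientation only, below is a minimal sketch of output-based (logit) distillation in the sense of Hinton et al. (2015), one of the categories the abstract lists; it is not the decoupled method proposed in the article, and the function name output_kd_loss along with the defaults T=4.0 and alpha=0.5 are illustrative assumptions.

import torch
import torch.nn.functional as F

def output_kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Illustrative sketch of vanilla output-based distillation,
    # not the decoupled classifier method of this article.
    # Soft-target term: KL divergence between temperature-softened
    # teacher and student distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: standard cross-entropy on ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example: batch of 8 samples, 10 classes
s = torch.randn(8, 10)
t = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
loss = output_kd_loss(s, t, y)

With alpha = 0 this reduces to ordinary supervised training; with alpha = 1 the student learns only from the teacher's softened outputs.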

Bibliographic Details
Main Authors: Hairui Wang, Mengjie Dong, Guifu Zhu, Ya Li
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0314267