DM-KD: Decoupling Mixed-Images for Efficient Knowledge Distillation

Knowledge distillation (KD) is a model-compression method that extracts valuable knowledge from a high-performance, high-capacity teacher model and transfers it to a target student model with relatively small capacity. However, we discover that naively applying mixed...
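As background for the abstract's framing, the classic logit-based KD objective (Hinton et al.) can be sketched as below. This is a generic illustration of teacher-to-student distillation, not the paper's DM-KD method; the temperature `T` and mixing weight `alpha` are assumed placeholder hyperparameters.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic KD loss: a weighted sum of the soft-target KL divergence
    (teacher vs. student) and the hard-label cross-entropy.
    T and alpha are illustrative values, not taken from this paper."""
    # Soften both distributions with temperature T; scale by T^2 so
    # gradient magnitudes stay comparable across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard supervised loss on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```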


Bibliographic Details
Main Authors: Jongkyung Im, Younho Jang, Junpyo Lim, Taegoo Kang, Chaoning Zhang, Sung-Ho Bae
Format: Article
Language: English
Published: IEEE 2025-01-01
Series:IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10819346/