DM-KD: Decoupling Mixed-Images for Efficient Knowledge Distillation
Knowledge distillation (KD) is a model-compression method: it extracts valuable knowledge from a high-performance, high-capacity teacher model and transfers it to a target student model of relatively small capacity. However, we discover that naively applying mixed...
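For orientation, the abstract refers to vanilla knowledge distillation. A minimal PyTorch sketch of the standard distillation loss (Hinton et al., 2015) is given below; it is not the paper's DM-KD decoupling scheme, and the function name `kd_loss` and the values of `T` and `alpha` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Standard KD loss: a weighted sum of the KL divergence between
    temperature-softened teacher and student distributions and the
    usual cross-entropy on ground-truth labels. T and alpha are
    illustrative hyperparameters, not values from the paper."""
    log_p_teacher = F.log_softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # Scale the KL term by T^2 so its gradient magnitude stays
    # comparable across temperatures (as in Hinton et al.).
    distill = F.kl_div(log_p_student, log_p_teacher,
                       reduction="batchmean", log_target=True) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * hard
```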
| Main Authors: | Jongkyung Im, Younho Jang, Junpyo Lim, Taegoo Kang, Chaoning Zhang, Sung-Ho Bae |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/10819346/ |
Similar Items
- Effects of different wet distillers’ grains ratios on fermentation quality, nitrogen fractions and bacterial communities of total mixed ration silage
  by: Ermei Du, et al.
  Published: (2025-01-01)
- Knowledge Distillation in Object Detection for Resource-Constrained Edge Computing
  by: Arief Setyanto, et al.
  Published: (2025-01-01)
- Adopting augmented reality into retailing mix strategy: Generation Z’s perspective in Egypt
  by: Marwa Mahmoud Ibrahim, et al.
  Published: (2025-02-01)
- SIMULATION AND OPTIMIZATION AS A TOOL FOR THE DEVELOPMENT OF HIGH EFFECTIVE TECHNOLOGICAL SCHEMES OF DISTILLATION
  by: A. V. Timoshenko, et al.
  Published: (2017-06-01)
- Design of stepped monopole antennas with a novel decoupling structure based on characteristic mode analysis
  by: Myeong-Jun Kang, et al.
  Published: (2025-01-01)