Feature fusion-based collaborative learning for knowledge distillation
Deep neural networks have achieved great success in a variety of applications, such as self-driving cars and intelligent robotics. Meanwhile, knowledge distillation has received increasing attention as an effective model-compression technique for training highly efficient deep models. The performance...
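The abstract above names knowledge distillation only in passing, so for orientation, here is a minimal sketch of the standard logit-based distillation loss (Hinton et al., 2015) in PyTorch. This is not the feature fusion-based collaborative scheme this article proposes; the function name, temperature `T`, and weight `alpha` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of a soft teacher-matching loss and a hard label loss.

    Hypothetical helper, not the article's method: standard logit
    distillation as in Hinton et al. (2015).
    """
    # Soften both distributions with temperature T. kl_div expects
    # log-probabilities for the input and probabilities for the target;
    # the T*T factor keeps gradient magnitudes comparable across T.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

if __name__ == "__main__":
    # Toy usage: random tensors stand in for student/teacher outputs.
    torch.manual_seed(0)
    student_logits = torch.randn(8, 10, requires_grad=True)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(f"distillation loss: {loss.item():.4f}")
```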
| Main Authors: | Yiting Li, Liyuan Sun, Jianping Gou, Lan Du, Weihua Ou |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2021-11-01 |
| Series: | International Journal of Distributed Sensor Networks |
| Online Access: | https://doi.org/10.1177/15501477211057037 |
Similar Items
- Real world federated learning with a knowledge distilled transformer for cardiac CT imaging
  by: Malte Tölle, et al.
  Published: (2025-02-01)
- Non‐Autoregressive Translation Algorithm Based on LLM Knowledge Distillation in English Corpus
  by: Fang Ju, et al.
  Published: (2025-01-01)
- Knowledge Distillation in Object Detection for Resource-Constrained Edge Computing
  by: Arief Setyanto, et al.
  Published: (2025-01-01)
- DM-KD: Decoupling Mixed-Images for Efficient Knowledge Distillation
  by: Jongkyung Im, et al.
  Published: (2025-01-01)
- Few-Shot Graph Anomaly Detection via Dual-Level Knowledge Distillation
  by: Xuan Li, et al.
  Published: (2025-01-01)