A dynamic dropout self-distillation method for object segmentation
Abstract: There is a phenomenon in knowledge distillation whereby better teachers do not necessarily produce better students, owing to capacity mismatch. This is especially true in pixel-level object segmentation, where some challenging pixels are difficult for the student model to learn. Even if the student model l...
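To make the idea concrete, below is a minimal sketch of dropout-based self-distillation for per-pixel segmentation, assuming a PyTorch-style setup. The network names (`TinySegNet`), the linear dropout schedule, and the loss weighting are illustrative assumptions, not the authors' published implementation: the full network (dropout disabled) acts as its own teacher, while a dropout-weakened sub-network plays the student.

```python
# Illustrative sketch only: dropout self-distillation for segmentation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySegNet(nn.Module):
    """Toy per-pixel classifier with a dropout layer whose rate can be varied."""
    def __init__(self, in_ch=3, num_classes=2, p=0.5):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, 16, 3, padding=1)
        self.drop = nn.Dropout2d(p)
        self.conv2 = nn.Conv2d(16, num_classes, 1)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        x = self.drop(x)      # dropout carves out a weaker "student" sub-network
        return self.conv2(x)  # per-pixel class logits

def self_distillation_loss(model, images, labels, epoch, max_epochs,
                           T=2.0, alpha=0.5):
    # "Teacher" pass: dropout disabled in eval mode, i.e. the full network.
    model.eval()
    with torch.no_grad():
        teacher_logits = model(images)
    # Dynamic dropout: a linear decay schedule (an assumption for this sketch).
    model.drop.p = 0.5 * (1.0 - epoch / max_epochs)
    model.train()
    student_logits = model(images)
    # Per-pixel KL between softened teacher and student distributions.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# Usage: one illustrative step on random data.
model = TinySegNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(2, 3, 32, 32)
y = torch.randint(0, 2, (2, 32, 32))
loss = self_distillation_loss(model, x, y, epoch=0, max_epochs=10)
loss.backward()
opt.step()
```

Decaying the dropout rate over training is one plausible reading of "dynamic dropout": early on, the student sub-network is much weaker than the full-network teacher, and the gap narrows as training converges.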
Main Authors: Lei Chen, Tieyong Cao, Yunfei Zheng, Yang Wang, Bo Zhang, Jibin Yang
Format: Article
Language: English
Published: Springer, 2024-12-01
Series: Complex & Intelligent Systems
Online Access: https://doi.org/10.1007/s40747-024-01705-8
Similar Items
- Knowledge Distillation in Object Detection for Resource-Constrained Edge Computing
  by: Arief Setyanto, et al. Published: (2025-01-01)
- TPDTNet: Two-Phase Distillation Training for Visible-to-Infrared Unsupervised Domain Adaptive Object Detection
  by: Siyu Wang, et al. Published: (2025-01-01)
- Dropout rates and reasons for dropout among patients receiving clozapine
  by: Sandeep Grover, et al. Published: (2023-06-01)
- Reasons for school dropouts in suburban areas near Villupuram district: A retrospective study
  by: D. Arthi, et al. Published: (2024-01-01)
- A novel knowledge distillation framework for enhancing small object detection in blurry environments with unmanned aerial vehicle-assisted images
  by: Sayed Jobaer, et al. Published: (2024-12-01)