Crowd counting at the edge using weighted knowledge distillation
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-04-01 |
| Series: | Scientific Reports |
| Online Access: | https://doi.org/10.1038/s41598-025-90750-5 |
| Summary: | Abstract Visual crowd counting has gained significant attention in recent years. Consistent contributions to this topic have now addressed several inherent challenges such as scale variation, occlusion, and cross-scene application. However, these works focus on improving accuracy and often ignore model size and computational complexity. Several practical applications run crowd models on resource-limited stand-alone devices such as drones and require real-time inference. Although there have been some good efforts to develop lightweight, shallow crowd models offering fast inference, the literature dedicated to lightweight crowd counting is limited. One possible reason is that lightweight deep-learning models suffer accuracy degradation in complex scenes due to their limited generalization capability. This paper addresses this problem by proposing knowledge distillation to improve the learning capability of lightweight crowd models. Knowledge distillation enables a lightweight model to emulate a deeper model by transferring the knowledge learned by the deeper model during training (a minimal illustrative sketch of this setup follows the record). The paper presents a detailed experimental analysis of three lightweight crowd models over six benchmark datasets. The results show the clear significance of the proposed method, supported by several ablation studies. |
| ISSN: | 2045-2322 |
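
The abstract describes a standard teacher-student distillation setup, where a compact student regresses crowd-density maps under joint supervision from ground-truth annotations and a deeper pretrained teacher. The sketch below is a minimal PyTorch illustration of that general pattern, not the paper's implementation: the model, the loss weighting `alpha`, and all tensor shapes are illustrative assumptions.

```python
# Minimal sketch of weighted knowledge distillation for crowd counting.
# Models, weights, and shapes are hypothetical; only the general
# teacher-student pattern described in the abstract is shown.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyCounter(nn.Module):
    """Hypothetical lightweight student that regresses a density map."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 1),  # single-channel density map
        )

    def forward(self, x):
        return self.features(x)

def distillation_loss(student_map, teacher_map, gt_map, alpha=0.5):
    """Weighted sum of ground-truth loss and teacher-imitation loss.

    alpha balances hard supervision (annotated density) against soft
    supervision (the deeper teacher's density map); 0.5 is an
    illustrative assumption, not the paper's setting.
    """
    hard = F.mse_loss(student_map, gt_map)       # match annotations
    soft = F.mse_loss(student_map, teacher_map)  # emulate the teacher
    return alpha * hard + (1.0 - alpha) * soft

# One training step, assuming `teacher` is a pretrained deep counter
# (a stand-in TinyCounter is used here so the sketch runs end to end).
student, teacher = TinyCounter(), TinyCounter()
teacher.eval()
opt = torch.optim.Adam(student.parameters(), lr=1e-4)

images = torch.randn(2, 3, 128, 128)     # dummy image batch
gt_density = torch.rand(2, 1, 128, 128)  # dummy density annotations
with torch.no_grad():
    teacher_density = teacher(images)    # soft targets from the teacher

opt.zero_grad()
loss = distillation_loss(student(images), teacher_density, gt_density)
loss.backward()
opt.step()
```

In practice the teacher would be a deep, accurate counting network trained beforehand, and the balance between the two loss terms is plausibly what the "weighted" in the article's title refers to; the paper's exact loss formulation and its six benchmark datasets are detailed in the article itself.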