Advancing Model Explainability: Visual Concept Knowledge Distillation for Concept Bottleneck Models
This study explores the integration of concept bottleneck models (CBMs) with knowledge distillation (KD) while preserving the locality characteristics of the CBM. Although KD proves effective in model compression, compressed models often lack interpretability in their decision-making process. We enh...
| Main Authors: | Ju-Hwan Lee, Dang Thanh Vu, Nam-Kyung Lee, Il-Hong Shin, Jin-Young Kim |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-01-01 |
| Series: | Applied Sciences |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2076-3417/15/2/493 |
Similar Items
- Explainability and Interpretability in Concept and Data Drift: A Systematic Literature Review
  by: Daniele Pelosi, et al.
  Published: (2025-07-01)
- Head information bottleneck (HIB): leveraging information bottleneck for efficient transformer head attribution and pruning
  by: Yukun Qian, et al.
  Published: (2025-07-01)
- Enhancing Bottleneck Concept Learning in Image Classification
  by: Xingfu Cheng, et al.
  Published: (2025-04-01)
- Knowledge distillation with resampling for imbalanced data classification: Enhancing predictive performance and explainability stability
  by: Kazuki Fujiwara
  Published: (2024-12-01)
- Multilayer Concept Drift Detection Method Based on Model Explainability
  by: Haolan Zhang, et al.
  Published: (2024-01-01)