1. Leveraging logit uncertainty for better knowledge distillation. Published 2024-12-01. Subjects: knowledge distillation.
2. Confidence-Based Knowledge Distillation to Reduce Training Costs and Carbon Footprint for Low-Resource Neural Machine Translation. Published 2025-07-01. Subjects: knowledge distillation.
3. Autocorrelation Matrix Knowledge Distillation: A Task-Specific Distillation Method for BERT Models. Published 2024-10-01.
4. Pseudo Multi-Modal Approach to LiDAR Semantic Segmentation. Published 2024-12-01.
5. Optimizing Deep Learning Models for Resource-Constrained Environments With Cluster-Quantized Knowledge Distillation. Published 2025-05-01.
6. A Review of Knowledge Distillation in Object Detection. Published 2025-01-01.
7. Code summarization based on large model knowledge distillation. Published 2025-08-01. Subjects: code summarization; large model; knowledge distillation.
8. Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation. Published 2020-01-01.
9. Prune and Distill: A Novel Knowledge Distillation Method for GCNs-Based Recommender Systems. Published 2025-01-01.
10. Knowledge Distillation for Face Recognition Using Synthetic Data With Dynamic Latent Sampling. Published 2024-01-01.
11. Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection. Published 2025-01-01.
12. Optimal Knowledge Distillation through Non-Heuristic Control of Dark Knowledge. Published 2024-08-01.
13. Predicting Subsurface Layer Thickness and Seismic Wave Velocity Using Deep Learning: Knowledge Distillation Approach. Published 2025-01-01.
14. Knowledge distillation with resampling for imbalanced data classification: Enhancing predictive performance and explainability stability. Published 2024-12-01.
15. Knowledge Distillation with Geometry-Consistent Feature Alignment for Robust Low-Light Apple Detection. Published 2025-08-01.
16. M3AE-Distill: An Efficient Distilled Model for Medical Vision–Language Downstream Tasks. Published 2025-07-01.
17. Fully Quantized Neural Networks for Audio Source Separation. Published 2024-01-01.
18. A Malware Classification Method Based on Knowledge Distillation and Feature Fusion. Published 2025-01-01.
19. Exploring Synergy of Denoising and Distillation: Novel Method for Efficient Adversarial Defense. Published 2024-11-01.
20. Advancing Model Explainability: Visual Concept Knowledge Distillation for Concept Bottleneck Models. Published 2025-01-01.
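Every article above builds on knowledge distillation, where a small student model is trained to match the temperature-softened output distribution of a larger teacher. For orientation, here is a minimal sketch of the canonical soft-target objective (Hinton et al.) in plain Python; the function and parameter names (`distillation_loss`, `temperature`, `alpha`) are illustrative, not drawn from any of the listed papers.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature yields a softer
    # distribution that exposes the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label,
                      temperature=2.0, alpha=0.5):
    """Canonical KD objective:
    alpha * T^2 * KL(teacher || student) + (1 - alpha) * CE(hard label).
    The T^2 factor keeps the soft-target gradient scale comparable
    to the hard-label term as the temperature changes."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student))
    ce = -math.log(softmax(student_logits)[hard_label])
    return alpha * temperature ** 2 * kl + (1.0 - alpha) * ce
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the hard-label cross-entropy remains; `alpha` trades off imitation of the teacher against fitting the ground-truth labels.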