Convolutional Neural Network Compression via Dynamic Parameter Rank Pruning
While Convolutional Neural Networks (CNNs) excel at learning complex latent-space representations, their over-parameterization can lead to overfitting and reduced performance, particularly with limited data. This, alongside their high computational and memory demands, limits the applicability of CNN...
Main Authors: Manish Sharma, Jamison Heard, Eli Saber, Panagiotis Markopoulos
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10851278/
Similar Items
- SymbolNet: neural symbolic regression with adaptive dynamic pruning for compression
  by: Ho Fung Tsoi, et al.
  Published: (2025-01-01)
- A low functional redundancy-based network slimming method for accelerating deep neural networks
  by: Zheng Fang, et al.
  Published: (2025-04-01)
- Proposal of an objective formula-based model for equitable ranking of veterinary colleges
  by: Robert M. Gogal, et al.
  Published: (2025-02-01)
- Participation in international university rankings as a factor of improving the quality of teaching and learning
  by: Sergey V. Ablameyko, et al.
  Published: (2022-05-01)
- Low-Rank Adaptation of Pre-Trained Large Vision Models for Improved Lung Nodule Malignancy Classification
  by: Benjamin P. Veasey, et al.
  Published: (2025-01-01)