Progressive Bitwidth Assignment Approaches for Efficient Capsule Networks Quantization
Capsule Networks (CapsNets) are a class of neural network architectures that model hierarchical relationships more accurately, owing to their capsule-based structure and dynamic routing algorithms. However, their high accuracy comes at the cost of significant memory and computational reso...
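The record truncates the abstract before the method is described, but the general idea behind progressive bitwidth assignment can be illustrated with a minimal sketch: quantize each layer's weights with a uniform symmetric quantizer, assigning progressively lower bitwidths to later layers. The function names `quantize` and `progressive_bitwidths` and the linear bitwidth schedule are illustrative assumptions, not the authors' actual scheme.

```python
import numpy as np

def quantize(weights, bits):
    """Symmetric uniform quantization of a weight array to `bits` bits,
    returning the dequantized (simulated-quantization) values."""
    qmax = 2 ** (bits - 1) - 1                 # e.g. 127 for 8 bits
    scale = np.max(np.abs(weights)) / qmax     # map largest weight to qmax
    q = np.clip(np.round(weights / scale), -qmax, qmax)
    return q * scale

def progressive_bitwidths(num_layers, start_bits=8, end_bits=2):
    """Assign monotonically decreasing bitwidths from the first layer to the
    last (a hypothetical linear schedule, for illustration only)."""
    span = max(num_layers - 1, 1)
    return [round(start_bits - i * (start_bits - end_bits) / span)
            for i in range(num_layers)]

if __name__ == "__main__":
    layers = [np.random.randn(16, 16) for _ in range(4)]
    bits = progressive_bitwidths(len(layers))      # [8, 6, 4, 2]
    quantized = [quantize(w, b) for w, b in zip(layers, bits)]
```

Lower bitwidths in later layers trade accuracy for memory; real mixed-precision schemes typically pick per-layer bitwidths from a sensitivity analysis rather than a fixed linear ramp.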
Main Authors: Mohsen Raji, Amir Ghazizadeh Ahsaei, Kimia Soroush, Behnam Ghavami
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10854429/
Similar Items
- Uncertainty-based quantization method for stable training of binary neural networks
  by: A.V. Trusov, et al. Published: (2024-08-01)
- Enhanced Vector Quantization for Embedded Machine Learning: A Post-Training Approach With Incremental Clustering
  by: Thommas K. S. Flores, et al. Published: (2025-01-01)
- Enhancing Image-Based JPEG Compression: ML-Driven Quantization via DCT Feature Clustering
  by: Shahrzad Sabzavi, et al. Published: (2025-01-01)
- Meaningful Multimodal Emotion Recognition Based on Capsule Graph Transformer Architecture
  by: Hajar Filali, et al. Published: (2025-01-01)
- NIGWO-iCaps NN: A Method for the Fault Diagnosis of Fiber Optic Gyroscopes Based on Capsule Neural Networks
  by: Nan Lu, et al. Published: (2025-01-01)