AReLU: Agile Rectified Linear Unit for Improving Lightweight Convolutional Neural Networks
Dynamic activation functions usually bring notable improvements to neural networks. Dynamic activation functions that depend on the input features perform better than input-independent ones, but these gains come with extra memory and computational cost, which is non-negligible for...
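The distinction the abstract draws can be illustrated with a minimal sketch: a static parametric activation (e.g. PReLU) learns one negative slope per channel that is fixed after training, while an input-dependent (dynamic) activation predicts its parameters from the current feature map, which is where the extra memory and computation come from. The PyTorch sketch below is a generic illustration under these assumptions, not the AReLU formulation proposed in this article; the module names and the pooling-plus-linear parameter generator are illustrative choices only.

```python
import torch
import torch.nn as nn

class StaticPReLU(nn.Module):
    """Input-independent: one learned negative slope per channel, fixed at inference."""
    def __init__(self, channels: int):
        super().__init__()
        self.slope = nn.Parameter(torch.full((1, channels, 1, 1), 0.25))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x >= 0, x, self.slope * x)

class DynamicReLU(nn.Module):
    """Input-dependent (hypothetical sketch): the negative slope is predicted from the
    incoming feature map, at the cost of an extra pooling step and two small linear layers."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # keep the predicted slope in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Per-sample, per-channel slope computed from the input itself.
        slope = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return torch.where(x >= 0, x, slope * x)

if __name__ == "__main__":
    x = torch.randn(2, 16, 8, 8)
    print(StaticPReLU(16)(x).shape, DynamicReLU(16)(x).shape)
```

The extra parameters and the per-forward-pass slope computation in the dynamic variant are exactly the overhead the abstract flags as non-negligible for lightweight networks.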
Saved in:
Main Authors: | Fu Chen, Yepeng Guan |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2025-01-01 |
Series: | IEEE Access |
Subjects: | |
Online Access: | https://ieeexplore.ieee.org/document/10843665/ |
Similar Items
- Feature Representations Using the Reflected Rectified Linear Unit (RReLU) Activation
  by: Chaity Banerjee, et al.
  Published: (2020-06-01)
- A Lightweight CNN-Transformer Implemented via Structural Re-Parameterization and Hybrid Attention for Remote Sensing Image Super-Resolution
  by: Jie Wang, et al.
  Published: (2024-12-01)
- Identification of Buffalo Breeds Using Self-Activated-Based Improved Convolutional Neural Networks
  by: Yuanzhi Pan, et al.
  Published: (2022-09-01)
- Filtering Approaches and Mish Activation Function Applied on Handwritten Chinese Character Recognition
  by: Zhong Yingna, et al.
  Published: (2024-11-01)
- Global Universality of the Two-Layer Neural Network with the k-Rectified Linear Unit
  by: Naoya Hatano, et al.
  Published: (2024-01-01)