Global Universality of the Two-Layer Neural Network with the k-Rectified Linear Unit
This paper concerns the universality of two-layer neural networks with the k-rectified linear unit activation function (k = 1, 2, …), under a suitable norm and without any restriction on the shape of the domain in the real line. This type of result is called global universality, which extends the previo...
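The k-rectified linear unit named in the abstract is the activation x ↦ max(0, x)^k, which reduces to the ordinary ReLU at k = 1. The following is a minimal one-dimensional sketch of a two-layer network built on this activation; the function and parameter names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relu_k(x, k=1):
    """k-rectified linear unit: max(0, x)**k (k=1 is the ordinary ReLU)."""
    return np.maximum(0.0, x) ** k

def two_layer_net(x, weights, biases, coeffs, k=1):
    """Two-layer network on the real line: sum_j coeffs[j] * relu_k(weights[j]*x + biases[j], k).

    A hypothetical sketch of the architecture whose universality the paper
    studies; the parameterization here is a standard choice, not the paper's.
    """
    hidden = relu_k(np.outer(weights, x) + biases[:, None], k)  # shape (m, n)
    return coeffs @ hidden                                      # shape (n,)

# relu_k is piecewise polynomial of degree k:
x = np.array([-1.0, 0.0, 2.0])
print(relu_k(x, k=2))  # [0. 0. 4.]
```

Universality here means that, for suitable norms, finite sums of this form can approximate any target function in the relevant space arbitrarily well as the width grows.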
Main Authors: Naoya Hatano, Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano
Format: Article
Language: English
Published: Wiley, 2024-01-01
Series: Journal of Function Spaces
Online Access: http://dx.doi.org/10.1155/2024/3262798
Similar Items
- AReLU: Agile Rectified Linear Unit for Improving Lightweight Convolutional Neural Networks
  by: Fu Chen, et al. Published: (2025-01-01)
- Feature Representations Using the Reflected Rectified Linear Unit (RReLU) Activation
  by: Chaity Banerjee, et al. Published: (2020-06-01)
- A perturbation-based model for rectifier circuits
  by: Vipin B. Vats, et al. Published: (2006-01-01)
- Overexpression of Delayed Rectifier K+ Channels Promotes In situ Proliferation of Leukocytes in Rat Kidneys with Advanced Chronic Renal Failure
  by: Itsuro Kazama, et al. Published: (2012-01-01)
- RECTIFYING THE UNDERSTANDING AGAINST THE CONCEPT OF ‘ADALAH AL-SAHABAH’
  by: Muhammad Mutiullah Published: (2023-08-01)