ReLUxk
ReLUxk is a family of activation functions used in artificial neural networks, extending the standard rectified linear unit (ReLU) by introducing a nonlinear exponent parameter k > 0. The common mathematical form is f_k(x) = (max(0, x))^k. When k = 1, this reduces to ReLU. For x > 0 it yields x^k, and for x ≤ 0 it yields 0.
Behavior and gradients: For k > 1, small positive inputs have near-zero gradients, since the derivative for x > 0 is f_k'(x) = k·x^(k-1), which tends to 0 as x → 0⁺. Conversely, for 0 < k < 1 the derivative grows without bound as x approaches 0 from above, which can destabilize training. For x < 0 the gradient is 0, as with standard ReLU.
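The gradient behavior above can be checked directly from the closed form; the following is a minimal sketch (function names are illustrative, not from the source):

```python
def reluxk_grad(x: float, k: float) -> float:
    """Derivative of f_k(x) = max(0, x)**k: k * x**(k-1) for x > 0, else 0."""
    if x <= 0.0:
        return 0.0
    return k * x ** (k - 1)

# For k = 2, a small positive input gets a near-zero gradient:
g = reluxk_grad(0.01, 2.0)   # 2 * 0.01 = 0.02
# For k = 0.5, the gradient blows up near zero:
h = reluxk_grad(0.01, 0.5)   # 0.5 / sqrt(0.01) = 5.0
```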
Implementation and stability: The function is implemented with a standard max(0, x) operation followed by a pointwise power x^k. Because the thresholding guarantees a non-negative base, the power is well defined even for non-integer k. For k < 1, the unbounded gradient near zero is a potential source of numerical overflow, so implementations may clamp the input to the power away from zero in the backward pass.
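A vectorized forward/backward sketch of this recipe, assuming NumPy and an epsilon guard for the k < 1 case (the eps parameter is an illustrative stability choice, not part of the definition):

```python
import numpy as np

def reluxk(x: np.ndarray, k: float) -> np.ndarray:
    """Forward pass: f_k(x) = max(0, x)**k, applied elementwise."""
    return np.maximum(0.0, x) ** k

def reluxk_backward(x: np.ndarray, k: float, eps: float = 1e-6) -> np.ndarray:
    """Backward pass: k * x**(k-1) on the positive branch, 0 elsewhere.
    For k < 1 the derivative diverges as x -> 0+, so the base is clamped
    to at least eps before exponentiation (assumed guard)."""
    pos = np.maximum(x, eps)            # keep 0**(k-1) from overflowing
    grad = k * pos ** (k - 1.0)
    return np.where(x > 0.0, grad, 0.0)
```

With k = 1 this reduces to ReLU and its usual subgradient; with k = 2 the forward pass is a squared ReLU.
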
Usage and context: ReLUxk is discussed as a lightweight parametric alternative to ReLU that allows tuning of the curvature of the positive branch through the single exponent k, without introducing learned per-channel parameters as PReLU does.
See also: ReLU, Leaky ReLU, PReLU, Softmax, activation functions.
---