ReLUtype
ReLUtype is a class of activation functions used in artificial neural networks that generalizes the rectified linear unit (ReLU) by extending the negative region with a configurable shape while keeping the positive region linear. The aim is to improve gradient flow, reduce dead neurons, and allow more expressive representations without introducing heavy nonlinearity.
In its simplest form, ReLUtype defines f(x) piecewise: f(x) = x for x ≥ 0; f(x) = g_t(x) for x < 0, where g_t is a configurable negative-branch function parameterized by t (for instance, a small linear slope, as in Leaky ReLU and PReLU).
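The piecewise definition above can be sketched in a few lines of Python. This is a minimal illustration, assuming the negative branch g_t(x) = t·x with a small slope parameter t (a Leaky-ReLU-like choice; the article leaves g_t configurable):

```python
def relutype(x: float, t: float = 0.01) -> float:
    """ReLUtype activation: identity on the positive region,
    configurable branch g_t(x) = t * x on the negative region.
    The choice of g_t here is an illustrative assumption."""
    return x if x >= 0 else t * x

print(relutype(3.0))   # positive region: identity, prints 3.0
print(relutype(-2.0))  # negative region: scaled by t, prints -0.02
```

Other choices of g_t (e.g. a saturating curve) drop in by replacing the `t * x` branch.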
Implementation and use: ReLUtype has been explored in feedforward networks, convolutional networks, and recurrent models as a drop-in replacement for the standard ReLU, with the goal of preserving cheap, mostly linear activations while mitigating the dying-neuron problem.
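As a drop-in nonlinearity, the activation simply follows the affine transform of a layer. The pure-Python sketch below shows this for a single dense unit; the weights, inputs, and slope t are illustrative assumptions, not values from any specific model:

```python
def relutype(x: float, t: float = 0.01) -> float:
    """Illustrative ReLUtype with an assumed linear negative branch."""
    return x if x >= 0 else t * x

def dense_forward(weights, bias, inputs, t=0.01):
    """One dense unit: affine transform, then the ReLUtype activation."""
    pre = sum(w * v for w, v in zip(weights, inputs)) + bias
    return relutype(pre, t)

# pre-activation = 1.0*1.0 + (-1.0)*3.0 + 0.0 = -2.0, so the
# negative branch applies and the output is 0.01 * -2.0 = -0.02
out = dense_forward([1.0, -1.0], bias=0.0, inputs=[1.0, 3.0])
print(out)  # prints -0.02
```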
See also: ReLU, Leaky ReLU, PReLU, GELU, activation function.