ReLUxp
ReLUxp is a parametric activation function proposed as a generalization of the rectified linear unit (ReLU). It is defined for a real input x and a positive shape parameter p > 0 by the expression f(x) = (max(0, x))^p. This means that for positive inputs, the output is x raised to the power p, while for non-positive inputs the function output remains zero. When p equals 1, ReLUxp reduces to the standard ReLU.
The parameter p controls the curvature of the activation on the positive side and thus influences gradient magnitudes during backpropagation: for p > 1 the gradient vanishes as x approaches zero from the right and grows for large positive inputs, while for p < 1 the gradient is amplified near zero and decays for large inputs.
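For positive inputs the derivative is f'(x) = p * x^(p-1), and it is zero for negative inputs. A minimal sketch of this gradient (the function name reluxp_grad is illustrative, not from any particular library; note that for p < 1 the derivative is unbounded as x approaches 0 from the right):

```python
def reluxp_grad(x, p=1.0):
    # Derivative of f(x) = max(0, x)**p:
    #   p * x**(p - 1) for x > 0, and 0 for x <= 0.
    return p * x ** (p - 1) if x > 0 else 0.0
```

With p = 1 this reduces to the familiar ReLU gradient, which is 1 for positive inputs and 0 otherwise.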
ReLUxp is straightforward to implement in modern neural network libraries, requiring a standard max operation followed by an elementwise exponentiation.
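A minimal NumPy sketch of the definition above (the function name reluxp is illustrative):

```python
import numpy as np

def reluxp(x, p=1.0):
    # ReLUxp: f(x) = (max(0, x))**p; reduces to standard ReLU when p == 1.
    return np.maximum(0.0, x) ** p
```

For example, `reluxp(np.array([-1.0, 0.5, 2.0]), p=2.0)` yields `[0.0, 0.25, 4.0]`: negative inputs are clipped to zero before the exponent is applied, so they stay zero.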