SqrtReLUx
SqrtReLUx is a nonlinear activation function defined by f(x) = sqrt(max(0, x)). It applies the standard rectified linear unit (ReLU) to x and then takes the square root of the result, yielding zero for nonpositive inputs and a square-root curve for positive inputs.
For any real x, f(x) = sqrt(max(0, x)). The domain is all real numbers, and the range is [0, ∞).
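The definition above can be sketched directly; this is a minimal illustration, and the function name sqrt_relu is chosen here for convenience:

```python
import math

def sqrt_relu(x: float) -> float:
    # SqrtReLUx: square root of the positive part of x.
    # max(0, x) is ReLU; the sqrt is then always applied to a nonnegative value.
    return math.sqrt(max(0.0, x))

print(sqrt_relu(4.0))   # 2.0
print(sqrt_relu(0.0))   # 0.0
print(sqrt_relu(-3.0))  # 0.0 (negative inputs are clipped before the root)
```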
- Monotonicity and continuity: f is nondecreasing and continuous on the real line.
- Differentiability: f is differentiable for x > 0 with f'(x) = 1/(2 sqrt(x)); for x < 0, f'(x) = 0; at x = 0 the function is not differentiable, since the right-hand difference quotient sqrt(h)/h = 1/sqrt(h) diverges as h → 0+.
- Shape: the left region is flat at zero (x ≤ 0), and the right region follows a concave square-root curve that grows without bound but increasingly slowly.
- Second derivative: f''(x) = 0 for x < 0, and f''(x) = -1/(4 x^(3/2)) for x > 0.
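The derivative formulas listed above can be checked numerically with a central finite difference; this is a sketch, with helper names of my own choosing:

```python
import math

def sqrt_relu(x):
    return math.sqrt(max(0.0, x))

def sqrt_relu_grad(x):
    # Analytic derivative: 1/(2 sqrt(x)) for x > 0, 0 for x < 0 (undefined at 0).
    return 0.0 if x <= 0 else 1.0 / (2.0 * math.sqrt(x))

h = 1e-6
for x in (0.25, 1.0, 4.0):
    numeric = (sqrt_relu(x + h) - sqrt_relu(x - h)) / (2.0 * h)
    print(x, sqrt_relu_grad(x), numeric)
```

For x = 0.25, 1.0, and 4.0 the analytic values are 1.0, 0.5, and 0.25, and the finite-difference estimates agree to several decimal places away from the kink at 0.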
SqrtReLUx can be viewed as the composition of ReLU with a power-1/2 transform: f(x) = (ReLU(x))^(1/2). It therefore inherits ReLU's nonnegativity and one-sided flatness while compressing large positive values.
In neural networks, SqrtReLUx can serve as a pointwise activation whose gradient gradually diminishes for large positive inputs, damping large pre-activations without truncating them.
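As a pointwise activation it applies independently to every entry of a tensor; a minimal vectorized sketch using NumPy (the library choice is an assumption, not prescribed by the source):

```python
import numpy as np

def sqrt_relu(x):
    # Elementwise sqrt(max(0, x)); np.maximum clips negatives before the root,
    # so np.sqrt never sees a negative argument.
    return np.sqrt(np.maximum(0.0, x))

z = np.array([-2.0, 0.0, 1.0, 9.0])
print(sqrt_relu(z))  # [0. 0. 1. 3.]
```

Applied to a layer's pre-activations, this maps negatives to zero exactly as ReLU does, while large positive values grow only as their square root.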