Softsign
Softsign is a smooth activation function used in neural networks. It is defined by f(x) = x / (1 + |x|). The function maps the entire real line to the interval (-1, 1) and is monotone increasing, with f(0) = 0. The derivative is f'(x) = 1 / (1 + |x|)^2, which is positive for all x and tends to zero as |x| grows, ensuring a saturating behavior for large inputs but with a relatively gentle gradient decay compared to some other saturating activations.
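The function and its derivative are simple to implement directly. The following is a minimal NumPy sketch (function names and the test values are illustrative, not taken from any particular library), with a finite-difference check that the analytic derivative matches the definition.

```python
import numpy as np

def softsign(x):
    """Softsign activation: f(x) = x / (1 + |x|), mapping R onto (-1, 1)."""
    return x / (1.0 + np.abs(x))

def softsign_grad(x):
    """Derivative f'(x) = 1 / (1 + |x|)^2, positive everywhere and -> 0 as |x| grows."""
    return 1.0 / (1.0 + np.abs(x)) ** 2

x = np.linspace(-5.0, 5.0, 11)
print(softsign(x))        # outputs stay strictly inside (-1, 1)
print(softsign_grad(x))   # the gradient peaks at 1 when x = 0

# Quick finite-difference check of the analytic derivative.
eps = 1e-6
numeric = (softsign(x + eps) - softsign(x - eps)) / (2 * eps)
assert np.allclose(numeric, softsign_grad(x))
```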
Properties of softsign include its smoothness and differentiability at all points, including zero. Because it is an odd function, its outputs are zero-centered, and because it approaches its asymptotes only polynomially (for x > 0, f(x) = 1 - 1/(1 + x)), it saturates more gradually than tanh, whose tails decay exponentially.
In practice, softsign has been explored as an alternative to tanh or sigmoid activations in both feedforward and recurrent networks, where its gentler saturation can slow the decay of gradients for large inputs; an illustration of such use follows.
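As a sketch of this kind of use, the example below builds a small feedforward network in PyTorch with the built-in torch.nn.Softsign module in place of tanh; the layer widths, input dimensionality, and random data are placeholders chosen only for illustration.

```python
import torch
import torch.nn as nn

# A small feedforward classifier using Softsign activations.
# Layer widths and the input size are arbitrary placeholders.
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.Softsign(),       # elementwise x / (1 + |x|)
    nn.Linear(32, 32),
    nn.Softsign(),
    nn.Linear(32, 2),
)

x = torch.randn(8, 16)   # batch of 8 random inputs
logits = model(x)
print(logits.shape)      # torch.Size([8, 2])

# The functional form is also available:
y = torch.nn.functional.softsign(torch.linspace(-5.0, 5.0, 5))
print(y)                 # values lie strictly within (-1, 1)
```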