Softplus
Softplus is a smooth, differentiable activation function commonly used in artificial neural networks. It maps real‑valued inputs \(x\) onto positive outputs via the formula \(f(x)=\ln(1+e^x)\). As \(x\) tends to infinity, \(f(x)\) approaches \(x\), while for large negative \(x\) it saturates near zero. This smooth approximation to the rectified linear unit (ReLU) preserves mathematical properties that benefit gradient‑based optimization.
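As a minimal illustration (not tied to any particular framework; the function name and test inputs are chosen here for demonstration), the following Python/NumPy sketch evaluates the formula above in a numerically stable form, using the algebraically equivalent rearrangement \(\max(x,0)+\ln(1+e^{-|x|})\):

import numpy as np

def softplus(x):
    # ln(1 + e^x), rearranged so that exp() is only ever applied to
    # non-positive arguments; this avoids overflow for large positive x.
    return np.maximum(x, 0) + np.log1p(np.exp(-np.abs(x)))

# Behaviour matches the limits described above:
print(softplus(np.array([-50.0, 0.0, 50.0])))  # approximately [0.0, 0.693, 50.0]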
The derivative of softplus is the logistic sigmoid: \(f'(x)=1/(1+e^{-x})\), which is bounded between 0 and 1. Unlike the derivative of ReLU, which jumps from 0 to 1 at the origin and is exactly zero for negative inputs, the softplus gradient is smooth and strictly positive everywhere.
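For reference, this identity follows directly by differentiating the definition above:
\[
\frac{d}{dx}\,\ln\!\left(1+e^{x}\right)=\frac{e^{x}}{1+e^{x}}=\frac{1}{1+e^{-x}}.
\]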
Practitioners often select softplus in situations where smoothness is critical, such as in probabilistic models or when a network output must be constrained to be strictly positive (for example, a variance or rate parameter); a sketch of this use appears below.
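As an illustrative sketch only (the names positive_scale and eps and the sample values are assumptions, not drawn from any particular library), the following Python code shows softplus mapping unconstrained network outputs to strictly positive scale parameters:

import numpy as np

def softplus(x):
    # Numerically stable ln(1 + e^x), as in the sketch above.
    return np.maximum(x, 0) + np.log1p(np.exp(-np.abs(x)))

def positive_scale(raw, eps=1e-6):
    # Map an unconstrained real value (e.g. a network output) to a
    # strictly positive scale parameter; eps keeps it away from zero.
    return softplus(raw) + eps

raw_outputs = np.array([-3.0, 0.0, 2.5])  # hypothetical unconstrained network outputs
print(positive_scale(raw_outputs))        # all values strictly positive

Adding a small eps is a common precaution so that the resulting parameter cannot underflow to exactly zero for very negative inputs.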
The softplus function was introduced by Dugas et al. (2001), who used it as a smooth, positivity-preserving transformation when incorporating prior functional knowledge into neural networks for option pricing.