Activation function
An activation function is a nonlinear function applied in an artificial neural network to a neuron's pre-activation, the weighted sum of its inputs plus a bias, to produce the neuron's output. Its primary role is to introduce nonlinearity into the model, enabling it to approximate complex relationships rather than only linear mappings. Without activation functions, a multilayer network would collapse into a single linear transformation regardless of the number of layers, since a composition of linear maps is itself linear.
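As a minimal sketch of this collapse (using NumPy, with illustrative layer shapes and random weights), composing two linear layers with no activation in between is exactly equivalent to one linear layer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation in between (illustrative shapes).
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal(4)
W2, b2 = rng.standard_normal((2, 4)), rng.standard_normal(2)

x = rng.standard_normal(3)

# Forward pass through both linear layers.
h = W1 @ x + b1
y = W2 @ h + b2

# The same mapping as a single linear layer: W = W2 W1, b = W2 b1 + b2.
W, b = W2 @ W1, W2 @ b1 + b2
assert np.allclose(y, W @ x + b)
```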
Common activation functions include the sigmoid (logistic) function, the hyperbolic tangent (tanh), and the rectified linear unit (ReLU) together with its variants, such as leaky ReLU.
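A short sketch of these functions in NumPy (the helper names are mine, not taken from any particular library):

```python
import numpy as np

def sigmoid(x):
    # Logistic function: squashes input to (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes input to (-1, 1).
    return np.tanh(x)

def relu(x):
    # Rectified linear unit: zero for negative input, identity otherwise.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # A common ReLU variant that keeps a small slope for negative input.
    return np.where(x >= 0, x, alpha * x)
```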
In practice, activation choices depend on the network architecture and task. Hidden layers commonly use ReLU or one of its variants, which are cheap to compute and help mitigate the vanishing-gradient problem, while the output layer is chosen to match the task: a sigmoid for binary classification, a softmax for multi-class classification, and a linear (identity) output for regression.
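As a minimal illustration of this pairing (a hypothetical two-layer network in NumPy, not a recipe from any specific framework), a ReLU hidden layer can be combined with a sigmoid output for binary classification:

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    # Hidden layer: affine transform followed by ReLU.
    h = np.maximum(0.0, W1 @ x + b1)
    # Output layer: sigmoid, producing a probability for binary classification.
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))
```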