ELU
ELUs, short for Exponential Linear Units, are a family of activation functions used in artificial neural networks. An ELU with parameter alpha > 0 is defined as f(x) = x for x > 0 and f(x) = alpha*(exp(x) - 1) for x <= 0. The function is continuous for all x and is differentiable everywhere when alpha = 1; for other values of alpha the left derivative at x = 0 is alpha while the right derivative is 1, so the derivative has a jump there. Common practice uses alpha = 1.0, though alpha can also be treated as a fixed hyperparameter or learned during training.
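A minimal NumPy sketch of this definition follows; the function names and the overflow-avoiding clamp are illustrative choices, not from any particular library:

```python
import numpy as np

def elu(x, alpha=1.0):
    # f(x) = x for x > 0, alpha*(exp(x) - 1) for x <= 0.
    # Clamping the argument of expm1 at 0 avoids overflow warnings for large
    # positive x, since np.where evaluates both branches.
    return np.where(x > 0, x, alpha * np.expm1(np.minimum(x, 0.0)))

def elu_derivative(x, alpha=1.0):
    # f'(x) = 1 for x > 0, alpha*exp(x) for x <= 0.
    return np.where(x > 0, 1.0, alpha * np.exp(np.minimum(x, 0.0)))

x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(elu(x))             # approx. [-0.950, -0.632, 0.0, 1.0, 3.0]
print(elu_derivative(x))  # approx. [ 0.050,  0.368, 1.0, 1.0, 1.0]
```

With alpha = 1.0 the two branches of the derivative agree at x = 0, which is why that value is the common default.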
Compared with the rectified linear unit (ReLU), ELUs produce negative outputs for negative inputs, which pushes mean unit activations closer to zero and thereby helps reduce the bias shift passed to later layers, which can speed up learning.
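A small numerical illustration of this effect, assuming zero-mean, unit-variance pre-activations (the values quoted in the comments are approximate):

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)  # zero-mean pre-activations

relu_out = np.maximum(z, 0.0)
elu_out = np.where(z > 0, z, np.expm1(np.minimum(z, 0.0)))  # ELU with alpha = 1

# ReLU discards all negative values, so its mean output is pushed up to roughly +0.40;
# ELU's negative outputs pull the mean back toward zero, roughly +0.16 here.
print(f"mean ReLU activation: {relu_out.mean():+.3f}")
print(f"mean ELU activation:  {elu_out.mean():+.3f}")
```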
Limitations include a higher computational cost than ReLU due to the exponential term, and potential sensitivity to the choice of alpha.
Beyond neural networks, the acronym ELU has other meanings in different fields. This article focuses on the activation function used in machine learning.