LeakyReLU
LeakyReLU is a nonlinear activation function used in artificial neural networks, introduced as a variant of the rectified linear unit (ReLU). It is defined as f(x) = x if x > 0, and f(x) = alpha * x if x <= 0, where alpha is a small positive constant, commonly around 0.01.
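A minimal sketch of this definition in NumPy (the function name and the default alpha value here are illustrative, not from any particular library):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Element-wise LeakyReLU: x where x > 0, alpha * x otherwise."""
    return np.where(x > 0, x, alpha * x)

# Negative inputs are scaled by alpha instead of being zeroed out.
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))  # approximately [-0.02, -0.005, 0.0, 1.5]
```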
The function aims to address the dying ReLU problem, where neurons can become inactive if they consistently receive negative inputs: a standard ReLU then outputs zero and passes back a zero gradient, so the neuron's weights stop updating. Because LeakyReLU keeps a small nonzero slope for negative inputs, some gradient still flows and the neuron can recover.
Typical values for alpha range from 0.01 to 0.3, with 0.01 being a common default. LeakyReLU is implemented in most deep learning frameworks and is often used as a drop-in replacement for ReLU.
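For example, in PyTorch the slope for negative inputs is set through the negative_slope argument, which corresponds to alpha (a usage sketch, assuming a recent PyTorch version):

```python
import torch
import torch.nn as nn

# negative_slope corresponds to alpha; 0.01 is the framework's default.
act = nn.LeakyReLU(negative_slope=0.01)

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
print(act(x))  # tensor([-0.0200, -0.0050,  0.0000,  1.5000])
```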
Variants and related concepts include Parametric ReLU (PReLU), where alpha is a learnable parameter, allowing the network to learn the negative slope during training rather than fixing it by hand, and Randomized Leaky ReLU (RReLU), where alpha is sampled randomly during training.
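A sketch of the PReLU idea, with alpha stored as a trainable parameter (the class name and initial value are illustrative; 0.25 matches the common PReLU default):

```python
import torch
import torch.nn as nn

class SimplePReLU(nn.Module):
    """Illustrative PReLU: alpha is a learnable parameter updated by backpropagation."""
    def __init__(self, init_alpha=0.25):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(init_alpha))

    def forward(self, x):
        return torch.where(x > 0, x, self.alpha * x)
```

PyTorch also ships a built-in version, nn.PReLU, which can learn either a single shared alpha or one alpha per channel.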