subgradienter
Subgradienter is a term used in optimization to denote an algorithm or practitioner that uses subgradients to minimize a function, especially when the function is convex and possibly nondifferentiable. The core idea is to replace the gradient with a subgradient, which provides a global linear lower bound on the function.
In convex analysis, a vector g is a subgradient of a function f at x if f(y) ≥ f(x) + gᵀ(y − x) for all y; the set of all subgradients of f at x is called the subdifferential, written ∂f(x).
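As a minimal numeric illustration (not from the source), the inequality can be checked for the classic nondifferentiable convex function f(x) = |x| at x = 0, where the subdifferential is the whole interval [−1, 1]:

```python
# Check the subgradient inequality f(y) >= f(x) + g*(y - x) for
# f(x) = |x| at x = 0, where any g in [-1, 1] is a subgradient.
f = abs
x, g = 0.0, 0.5  # g = 0.5 is one valid choice from the subdifferential [-1, 1]
for y in (-2.0, -0.3, 0.0, 0.7, 5.0):
    # The affine function f(x) + g*(y - x) lies below f everywhere.
    assert f(y) >= f(x) + g * (y - x)
```

Note that at x = 0 the subgradient is not unique; any g in [−1, 1] would pass the same check.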
A typical subgradienter implements an iterative update such as x_{k+1} = x_k − α_k g_k, where g_k ∈ ∂f(x_k) is any subgradient at the current iterate and α_k > 0 is a step size. Unlike gradient descent, this update need not decrease f at every step, so the best objective value seen so far is usually tracked.
Convergence properties rely on convexity and step-size rules. For convex f with diminishing steps satisfying ∑ α_k = ∞ and ∑ α_k² < ∞, the best objective value f(x_k) encountered converges to the minimum value of f.
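The update rule and a diminishing step size can be sketched as follows; this is an illustrative toy example (the function, step sizes, and iteration count are assumptions, not from the source), minimizing the nondifferentiable f(x) = |x − 3| with α_k = 1/(k+1):

```python
# Subgradient method sketch: minimize f(x) = |x - 3| with the
# diminishing step size alpha_k = 1/(k+1), which satisfies
# sum alpha_k = infinity and sum alpha_k^2 < infinity.
def subgrad(x):
    # A subgradient of |x - 3|: the sign of (x - 3), with 0 at the kink.
    return (x > 3) - (x < 3)

x = 10.0
best = float("inf")  # track the best value seen, since steps need not descend
for k in range(5000):
    g = subgrad(x)
    x = x - (1.0 / (k + 1)) * g
    best = min(best, abs(x - 3))
```

After enough iterations, `best` is close to the minimum value 0: the iterates first march toward the kink at x = 3 and then oscillate around it with ever-smaller steps.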
Applications of subgradient-based approaches include large-scale machine learning problems, regularized optimization (such as L1 penalties), and Lagrangian relaxation, where the dual function is concave and piecewise linear.
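As a hypothetical sketch of the L1-penalty case (the data, penalty weight, and step-size scaling below are illustrative assumptions, not from the source), subgradient descent can be applied to a lasso-type objective f(w) = 0.5·‖Xw − y‖² + λ‖w‖₁, which is convex but nondifferentiable wherever a coordinate of w is zero:

```python
import numpy as np

# Subgradient descent on an L1-regularized least-squares objective
#     f(w) = 0.5 * ||X w - y||^2 + lam * ||w||_1.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))        # synthetic design matrix (assumption)
w_true = np.array([2.0, 0.0, -1.0, 0.0, 0.5])
y = X @ w_true                           # noiseless targets for illustration
lam = 0.1

def objective(w):
    return 0.5 * np.sum((X @ w - y) ** 2) + lam * np.sum(np.abs(w))

L = np.linalg.norm(X, ord=2) ** 2        # spectral-norm-based step scaling
w = np.zeros(5)
for k in range(2000):
    # np.sign(0) = 0 is a valid choice from the subdifferential of |.|
    g = X.T @ (X @ w - y) + lam * np.sign(w)
    w -= (1.0 / (L * (k + 1))) * g       # diminishing steps, as above
```

The least-squares term contributes its ordinary gradient, while the L1 term contributes a sign vector; the sum is a subgradient of the whole objective, and the loop drives the objective well below its starting value.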