Subgradin
Subgradin is a term sometimes used as a variant spelling of subgradient, a concept in convex analysis that generalizes the gradient to non-differentiable functions. It plays a central role in convex optimization because each subgradient defines an affine underestimator of the function that is exact at the given point.
Formally, let f: R^n → R be convex. A vector g ∈ R^n is a subgradient of f at x if f(y) ≥ f(x) + g^T (y − x) for all y ∈ R^n. The set of all subgradients of f at x is called the subdifferential, written ∂f(x); it is a closed convex set, and when f is differentiable at x it reduces to the single element ∇f(x).
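The defining inequality can be checked numerically. The following minimal sketch (the particular function, test point, and grid are illustrative assumptions, not part of the original text) verifies it for the convex piecewise-linear function f(x) = max(x, 2x − 1), whose subdifferential at the kink x = 1 is the interval [1, 2]:

```python
# Minimal sketch: numerically check the subgradient inequality
# f(y) >= f(x) + g * (y - x) for f(x) = max(x, 2x - 1).
# At x = 1 both affine pieces are active, so any slope g in [1, 2]
# is a subgradient there.
import numpy as np

def f(x):
    return np.maximum(x, 2 * x - 1)

x = 1.0
ys = np.linspace(-5.0, 5.0, 1001)
for g in (1.0, 1.5, 2.0):  # candidate subgradients at x = 1
    assert np.all(f(ys) >= f(x) + g * (ys - x) - 1e-12), g
print("subgradient inequality holds for g = 1.0, 1.5, 2.0 at x = 1")
```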
Subgradient methods are iterative optimization procedures that use elements of the subdifferential in place of gradients. Starting from an initial point x0, one repeats the update x_{k+1} = x_k − α_k g_k, where g_k ∈ ∂f(x_k) and α_k > 0 is a step size. The method is not a descent method in general, so the best objective value found so far is tracked; with diminishing step sizes (for example α_k proportional to 1/√k), this best value converges to the minimum for convex f.
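A minimal sketch of such an iteration is shown below for the convex, nondifferentiable objective f(x) = ||Ax − b||_1, for which A^T sign(Ax − b) is a valid subgradient. The problem data, step-size constant, and iteration count are illustrative assumptions:

```python
# Minimal sketch of the subgradient method on f(x) = ||A x - b||_1.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 5))
b = rng.normal(size=30)

x = np.zeros(5)
best_x, best_val = x.copy(), np.inf
for k in range(1, 2001):
    r = A @ x - b
    g = A.T @ np.sign(r)            # an element of the subdifferential at x
    x = x - (0.1 / np.sqrt(k)) * g  # diminishing step size alpha_k = 0.1 / sqrt(k)
    val = np.abs(A @ x - b).sum()
    if val < best_val:              # keep the best iterate seen so far
        best_val, best_x = val, x.copy()

print("best objective value:", best_val)
```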
Examples help illustrate the concept. For f(x) = |x| on the real line, ∂f(x) = {sign(x)} for x ≠ 0, while at the kink ∂f(0) = [−1, 1], the set of all slopes of lines through the origin that stay below |x|.
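The interval claim at the kink can be verified directly; the grid of test points below is an illustrative assumption:

```python
# Minimal sketch: any g in [-1, 1] satisfies |y| >= |0| + g * (y - 0) = g * y
# for all y, so the whole interval [-1, 1] lies in the subdifferential of |x| at 0.
import numpy as np

ys = np.linspace(-3.0, 3.0, 601)
for g in np.linspace(-1.0, 1.0, 9):
    assert np.all(np.abs(ys) >= g * ys)
print("every tested g in [-1, 1] is a subgradient of |x| at x = 0")
```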
Extensions include Clarke's subdifferential for certain nonconvex, locally Lipschitz functions, which broadens the idea of a subgradient beyond convexity; for convex functions it coincides with the ordinary subdifferential, and for continuously differentiable functions it reduces to the gradient.