Subdifferentials

The subdifferential of a function at a point is the set of its subgradients there; it generalizes the derivative to nonsmooth and nonconvex settings. In convex analysis, for a proper convex function f: R^n → R ∪ {+∞}, the subdifferential at x ∈ dom f, denoted ∂f(x), is the set of vectors g such that f(y) ≥ f(x) + g^T(y − x) for all y ∈ R^n. Each such g defines a supporting hyperplane to the epigraph of f at (x, f(x)). If f is differentiable at x, then ∂f(x) = {∇f(x)}; in general, ∂f(x) may contain multiple elements, and it is nonempty (in fact convex and compact) at every x in the interior of dom f.

Example: f(x) = |x| on R. Then ∂f(0) = [−1, 1], while ∂f(x) = {sign(x)} for x ≠ 0.
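
As a quick numerical sanity check of this definition (a minimal sketch in Python; the helper is_subgradient and the chosen test points are illustrative, not part of the text), the snippet below tests the subgradient inequality for f(x) = |x| at x = 0 over a grid of points y: slopes g in [−1, 1] pass, slopes outside fail.

```python
import numpy as np

def is_subgradient(f, g, x, ys, tol=1e-12):
    # Check the subgradient inequality f(y) >= f(x) + g*(y - x) on the sample points ys.
    return all(f(y) >= f(x) + g * (y - x) - tol for y in ys)

f = abs                            # f(x) = |x|
ys = np.linspace(-5.0, 5.0, 1001)  # test points y

for g in (-1.5, -1.0, -0.3, 0.0, 0.7, 1.0, 1.5):
    print(f"g = {g:+.1f}  subgradient of |x| at 0? {is_subgradient(f, g, 0.0, ys)}")
# Expected: True exactly for g in [-1, 1], matching ∂f(0) = [-1, 1];
# for example g = +1.5 fails at y = 1 because |1| < 0 + 1.5*(1 - 0).
```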

For nonconvex or merely nondifferentiable functions, several generalized subdifferentials extend the notion. The Clarke subdifferential ∂^C f(x) is defined for locally Lipschitz f and equals the convex hull of all limit points of ∇f(x_k) over sequences x_k → x at which f is differentiable; such sequences exist because a locally Lipschitz function is differentiable almost everywhere by Rademacher's theorem. The limiting (Mordukhovich) subdifferential ∂^− f(x) is obtained by a related limiting process and is useful in variational analysis. Proximal subdifferentials arise from quadratic penalties and play a key role in proximal methods.
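
The gradient-limit description of the Clarke subdifferential suggests a crude numerical picture. The sketch below (an illustrative helper of my own, not a rigorous algorithm; in one dimension the convex hull of sampled slopes is just an interval) approximates the Clarke subdifferential of a locally Lipschitz f at a point by estimating derivatives at random nearby points and taking the interval they span.

```python
import numpy as np

def clarke_interval_1d(f, x, radius=1e-4, n_samples=200, h=1e-7, seed=0):
    """Rough 1-D approximation of the Clarke subdifferential of a locally Lipschitz
    f at x: estimate f' by central differences at random points near x and return
    the interval spanned by those slopes (their convex hull in one dimension)."""
    rng = np.random.default_rng(seed)
    points = x + radius * (2.0 * rng.random(n_samples) - 1.0)  # points in (x - radius, x + radius)
    slopes = np.array([(f(p + h) - f(p - h)) / (2.0 * h) for p in points])
    return slopes.min(), slopes.max()

print(clarke_interval_1d(abs, 0.0))                # ~(-1.0, 1.0): matches ∂^C|·|(0) = [-1, 1]
print(clarke_interval_1d(lambda t: -abs(t), 0.0))  # ~(-1.0, 1.0): the nonconvex -|x| has the same Clarke set at 0
print(clarke_interval_1d(abs, 2.0))                # ~(1.0, 1.0): f is smooth there, so the set is {1}
```

In higher dimensions one would sample full gradients and take a genuine convex hull, which is essentially the idea behind gradient-sampling methods for nonsmooth optimization.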

Applications include optimality conditions and algorithms. For convex f, x* minimizes f if and only if 0 ∈ ∂f(x*). For locally Lipschitz f, a necessary condition for a local minimum is 0 ∈ ∂^C f(x*). Subdifferentials underpin many methods in convex and nonsmooth optimization, such as subgradient and proximal algorithms.
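
To make the algorithmic remark concrete, here is a minimal sketch of the subgradient method (the example objective, step-size rule, and iteration count are my own choices): starting from an arbitrary point, step along the negative of any subgradient with a diminishing step size, and track the best iterate seen, since the method is not a descent method.

```python
import numpy as np

def f(x):
    # Nonsmooth convex objective with unique minimizer x* = (1, -2), where 0 ∈ ∂f(x*).
    return abs(x[0] - 1.0) + abs(x[1] + 2.0)

def a_subgradient(x):
    # One valid subgradient of f at x; np.sign returns 0 at a kink,
    # which is an admissible element of the subdifferential there.
    return np.array([np.sign(x[0] - 1.0), np.sign(x[1] + 2.0)])

x = np.zeros(2)                      # starting point
best_x, best_f = x.copy(), f(x)
for k in range(1, 2001):
    x = x - (1.0 / np.sqrt(k)) * a_subgradient(x)   # diminishing, nonsummable step sizes
    if f(x) < best_f:
        best_x, best_f = x.copy(), f(x)

print(best_x, best_f)                # best_x is close to (1, -2) and best_f is close to 0
```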