Prox

Prox, short for proximal operator, is a concept in convex analysis and optimization. It is widely used in iterative methods to minimize functions that are convex but may be non-smooth.

For a proper, closed convex function f: R^n -> R ∪ {+∞}, the proximal operator is defined by prox_f(x) = argmin_y { f(y) + (1/2) ||y - x||^2 }. The scaled version prox_{λf}(x) uses (1/(2λ)) in place of (1/2). The operator is firmly nonexpansive (hence 1-Lipschitz), and the Moreau decomposition relates prox_f to the convex conjugate f* via x = prox_f(x) + prox_{f*}(x).

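As an illustration of the definition (a minimal sketch, not from the original page: it assumes f(y) = λ|y| in one dimension, with a brute-force grid search standing in for the inner argmin), the snippet below recovers soft-thresholding from the definition and checks the Moreau decomposition numerically.

```python
import math

def prox_l1_closed_form(x, lam):
    # Soft-thresholding: the closed-form prox of lam*|.| in one dimension.
    return math.copysign(max(abs(x) - lam, 0.0), x)

def prox_l1_brute_force(x, lam, lo=-10.0, hi=10.0, steps=200001):
    # Directly minimize lam*|y| + (1/2)*(y - x)^2 over a fine grid,
    # following the definition prox_f(x) = argmin_y { f(y) + (1/2)(y - x)^2 }.
    best_y, best_val = None, float("inf")
    for i in range(steps):
        y = lo + (hi - lo) * i / (steps - 1)
        val = lam * abs(y) + 0.5 * (y - x) ** 2
        if val < best_val:
            best_y, best_val = y, val
    return best_y

def prox_conjugate_l1(x, lam):
    # The convex conjugate of lam*|.| is the indicator of [-lam, lam],
    # whose prox is the projection (clipping) onto that interval.
    return min(max(x, -lam), lam)

if __name__ == "__main__":
    x, lam = 1.7, 0.5
    p = prox_l1_closed_form(x, lam)
    print("closed form :", p)
    print("brute force :", prox_l1_brute_force(x, lam))
    # Moreau decomposition: x = prox_f(x) + prox_{f*}(x)
    print("Moreau check:", p + prox_conjugate_l1(x, lam), "==", x)
```
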
Proximal operators enable splitting algorithms such as the proximal gradient method and ADMM, which minimize composite objective functions g(x) + h(x); the proximal gradient method, for instance, alternates gradient steps on the smooth part g with proximal updates on the non-smooth part h.

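To make the splitting concrete, here is a short proximal gradient (ISTA-style) sketch for the lasso objective (1/2)||Ax - b||^2 + λ||x||_1. The problem sizes, step-size choice, and iteration count are illustrative assumptions, not part of the original page.

```python
import numpy as np

def soft_threshold(v, t):
    # prox of t*||.||_1, applied elementwise.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, n_iter=500):
    # Minimize g(x) + h(x) with g(x) = 0.5*||Ax - b||^2 (smooth)
    # and h(x) = lam*||x||_1 (non-smooth): gradient step on g,
    # then a proximal (soft-thresholding) step on h.
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of grad g
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true
    x_hat = proximal_gradient_lasso(A, b, lam=0.1)
    print("entries above 1e-3:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```
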
Common closed-form proximals include prox_{λ||·||_1}(v) = soft-thresholding, prox_{δ_C}(v) = the projection onto a convex set C (where δ_C is the indicator function of C), and prox_{λ||·||_*}(V) = singular value thresholding for the nuclear norm. In general the proximal step requires an auxiliary optimization, but many standard regularizers admit efficient closed forms.

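The three closed forms above can be written down directly; the following sketch uses the indicator of a box as the example convex set C, and the bounds, test vector, and matrix are made-up inputs for illustration only.

```python
import numpy as np

def prox_l1(v, lam):
    # prox_{lam*||.||_1}(v): elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_box_indicator(v, lo, hi):
    # prox of the indicator of the box C = {x : lo <= x <= hi},
    # i.e. the Euclidean projection onto C.
    return np.clip(v, lo, hi)

def prox_nuclear(V, lam):
    # prox_{lam*||.||_*}(V): singular value thresholding.
    U, s, Vt = np.linalg.svd(V, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

if __name__ == "__main__":
    v = np.array([2.0, -0.3, 0.8])
    print(prox_l1(v, 0.5))                    # [1.5  0.   0.3]
    print(prox_box_indicator(v, -1.0, 1.0))   # [1.  -0.3  0.8]
    M = np.outer([1.0, 2.0], [1.0, 0.0, -1.0])
    print(np.linalg.matrix_rank(prox_nuclear(M, 0.5)))  # rank stays 1 here
```
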
Applications of proximal operators span sparse recovery, compressed sensing, image denoising, and regularized learning. In practice, the proximal operator is a fundamental building block in modern convex optimization and variational problems, enabling efficient handling of non-smooth terms in large-scale problems. See also proximal mapping, Moreau envelope, proximal gradient method, and ADMM.