PDHG

PDHG, or the primal-dual hybrid gradient method, is a first-order method for convex optimization and saddle-point problems. It is also known as the Chambolle-Pock algorithm in certain formulations. It is designed to solve problems of the form min_x f(x) + g(Kx), where f and g are proper, convex, lower semicontinuous functions and K is a linear operator. The algorithm alternates proximal steps on the primal and dual variables, using f and the convex conjugate g* respectively, often combined with an extrapolation (over-relaxation) step.

A common iteration is: initialize x^0 and y^0, set x̄^0 = x^0, and choose step sizes τ > 0 and σ > 0 with τσ ||K||^2 < 1. Then for k = 0, 1, ... compute

y^{k+1} = prox_{σ g*}(y^k + σ K x̄^k),
x^{k+1} = prox_{τ f}(x^k − τ K^T y^{k+1}),
x̄^{k+1} = x^{k+1} + θ (x^{k+1} − x^k),

with extrapolation parameter θ ∈ [0, 1]; θ = 0 recovers the original PDHG (Arrow-Hurwicz-type) iteration, while θ = 1 gives the Chambolle-Pock method. Here prox_h(v) = argmin_z { h(z) + (1/2) ||z − v||^2 } denotes the proximal operator of h. Convergence is guaranteed under standard convexity assumptions; with θ = 1 the method achieves O(1/k) ergodic convergence for convex objectives and often performs well in practice on non-smooth problems.

PDHG is related to other splitting methods: under certain reformulations it can be interpreted as a preconditioned form of ADMM, and with θ = 1 it coincides with the Chambolle-Pock algorithm. Variants include adaptive step sizes, preconditioning, and stochastic or block updates.

Applications include image processing (total variation denoising and deconvolution), compressed sensing, sparse recovery, and machine learning problems that involve a linear operator K with nonsmooth penalties.
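As a concrete illustration, the iteration above can be sketched in NumPy for 1-D total variation denoising, min_x (1/2)||x − b||^2 + λ||Dx||_1, where K = D is the forward-difference operator. This is only a minimal sketch under stated assumptions: the function name pdhg_tv1d and the bound ||D||^2 ≤ 4 used to pick the step sizes are choices made here, not part of any library API. The dual prox is a projection onto the λ-ball in the max norm (since g* is its indicator), and the primal prox of the quadratic has a closed form.

```python
import numpy as np

def pdhg_tv1d(b, lam=1.0, n_iter=500, theta=1.0):
    """PDHG / Chambolle-Pock sketch for 1-D TV denoising:
       min_x 0.5*||x - b||^2 + lam*||D x||_1, with D the forward-difference operator.
    """
    # ||D||^2 <= 4 for 1-D forward differences, so tau*sigma*||D||^2 < 1 holds:
    tau = sigma = 0.99 / 2.0

    x = b.astype(float).copy()
    x_bar = x.copy()
    y = np.zeros(b.size - 1)        # dual variable, one entry per difference

    for _ in range(n_iter):
        # Dual step: prox of sigma*g* is projection onto { ||y||_inf <= lam },
        # i.e. elementwise clipping.  D x_bar = np.diff(x_bar).
        y = np.clip(y + sigma * np.diff(x_bar), -lam, lam)

        # Primal step: D^T y in dense form is -diff of y padded with zeros.
        DTy = -np.diff(y, prepend=0.0, append=0.0)
        # prox of tau*f for f = 0.5*||x - b||^2 is v -> (v + tau*b) / (1 + tau).
        x_new = (x - tau * DTy + tau * b) / (1.0 + tau)

        # Extrapolation: theta = 1 is the Chambolle-Pock choice, theta = 0 plain PDHG.
        x_bar = x_new + theta * (x_new - x)
        x = x_new

    return x
```

Setting lam = 0 makes the dual variable vanish and the iteration fixes x = b, a quick sanity check; with lam > 0 the output is a piecewise-constant approximation of b.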