
ForwardBackward

Forward-backward is a term used in different areas of applied mathematics and computer science, most often referring to two related algorithmic ideas that involve a forward (explicit) step followed by a backward (implicit/proximal) step.

In probabilistic sequence modeling, the forward-backward algorithm computes posterior marginals in hidden Markov models given an observation sequence. It comprises two passes: the forward pass, which computes α_t(i) = P(y_1..y_t, x_t = i), and the backward pass, which computes β_t(i) = P(y_{t+1}..y_T | x_t = i). The state marginal P(x_t = i | y_1..y_T) is proportional to α_t(i)β_t(i). The algorithm is used for smoothing and, within EM, for re-estimating model parameters. Its time complexity is linear in the sequence length: O(N²T) per pass for a general transition matrix with N states and T observations. Applications include speech recognition, bioinformatics, and natural language processing.

In convex optimization, forward-backward splitting, also known as the proximal gradient method, solves min_x f(x) + g(x), where f has a Lipschitz-continuous gradient and g is convex (possibly nonsmooth). The iteration is x^{k+1} = prox_{γg}(x^k − γ∇f(x^k)), with step size γ in (0, 2/L), where L is the Lipschitz constant of ∇f. This framework generalizes gradient methods to handle nonsmooth regularizers such as the L1 norm or total variation. It is widely used in imaging, compressed sensing, and machine learning, and has many variants, including accelerated (FISTA) and stochastic forms. The term forward-backward reflects the explicit gradient step (forward) and the implicit proximal step (backward).
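The two-pass HMM recursion described above can be sketched in plain Python. This is a minimal illustration, not a production implementation; the transition matrix A, emission matrix B, initial distribution pi, and observation sequence are made-up toy values:

```python
# Forward-backward on a toy 2-state HMM.
# A, B, pi, and obs are made-up illustration values.
A  = [[0.7, 0.3], [0.4, 0.6]]   # transitions P(x_{t+1}=j | x_t=i)
B  = [[0.9, 0.1], [0.2, 0.8]]   # emissions   P(y_t=k | x_t=i)
pi = [0.5, 0.5]                 # initial state distribution
obs = [0, 1, 1, 0]              # observed symbol indices y_1..y_T
T, N = len(obs), len(pi)

# Forward pass: alpha[t][i] = P(y_1..y_t, x_t = i)
alpha = [[pi[i] * B[i][obs[0]] for i in range(N)]]
for t in range(1, T):
    alpha.append([B[i][obs[t]] * sum(alpha[t-1][j] * A[j][i] for j in range(N))
                  for i in range(N)])

# Backward pass: beta[t][i] = P(y_{t+1}..y_T | x_t = i)
beta = [[1.0] * N for _ in range(T)]
for t in range(T - 2, -1, -1):
    beta[t] = [sum(A[i][j] * B[j][obs[t+1]] * beta[t+1][j] for j in range(N))
               for i in range(N)]

# Posterior marginals: gamma[t][i] = P(x_t = i | y) ∝ alpha[t][i] * beta[t][i]
gamma = []
for t in range(T):
    z = sum(alpha[t][i] * beta[t][i] for i in range(N))
    gamma.append([alpha[t][i] * beta[t][i] / z for i in range(N)])
```

Note that sum_i α_t(i)β_t(i) equals the sequence likelihood P(y_1..y_T) at every t, which is a useful correctness check; real implementations also rescale α and β at each step to avoid floating-point underflow on long sequences.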
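The proximal gradient iteration above can likewise be sketched for a small lasso problem, where f(x) = ½‖Ax − b‖² and g is λ‖x‖₁, so prox_{γg} is soft-thresholding. The matrix A, vector b, regularization weight lam, and iteration count below are illustrative assumptions:

```python
# Forward-backward (proximal gradient) sketch for the lasso:
#   minimize 0.5*||A x - b||^2 + lam*||x||_1
# A, b, and lam are made-up illustration values.
A   = [[1.0, 0.0], [0.0, 2.0]]
b   = [1.0, -2.0]
lam = 0.5

def grad_f(x):
    # Gradient of the smooth part: A^T (A x - b)
    r = [sum(A[i][j] * x[j] for j in range(2)) - b[i] for i in range(2)]
    return [sum(A[i][j] * r[i] for i in range(2)) for j in range(2)]

def prox_l1(v, t):
    # Prox of t*||.||_1: componentwise soft-thresholding
    return [max(abs(vi) - t, 0.0) * (1.0 if vi > 0 else -1.0) for vi in v]

L = 4.0            # Lipschitz constant of grad f = largest eigenvalue of A^T A
gamma = 1.0 / L    # step size, inside the allowed interval (0, 2/L)

x = [0.0, 0.0]
for _ in range(200):
    y = [xi - gamma * gi for xi, gi in zip(x, grad_f(x))]   # forward (gradient) step
    x = prox_l1(y, gamma * lam)                             # backward (proximal) step
```

Because A here is diagonal, the problem separates per coordinate and the minimizer can be checked by hand (x ≈ [0.5, −0.875]); swapping in a different prox, e.g. projection onto a convex set, recovers projected gradient descent as a special case.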