forwardback

Forwardback refers to the forward-backward algorithm, a dynamic programming method used to compute the posterior probabilities of hidden states in sequential probabilistic models such as hidden Markov models (HMMs) and related dynamic Bayesian networks. The method relies on two passes over an observation sequence: a forward pass that aggregates information from the start up to each time step, and a backward pass that propagates information from the end back to each time step. By combining the results of these passes, the algorithm yields the smoothed probability of each state at each time given the entire sequence of observations.
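
As a concrete illustration, the sketch below implements the two passes for a small discrete HMM and combines them into smoothed posteriors. It is a minimal sketch assuming NumPy arrays for the initial distribution, transition matrix, and emission matrix; the function and variable names are illustrative rather than a reference implementation.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Smoothed state posteriors for a discrete HMM.

    pi  : (N,)   initial state distribution
    A   : (N, N) transition probabilities, A[i, j] = P(state j | state i)
    B   : (N, M) emission probabilities,   B[i, k] = P(symbol k | state i)
    obs : (T,)   observed symbol indices (NumPy array)
    """
    T, N = len(obs), len(pi)

    # Forward pass: alpha[t, i] = P(obs[0..t], state_t = i)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward pass: beta[t, i] = P(obs[t+1..T-1] | state_t = i)
    beta = np.zeros((T, N))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    # Combine and normalize: gamma[t, i] = P(state_t = i | all observations)
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return alpha, beta, gamma
```

Normalizing the elementwise product of the two passes at each time step gives the smoothed posterior, since the unnormalized product is the joint probability of that state and the entire observation sequence.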

In practice, the forward-backward algorithm computes, for each time step and state, the probability of being in that state given all observations. These smoothed probabilities are used both for inference and for learning model parameters. The learning use is the Baum-Welch algorithm, which treats the forward-backward results as expected sufficient statistics to re-estimate transition and emission probabilities within an expectation-maximization framework.
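
A hedged sketch of that re-estimation step is shown below. It builds on the illustrative forward_backward function above; the pairwise posterior xi and the update rules follow the standard EM derivation for discrete HMMs, and all names are assumptions rather than an established API.

```python
import numpy as np

def baum_welch_step(pi, A, B, obs):
    """One illustrative EM re-estimation step from forward-backward quantities.

    obs must be a NumPy array of observed symbol indices.
    """
    alpha, beta, gamma = forward_backward(pi, A, B, obs)  # sketch defined above
    T, N = len(obs), len(pi)

    # Pairwise posteriors: xi[t, i, j] = P(state_t = i, state_{t+1} = j | all observations)
    xi = np.zeros((T - 1, N, N))
    for t in range(T - 1):
        num = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :]
        xi[t] = num / num.sum()

    # M-step: the expected counts act as sufficient statistics for the new parameters
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.stack(
        [gamma[obs == k].sum(axis=0) for k in range(B.shape[1])], axis=1
    ) / gamma.sum(axis=0)[:, None]
    return new_pi, new_A, new_B
```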

Applications include speech recognition, biological sequence analysis, part-of-speech tagging, and financial time-series analysis. The algorithm is valued for exact inference in HMMs, though it can be computationally intensive for large state spaces: for a dense transition matrix, the running time grows with the sequence length times the square of the number of states. In practice, scaling or log-domain implementations are used to avoid numerical underflow.
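
A minimal sketch of the log-domain variant of the forward pass follows, assuming the same array layout as the sketches above; scipy.special.logsumexp performs the numerically stable summation, and the names are illustrative.

```python
import numpy as np
from scipy.special import logsumexp

def log_forward(pi, A, B, obs):
    """Forward pass in the log domain; avoids underflow on long sequences."""
    T, N = len(obs), len(pi)
    log_A, log_B = np.log(A), np.log(B)

    log_alpha = np.zeros((T, N))
    log_alpha[0] = np.log(pi) + log_B[:, obs[0]]
    for t in range(1, T):
        # logsumexp over the previous state replaces the ordinary sum
        log_alpha[t] = logsumexp(log_alpha[t - 1][:, None] + log_A, axis=0) + log_B[:, obs[t]]

    # Log-likelihood of the full observation sequence
    return log_alpha, logsumexp(log_alpha[-1])
```

The backward pass can be stabilized in the same way, and the smoothed posteriors are then recovered by normalizing log_alpha + log_beta with logsumexp at each time step.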

Originating in the study of HMMs in the 1960s, the forward-backward approach remains a foundational technique in probabilistic sequence modeling. The term forwardback is sometimes used informally or in software naming as a shorthand for this approach, though most literature uses “forward-backward algorithm.”

See also: forward algorithm, backward algorithm, Baum-Welch, hidden Markov model.
