BCJR

BCJR, named after Bahl, Cocke, Jelinek, and Raviv, is a maximum a posteriori (MAP) decoder for convolutional codes. It computes the posterior probability of each information bit given the received sequence, enabling soft-decision decoding that minimizes the bit error probability under the assumed channel model.
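
To make the soft output concrete (the notation here is assumed for illustration, not taken from this page): writing u_i for the i-th information bit and y for the received sequence, the decoder's output is usually expressed as the a posteriori log-likelihood ratio

    L(u_i) = \log \frac{P(u_i = 1 \mid y)}{P(u_i = 0 \mid y)}

whose sign gives the hard decision and whose magnitude indicates the reliability of that decision.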

The algorithm operates on a trellis with two recursive passes. A forward recursion computes alpha(s_i) = P(received symbols up to i, state s_i); a backward recursion computes beta(s_i) = P(received symbols after i | state s_i). For each trellis transition, a branch metric gamma(b) is derived from the channel observations. The bit posterior is proportional to the sum of alpha · gamma · beta over the transitions consistent with each bit value; in practice, log-domain LLRs are used for numerical stability.
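
A minimal sketch of the two recursions and the combining step in the log domain. The trellis representation, the names bcjr_llrs and log_add, and the convention that the per-branch metrics log_gamma are supplied by the caller are assumptions made here for illustration, not part of this page.

    import numpy as np

    def log_add(a, b):
        # Jacobian logarithm: log(exp(a) + exp(b)), computed without overflow.
        m = np.maximum(a, b)
        return m + np.log1p(np.exp(-np.abs(a - b)))

    def bcjr_llrs(transitions, log_gamma, num_states):
        """Log-domain BCJR over a time-invariant trellis.

        transitions : list of (from_state, to_state, input_bit) tuples
        log_gamma   : (N, len(transitions)) array of branch metrics,
                      log P(y_i, transition) up to a common constant
        Returns the LLR log P(u_i = 1 | y) - log P(u_i = 0 | y) per step.
        """
        N = log_gamma.shape[0]
        NEG = -1e30  # stand-in for log(0)

        # Forward recursion: alpha[i, s] ~ log P(received before step i, state s at step i).
        alpha = np.full((N + 1, num_states), NEG)
        alpha[0, 0] = 0.0  # assume the encoder starts in state 0
        for i in range(N):
            for t, (s, s_next, _) in enumerate(transitions):
                alpha[i + 1, s_next] = log_add(alpha[i + 1, s_next],
                                               alpha[i, s] + log_gamma[i, t])

        # Backward recursion: beta[i, s] ~ log P(received from step i onward | state s at step i).
        beta = np.full((N + 1, num_states), NEG)
        beta[N, :] = 0.0  # assume an unterminated trellis: every end state allowed
        for i in range(N - 1, -1, -1):
            for t, (s, s_next, _) in enumerate(transitions):
                beta[i, s] = log_add(beta[i, s],
                                     log_gamma[i, t] + beta[i + 1, s_next])

        # Combine: at each step, accumulate alpha + gamma + beta separately over
        # the transitions labelled with bit 1 and with bit 0, then take the difference.
        llr = np.empty(N)
        for i in range(N):
            num, den = NEG, NEG
            for t, (s, s_next, bit) in enumerate(transitions):
                metric = alpha[i, s] + log_gamma[i, t] + beta[i + 1, s_next]
                if bit == 1:
                    num = log_add(num, metric)
                else:
                    den = log_add(den, metric)
            llr[i] = num - den
        return llr

Working in the log domain turns the products alpha · gamma · beta into sums; the only remaining non-trivial operation is log_add, which is exactly the step that Max-Log-MAP approximates.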

BCJR is a soft-input soft-output (SISO) decoder and a central component in iterative decoding, notably in turbo codes and turbo equalization. It is the trellis-based analogue of the forward–backward algorithm for hidden Markov models. Variants, such as Max-Log-MAP, reduce complexity at the cost of accuracy.
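
The Max-Log-MAP trade-off can be stated in two lines: in the log domain the only transcendental operation is the max-star (Jacobian logarithm), and the variant simply drops its correction term. The function names below are illustrative.

    import math

    def max_star(a, b):
        # Exact Log-MAP combining: log(exp(a) + exp(b))
        return max(a, b) + math.log1p(math.exp(-abs(a - b)))

    def max_star_approx(a, b):
        # Max-Log-MAP: keep only the max; cheaper, with a small loss in accuracy
        return max(a, b)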

Complexity grows with the trellis size (constraint length) and the data length, and the algorithm relies on accurate channel statistics, such as the noise variance. Despite its higher cost compared with hard-decision decoders, BCJR remains a standard for near-optimal decoding of convolutional codes and underpins many modern coding schemes.
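
To make the dependence on channel statistics concrete, here is one way a branch metric could be formed for BPSK over an AWGN channel; the model, the function name, and the noise_var parameter are assumptions for illustration. A misestimated noise variance miscalibrates the resulting LLRs, which matters especially in iterative decoding. Values like these would fill the log_gamma array passed to the sketch above.

    import numpy as np

    def bpsk_awgn_log_gamma(y_i, x_branch, noise_var, log_prior=0.0):
        # Branch metric for one trellis transition, up to a constant:
        #   log P(y_i | x_branch) + log P(u_i)
        # with BPSK symbols x in {-1, +1} sent over AWGN of variance noise_var.
        y_i = np.asarray(y_i, dtype=float)
        x_branch = np.asarray(x_branch, dtype=float)
        return -np.sum((y_i - x_branch) ** 2) / (2.0 * noise_var) + log_prior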
