ChangePointDetection

ChangePointDetection refers to the statistical task of identifying times when the probabilistic properties of a sequence of observations change. A change point marks a boundary between segments that are governed by different data-generating processes, such as shifts in the mean, variance, or distribution. The problem is common in time series analysis, signal processing, genomics, and finance, where timely detection is important for interpretation and action.

Approaches are generally classified as offline (retrospective) or online (real-time). Offline methods evaluate the entire sequence to locate all change points, while online methods emit detections as new data arrive. Many methods assume a parametric model for the data, but nonparametric techniques are also used when the distributional form is uncertain.

Prominent techniques include CUSUM and other sequential likelihood ratio tests for detecting shifts in the mean; penalized likelihood methods such as the Pruned Exact Linear Time (PELT) algorithm; and binary segmentation. Bayesian methods, including Bayesian Online Change Point Detection, maintain posterior beliefs about the run length (the time elapsed since the last change) and detect changes based on posterior probabilities. Hybrid and multivariate methods extend to more than one time series or to changes in variance and correlations.

Applications span finance for regime shifts, climate science for abrupt changes in weather patterns, genomics for copy number variation, network monitoring for traffic anomalies, and quality control in manufacturing. Practical challenges include choosing the number of changes, handling changes that are gradual or of varying severity, and robustness to noise and model misspecification.

Evaluation often uses simulated data with known change points or cross-validation on historical records, and performance is summarized with detection delay, false alarm rate, and the accuracy of estimated change point locations.
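The sequential flavor of these techniques can be illustrated with a minimal two-sided CUSUM for a shift in the mean. This is an illustrative sketch, not any library's implementation; the function name and the `drift` and `threshold` values are assumptions chosen for the example:

```python
def cusum(data, target_mean, drift=0.5, threshold=5.0):
    """Return indices where a two-sided CUSUM statistic crosses the threshold."""
    s_hi = 0.0  # accumulated evidence of an upward mean shift
    s_lo = 0.0  # accumulated evidence of a downward mean shift
    alarms = []
    for i, x in enumerate(data):
        # Deviations beyond the drift allowance accumulate; max(0, ...) resets
        # the statistic whenever the evidence falls back to zero.
        s_hi = max(0.0, s_hi + (x - target_mean) - drift)
        s_lo = max(0.0, s_lo - (x - target_mean) - drift)
        if s_hi > threshold or s_lo > threshold:
            alarms.append(i)
            s_hi = s_lo = 0.0  # restart accumulation after an alarm
    return alarms

# The mean shifts from about 0 to about 3 at index 20; the first alarm
# fires a few samples later, illustrating detection delay.
stream = [0.1, -0.2, 0.0, 0.3, -0.1] * 4 + [3.1, 2.8, 3.2, 2.9, 3.0] * 4
print(cusum(stream, target_mean=0.0))
```

The `drift` term sets how large a sustained deviation must be before evidence accumulates, and `threshold` trades detection delay against false alarm rate, the same trade-off discussed under evaluation below.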
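These metrics can be computed directly once true and estimated change points are available. The following sketch uses an assumed matching rule (each true change point is paired with the first detection within a tolerance window after it); the function name and `tol` parameter are illustrative, not from any standard library:

```python
def score(true_cps, est_cps, tol=5):
    """Match each true change point to the first unused detection within
    `tol` samples after it; report mean delay, misses, and false alarms."""
    delays, matched = [], set()
    for t in true_cps:
        hits = [e for e in est_cps if t <= e <= t + tol and e not in matched]
        if hits:
            delays.append(hits[0] - t)  # detection delay for this change
            matched.add(hits[0])
    false_alarms = [e for e in est_cps if e not in matched]
    return {
        "mean_delay": sum(delays) / len(delays) if delays else None,
        "missed": len(true_cps) - len(delays),
        "false_alarms": len(false_alarms),
    }

print(score(true_cps=[100, 250], est_cps=[103, 251, 400]))
# → {'mean_delay': 2.0, 'missed': 0, 'false_alarms': 1}
```

The tolerance window reflects that estimated locations are rarely exact; tightening `tol` trades localization accuracy against apparent miss and false alarm counts.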