Components autoregression

Components autoregression is a time series modeling approach that represents a multivariate signal as a set of latent components, each evolving autoregressively. The observed series are a linear mixture of these components, so temporal dependence can be captured within the components rather than across the observed series.
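As a concrete illustration, the generative process just described can be sketched in Python with NumPy. All dimensions, AR orders, and coefficient values below are arbitrary choices for the example, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

T = 500        # number of time steps
p = 2          # number of latent components
d = 3          # number of observed series

# AR(1) coefficient for each latent component (illustrative values;
# magnitudes below 1 keep the components stationary)
phi = np.array([0.9, -0.5])

# Each latent component evolves autoregressively:
# s_t^i = phi_i * s_{t-1}^i + eps_t^i
s = np.zeros((T, p))
for t in range(1, T):
    s[t] = phi * s[t - 1] + 0.1 * rng.standard_normal(p)

# The observed series are a linear mixture of the components plus noise:
# x_t = A s_t + e_t
A = rng.standard_normal((d, p))          # d x p mixing matrix
x = s @ A.T + 0.05 * rng.standard_normal((T, d))

print(x.shape)  # (500, 3): T observations of the d-dimensional signal
```

Note that all temporal dependence enters through the `phi` recursion on the components; the mixing step itself is static.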

Formally, x_t = A s_t + e_t, with x_t in R^d, s_t in R^p, A a d-by-p mixing matrix, and e_t observation noise. Each latent component s_t^i follows an autoregression of order p_i: s_t^i = sum_{k=1}^{p_i} phi_{i,k} s_{t-k}^i + epsilon_t^i. The parameters are the mixing matrix A, the AR coefficients phi_{i,k}, and the noise covariances.

Estimation typically involves a decomposition step (for example independent component analysis or probabilistic factor models) to obtain s_t and A, followed by fitting AR models to each component. Some approaches instead estimate all parameters jointly via maximum likelihood or Bayesian methods.

Applications include signal processing, econometrics, and neuroscience, where separating independent sources before modeling improves interpretability and can reduce model complexity. Limitations include identifiability of the latent components, sensitivity to noise, and nonstationarity.

Related topics include autoregressive models, independent component analysis, dynamic factor models, and vector autoregression.
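The two-stage estimation described above (decomposition, then per-component AR fits) can be sketched as follows. As a stand-in for full ICA, stage 1 uses a second-order, AMUSE-style decomposition (whitening followed by eigendecomposition of the symmetrized lag-1 covariance), which separates sources whose autocorrelations differ; every dimension and coefficient here is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data in the form of the model: two AR(1) sources,
# three observed series (all values illustrative).
T, p, d = 2000, 2, 3
phi_true = np.array([0.9, -0.5])
s = np.zeros((T, p))
for t in range(1, T):
    s[t] = phi_true * s[t - 1] + rng.standard_normal(p)
A_true = rng.standard_normal((d, p))
x = s @ A_true.T + 0.01 * rng.standard_normal((T, d))

# Stage 1 -- decomposition: whiten onto the top-p subspace, then
# rotate by the eigenvectors of the symmetrized lag-1 covariance.
xc = x - x.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(xc, rowvar=False))
V = eigvecs[:, -p:] / np.sqrt(eigvals[-p:])   # whitening map
z = xc @ V                                    # whitened signals
C1 = z[1:].T @ z[:-1] / (T - 1)               # lag-1 covariance
_, U = np.linalg.eigh((C1 + C1.T) / 2)
s_hat = z @ U                                 # recovered components

# Stage 2 -- fit an AR(1) model to each recovered component by
# least squares: phi_i = <s_t, s_{t-1}> / <s_{t-1}, s_{t-1}>.
phi_hat = np.array([
    s_hat[1:, i] @ s_hat[:-1, i] / (s_hat[:-1, i] @ s_hat[:-1, i])
    for i in range(p)
])
print(np.sort(phi_hat))  # should lie near [-0.5, 0.9] up to estimation error
```

Because the recovered components are only identified up to permutation and sign, the fitted coefficients are compared to the true ones after sorting; this is the identifiability caveat noted above in miniature.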