fluctuus

Fluctuus is a term encountered in theoretical and pedagogical discussions of time-series variability. Used as a metric for the magnitude of short-term fluctuations in a dynamical quantity, fluctuus is typically defined with reference to a moving or local mean over a chosen time window.

Definition and formulations:

In a continuous-time series x(t), with a local mean μ_T(t) computed over a window of width T, fluctuus may be defined as F_T = sqrt( (1/T) ∫_{t}^{t+T} [x(s) - μ_T(s)]^2 ds ). In discrete time, F_T ≈ sqrt( (1/N) ∑_{n} [x_n - μ_T(n)]^2 ), where N is the number of samples in the window. Alternative definitions use L1 norms, such as F_T = (1/N) ∑_{n} |x_n - μ_T(n)|. The choice of T shapes the value and makes fluctuus sensitive to the time scale of interest.
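
A minimal sketch of the discrete formulation in Python, assuming NumPy; the helper name fluctuus, the uniform moving average for the local mean, and the random-walk test signal are illustrative choices, not part of any standard implementation:

```python
import numpy as np

def fluctuus(x, window):
    """Discrete fluctuus F_T: RMS deviation of x from its local mean,
    with both averages taken over a sliding window of `window` samples."""
    x = np.asarray(x, dtype=float)
    kernel = np.ones(window) / window
    # Local mean mu_T(n): uniform moving average over the window.
    mu = np.convolve(x, kernel, mode="same")
    # F_T(n) = sqrt((1/N) * sum_n [x_n - mu_T(n)]^2), evaluated per window.
    return np.sqrt(np.convolve((x - mu) ** 2, kernel, mode="same"))

# Toy usage: fluctuus of a random walk with a 50-sample window.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=1000))
f = fluctuus(x, window=50)
```

Any reasonable moving-mean estimator (uniform, exponential, centered) could stand in for μ_T; the uniform average is used here only because it matches the boxcar window in the definition above.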

Properties:

Fluctuus is nonnegative, depends on the scale of observation, and is influenced by the method of estimating the local mean. It increases with the amplitude of fluctuations and decreases as the signal becomes smoother. It is not invariant under rescaling or changes of units, so values are not directly comparable across datasets with different units or scales.
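
Because the moving mean is linear in the signal, rescaling x by a factor a rescales fluctuus by |a|. A quick check of this, continuing from the sketch above (the factor a is arbitrary):

```python
# Scale dependence: F_T(a*x) = |a| * F_T(x), so fluctuus carries the units
# of x and is not directly comparable across differently scaled datasets.
a = 3.0
np.testing.assert_allclose(fluctuus(a * x, window=50), a * f)
```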

Applications:

In educational settings, fluctuus is used to illustrate how fluctuation magnitude depends on window length. In stylized models of climate, finance, or biology, it serves as a teaching proxy for volatility and, after appropriate normalization, can be compared across datasets to assess relative instability.
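
One way to show the window-length dependence, again continuing from the sketch above (the window sizes are arbitrary):

```python
# Longer windows admit slower excursions from the local mean, so for a
# random walk the average fluctuus tends to grow with the window length.
for T in (10, 50, 250):
    print(f"window={T:4d}  mean fluctuus={fluctuus(x, window=T).mean():.3f}")
```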

History:

The term fluctuus is not part of standard nomenclature; it appears in some introductory texts and thought experiments as a hypothetical metric.

See also:

Fluctuation, variance, standard deviation, volatility, time-series analysis.
