gammaSL

gammaSL, short for Gamma-Stable Self-Learning, is a theoretical framework in data science and applied mathematics that combines gamma-stable stochastic processes with self-learning algorithms to model complex, non-Gaussian data. The approach emphasizes heavy tails, skewness, and non-stationarity, and is intended for streaming data and evolving systems.

In the model, latent variables are assumed to follow gamma distributions, which provide flexible control over dispersion and tail behavior. The observable variables are generated through a likelihood informed by these latent gamma variables. The inference procedure alternates between updating the latent gamma variables and adjusting the self-learning component using online updates, variational methods, or Bayesian filtering.
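
A minimal sketch of how such an alternating loop could look in Python is given below. It assumes a Poisson observation model so that the latent gamma update has a closed conjugate form; the function name gammasl_step, the hyperparameters, and the simple moment-matching "self-learning" step are illustrative assumptions, not part of any published gammaSL specification.

    import numpy as np

    rng = np.random.default_rng(0)

    def gammasl_step(y, shape, rate, lr=0.05):
        """One alternation: update the latent gamma variable, then adjust the prior online."""
        # (1) Latent update: with a Poisson likelihood, the gamma prior is conjugate,
        #     so the posterior over the latent rate is Gamma(shape + y, rate + 1).
        post_shape = shape + y
        post_rate = rate + 1.0
        latent_mean = post_shape / post_rate

        # (2) Self-learning component (illustrative): nudge the prior mean toward the
        #     latest posterior mean with an exponential-forgetting online step,
        #     adapting the shape while keeping the rate fixed.
        prior_mean = shape / rate
        new_prior_mean = (1.0 - lr) * prior_mean + lr * latent_mean
        shape = new_prior_mean * rate

        return latent_mean, shape, rate

    # Streaming usage on synthetic over-dispersed count data.
    latent_rates = rng.gamma(shape=2.0, scale=3.0, size=200)
    stream = rng.poisson(lam=latent_rates)

    shape, rate = 2.0, 1.0
    for y in stream:
        latent_mean, shape, rate = gammasl_step(y, shape, rate)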

History and development: The concept emerged in 2022 from a collaboration between researchers at the Institute for Advanced Computation and the Centre for Stochastic Modeling. It was proposed as a general-purpose tool for non-stationary environments and was described in several preprints.

Applications: gammaSL has been explored for time-series forecasting in finance, anomaly detection in sensor networks, and analysis of astrophysical gamma-ray data, where heavy-tailed noise and non-stationarity are common.

Implementation and status: gammaSL remains a research concept; experimental implementations exist in Python and R, but it is not a standard method and has not achieved widespread adoption. Public repositories include sample code and datasets to illustrate the learning process.

See also: gamma distribution, gamma process, self-learning, heavy-tailed distributions.
