corrg

Corrg is a term used in theoretical and applied machine learning to denote a family of gradient-based optimization methods that incorporate a correlation-based regularization term into the objective function. In this framework, the regularization term is designed to bias the learning process toward directions in parameter space that produce outputs with a specified level of correlation with a secondary signal, such as a label, proxy variable, or auxiliary task. The idea is to encourage representations that align with meaningful variation while suppressing spurious factors only weakly related to the target.
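
One way to sketch the kind of objective described above, with notation that is illustrative rather than drawn from any canonical corrg definition: given a task loss L_task, model predictions \hat{y}(\theta), a secondary (guiding) signal s, a correlation measure \rho, and a weight \lambda, the regularized objective might take the form

    L(\theta) = L_{task}(\hat{y}(\theta), y) - \lambda \, \rho(\hat{y}(\theta), s)

where subtracting the correlation term encourages predictions that correlate with s, and flipping its sign would instead suppress that correlation.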

The term corrg is not tied to a single canonical algorithm. Descriptions vary: some define the term in the context of standard gradient descent with an added penalty proportional to the correlation between predictions and a guiding signal; others describe it as a general regularization principle that can be instantiated with different correlation measures or optimization schemes. Because of this variability, practical implementations differ in how the correlation is estimated, what signals are used, and how hyperparameters are configured.
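
As a concrete illustration of the first reading (plain gradient descent plus a correlation term), the following sketch uses PyTorch with a batch-wise Pearson correlation; the model, the guiding signal guide, and the weight corr_weight are assumptions made for the example, not part of any standard corrg implementation:

    # Minimal sketch: SGD on a task loss plus a term proportional to the
    # Pearson correlation between predictions and a guiding signal.
    import torch

    def pearson_corr(a, b, eps=1e-8):
        # Pearson correlation between two 1-D tensors.
        a = a - a.mean()
        b = b - b.mean()
        return (a * b).sum() / (a.norm() * b.norm() + eps)

    torch.manual_seed(0)
    x = torch.randn(256, 10)                # inputs
    y = x[:, 0] + 0.1 * torch.randn(256)    # regression target
    guide = x[:, 0]                         # guiding signal (here, a proxy for the target)

    model = torch.nn.Linear(10, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.05)
    corr_weight = 0.1                       # strength of the correlation term

    for step in range(200):
        opt.zero_grad()
        pred = model(x).squeeze(-1)
        task_loss = torch.nn.functional.mse_loss(pred, y)
        # Subtracting the correlation rewards predictions that correlate with
        # guide; flipping the sign would instead suppress that correlation.
        loss = task_loss - corr_weight * pearson_corr(pred, guide)
        loss.backward()
        opt.step()

How the correlation is estimated (per batch, as here, or via a running estimate) and which signal plays the role of guide are exactly the points on which, per the description above, implementations differ.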

In usage, corrg aims to improve generalization in settings where the target variable shares structure with nuisance factors or where robust transfer to related tasks is desired. Potential drawbacks include added computational overhead and sensitivity to the choice of guiding signal, which can lead to underfitting if the correlation is mis-specified.

See also: regularization, gradient descent, representation learning, multitask learning.
