Corrg
Corrg is a term used in theoretical and applied machine learning to denote a family of gradient-based optimization methods that incorporate a correlation-based regularization term into the objective function. In this framework, the regularization term is designed to bias the learning process toward directions in parameter space that produce outputs with a specified level of correlation to a secondary signal, such as a label, proxy variable, or auxiliary task. The idea is to encourage representations that align with meaningful variation while suppressing spurious factors only weakly related to the target.
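The idea above can be made concrete with a small sketch: a linear model trained by gradient descent on a squared-error loss plus a penalty that rewards correlation between the model's outputs and a secondary signal. This is an illustrative assumption about what such an objective might look like, not a canonical corrg implementation; the names `corr_penalty` and `objective`, the choice of Pearson correlation, and the finite-difference gradient are all simplifications for exposition.

```python
import numpy as np

def corr_penalty(y_pred, s):
    # Penalty of the assumed form 1 - Pearson correlation between the
    # model outputs and a secondary signal s (low when they are aligned).
    yc = y_pred - y_pred.mean()
    sc = s - s.mean()
    denom = np.sqrt((yc ** 2).sum() * (sc ** 2).sum()) + 1e-12
    return 1.0 - (yc @ sc) / denom

def objective(w, X, y, s, lam):
    # Task loss (MSE) plus the correlation-based regularization term,
    # weighted by lam, as described in the text.
    y_pred = X @ w
    mse = ((y_pred - y) ** 2).mean()
    return mse + lam * corr_penalty(y_pred, s)

def grad_fd(f, w, eps=1e-6):
    # Central finite-difference gradient; a real implementation would
    # use an analytic gradient or automatic differentiation.
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (f(w + e) - f(w - e)) / (2 * eps)
    return g

# Synthetic data: the auxiliary signal s is a noisy proxy for the target y.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)
s = y + 0.2 * rng.normal(size=200)

# Plain gradient descent on the combined objective.
w = np.zeros(3)
lam = 0.1
f = lambda w_: objective(w_, X, y, s, lam)
losses = [f(w)]
for _ in range(200):
    w -= 0.1 * grad_fd(f, w)
    losses.append(f(w))
```

In this sketch the penalty pulls the parameters toward directions whose outputs track the proxy signal, while the MSE term fits the target itself; the trade-off is controlled entirely by `lam`.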
The term corrg is not tied to a single canonical algorithm. Descriptions vary: some define the term as an explicit correlation-based penalty added to the training loss, while others use it more loosely for any gradient-based method whose updates are steered by a correlation criterion.
In usage, corrg aims to improve generalization in settings where the target variable shares structure with a secondary signal, such as a related task, proxy variable, or auxiliary label.
See also: regularization, gradient descent, representation learning, multitask learning.