Regularisation
Regularisation is a term often encountered in statistics and machine learning, referring to techniques used to prevent overfitting. Overfitting occurs when a model learns the training data too well, including its noise and idiosyncrasies, leading to poor performance on unseen data. Regularisation introduces a penalty term into the model's loss function, discouraging overly complex models.
The most common forms of regularisation are L1 and L2 regularisation, also known as Lasso and Ridge regression, respectively. L1 regularisation penalises the sum of the absolute values of the weights and can drive some weights exactly to zero, while L2 regularisation penalises the sum of squared weights, shrinking them smoothly toward zero.
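As a minimal sketch of the L2 case, Ridge regression has a closed-form solution obtained by adding the penalty strength (here called `lam`, an illustrative name) times the identity matrix to the normal equations. The synthetic data below is purely for demonstration:

```python
import numpy as np

# Synthetic regression problem (illustrative data, not from any real source)
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=50)

lam = 1.0  # regularisation strength (hyperparameter)

# Ridge (L2) closed form: w = (X^T X + lam * I)^{-1} X^T y
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Ordinary least squares for comparison (lam = 0)
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# The penalty shrinks the weight vector toward zero
print(np.linalg.norm(w_ridge), np.linalg.norm(w_ols))
```

Because the Ridge objective trades data fit against the squared norm of the weights, the regularised solution always has a norm no larger than the unregularised one.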
Another form of regularisation is dropout, primarily used in neural networks. During training, dropout randomly deactivates a fraction of the units in a layer, forcing the remaining units to learn more robust, redundant representations and reducing co-adaptation between neurons. At inference time, all units are kept active.