Underregularization
Underregularization is a term used in statistics and machine learning to describe a situation in which a model is not sufficiently constrained by regularization. When the penalty on complexity is too small or absent, the model can fit noise in the training data, leading to high variance and poor generalization. It is the opposite of overregularization: whereas overregularization can cause underfitting, underregularization tends to produce overfitting.
Common causes include choosing a very small regularization strength (for example, a low lambda in ridge or lasso regression), omitting a penalty term altogether, or fitting a model with many parameters relative to the amount of training data.
Consequences include low training error but high validation or test error, indicating overfitting. Coefficients or weights also tend to grow large in magnitude as the model adapts to noise in the training data.
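A minimal sketch of these effects, assuming scikit-learn and synthetic data (the polynomial degree and alpha values are illustrative choices, not canonical ones): a near-zero ridge penalty yields low training error, higher test error, and large coefficients compared with a moderate penalty.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Noisy synthetic data: a sine curve plus Gaussian noise (assumed for illustration).
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(40, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=40)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

# Compare a near-zero penalty (underregularized) with a moderate one.
for alpha in (1e-8, 1.0):
    model = make_pipeline(PolynomialFeatures(degree=12), StandardScaler(), Ridge(alpha=alpha))
    model.fit(X_train, y_train)
    coefs = model.named_steps["ridge"].coef_
    print(
        f"alpha={alpha:g}  "
        f"train MSE={mean_squared_error(y_train, model.predict(X_train)):.3f}  "
        f"test MSE={mean_squared_error(y_test, model.predict(X_test)):.3f}  "
        f"max |coef|={np.abs(coefs).max():.1f}"
    )
```

With the tiny alpha, the gap between training and test error and the size of the largest coefficient are both expected to be noticeably larger than with alpha=1.0.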
Detection methods include comparing learning curves for training and validation sets, monitoring the gap between training and validation error, and using cross-validation to estimate generalization performance.
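One way to make this concrete, again assuming scikit-learn and a synthetic dataset, is to sweep the penalty strength and inspect where the gap between training and cross-validated scores becomes large; a wide gap at small alpha is a sign of underregularization.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import validation_curve

# Deliberately noisy, high-dimensional data so a weak penalty overfits (illustrative).
X, y = make_regression(n_samples=60, n_features=50, noise=10.0, random_state=0)
alphas = np.logspace(-4, 2, 7)

train_scores, val_scores = validation_curve(
    Ridge(), X, y, param_name="alpha", param_range=alphas, cv=5, scoring="r2"
)
for a, tr, va in zip(alphas, train_scores.mean(axis=1), val_scores.mean(axis=1)):
    print(f"alpha={a:8.4f}  train R^2={tr:.3f}  val R^2={va:.3f}  gap={tr - va:.3f}")
```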
Mitigation strategies involve increasing the regularization strength or selecting a more appropriate penalty (L1, L2, elastic net), with the strength typically chosen by cross-validation on held-out data.
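As a sketch of that last point, assuming scikit-learn, RidgeCV selects the penalty strength with the best cross-validated score from a candidate grid instead of leaving it fixed near zero; the grid and data below are illustrative.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import RidgeCV

# Same kind of noisy, high-dimensional data as above (assumed for illustration).
X, y = make_regression(n_samples=60, n_features=50, noise=10.0, random_state=0)

# Let cross-validation pick alpha from a log-spaced grid of candidates.
model = RidgeCV(alphas=np.logspace(-4, 3, 30), cv=5).fit(X, y)
print("selected alpha:", model.alpha_)
```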