underparameterized
Underparameterized refers to a model that has fewer parameters than training data points. In machine learning, this usually means the model is too simple to capture the underlying structure of the data. An underparameterized model tends to underfit: it makes strong assumptions about the data that may not hold, producing high bias and poor accuracy on the training set as well as on new, unseen data.
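A minimal sketch of underfitting, using NumPy's `polyfit` (the data-generating function, noise level, and polynomial degrees are illustrative choices, not from any particular study): a straight line has too few parameters to fit cubic data, so its error is high even on the training points.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 50)
y = x**3 + rng.normal(scale=0.3, size=x.size)  # cubic ground truth plus noise

# Underparameterized: a straight line (2 parameters) cannot capture the cubic shape.
line_fit = np.polyval(np.polyfit(x, y, deg=1), x)
# A degree-3 polynomial (4 parameters) matches the true structure.
cubic_fit = np.polyval(np.polyfit(x, y, deg=3), x)

mse_line = np.mean((y - line_fit) ** 2)
mse_cubic = np.mean((y - cubic_fit) ** 2)
print(f"training MSE, degree 1: {mse_line:.3f}")
print(f"training MSE, degree 3: {mse_cubic:.3f}")
```

The gap between the two training errors is the signature of high bias: no choice of slope and intercept can track the cubic trend.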
In contrast, an overparameterized model has significantly more parameters than training data points. While overparameterization was traditionally associated with overfitting, modern deep networks routinely have far more parameters than training examples yet still generalize well in practice.
The concept of underparameterization is important for understanding model capacity and the bias-variance tradeoff. A model with too little capacity has high bias and low variance; increasing capacity reduces bias but can increase variance, so choosing a model size means balancing the two against each other.
Research has explored the behavior of underparameterized models in specific contexts, such as linear regression with fewer parameters than observations, where the least-squares problem has a unique solution and classical statistical analysis applies.
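The linear-regression case above can be sketched as follows (the dimensions, noise level, and coefficient values are arbitrary choices for illustration): with more observations than parameters, the design matrix almost surely has full column rank, so ordinary least squares recovers a unique coefficient vector.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 5  # n observations, p parameters: underparameterized since p < n
X = rng.normal(size=(n, p))
true_w = np.arange(1.0, p + 1)            # illustrative true coefficients
y = X @ true_w + rng.normal(scale=0.1, size=n)

# With p < n and full column rank, the least-squares solution is unique.
w_hat, _, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print("rank:", rank)                      # full column rank: rank == p
print("estimate:", np.round(w_hat, 2))    # close to true_w
```

Because the solution is unique, the estimate concentrates around the true coefficients as `n` grows; this is the regime where classical results such as consistency of ordinary least squares hold.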