Underparameterization
Underparameterization is a concept in machine learning referring to models with too few parameters to capture the structure of the data they are trained on. In classical statistical learning theory, such models are expected to have high bias and low variance, leading to underfitting: the model is too simple to capture the underlying patterns in the data. For example, fitting a linear model to highly non-linear data is an instance of underparameterization.
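The underfitting described above can be illustrated with a minimal sketch (the data-generating function and model degrees are hypothetical choices for illustration): a two-parameter linear model fit to quadratic data retains a high training error, while a model with enough parameters fits it closely.

```python
import numpy as np

# Hypothetical example: data generated by a quadratic function with small noise.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100)
y = x**2 + rng.normal(scale=0.1, size=x.shape)  # highly non-linear target

# Underparameterized fit: degree-1 polynomial (2 parameters).
# Its training error stays high no matter how the 2 parameters are set (high bias).
linear_coeffs = np.polyfit(x, y, deg=1)
linear_mse = np.mean((np.polyval(linear_coeffs, x) - y) ** 2)

# Adequately parameterized fit: degree-2 polynomial (3 parameters)
# matches the true functional form and drives the training error near zero.
quad_coeffs = np.polyfit(x, y, deg=2)
quad_mse = np.mean((np.polyval(quad_coeffs, x) - y) ** 2)

print(f"linear MSE: {linear_mse:.3f}, quadratic MSE: {quad_mse:.3f}")
```

The gap between the two errors is the signature of underfitting: no setting of the linear model's two parameters can represent the curvature in the data.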
However, recent research, particularly in deep learning, has shown that underparameterized models can sometimes exhibit surprising behavior.