overparameterization
Overparameterization is a term used primarily in machine learning and statistical modeling to describe a situation where a model has more parameters than can be justified by the amount of training data available. In simpler terms, it refers to using a model that is too complex for the given dataset, which can hurt the model's performance and its ability to generalize.
When a model is overparameterized, it may fit the training data extremely well, capturing not only the underlying structure of the data but also its noise. This phenomenon is known as overfitting, and it typically shows up as a large gap between the error on the training data and the error on unseen data.
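As a minimal sketch of this failure mode, the snippet below (NumPy only; the sample size, polynomial degree, noise level, and underlying function are illustrative assumptions, not values from the text) fits a degree-9 polynomial to ten noisy points, so the fit can interpolate the training data almost exactly while typically doing much worse on held-out points from the same curve:

```python
import numpy as np

rng = np.random.default_rng(0)

# A few noisy training points from a simple underlying function.
n_train = 10
x_train = np.linspace(-1, 1, n_train)
y_train = np.sin(np.pi * x_train) + 0.1 * rng.standard_normal(n_train)

# Overparameterized model: a degree-9 polynomial has 10 coefficients for
# 10 points, so it can fit the training data (noise included) essentially exactly.
coeffs = np.polyfit(x_train, y_train, deg=9)

# Held-out points from the same underlying function.
x_test = np.linspace(-1, 1, 200)
y_test = np.sin(np.pi * x_test)

train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
print(f"train MSE: {train_mse:.2e}, test MSE: {test_mse:.2e}")
```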
The problem of overparameterization is particularly relevant in high-dimensional settings, where the number of features (and hence the number of model parameters) can approach or exceed the number of training examples. In that regime the training data no longer determine a unique solution: many different parameter settings fit the observations equally well, and the fitted model can be highly sensitive to noise.
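To make the underdetermined regime concrete, here is a small sketch (the sizes and data-generating process are assumptions chosen for illustration) in which least squares with more features than samples drives the training residual to essentially zero while recovering the true weights poorly; `numpy.linalg.lstsq` returns the minimum-norm solution among the infinitely many exact fits:

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_features = 20, 100          # p > n: more parameters than data points
X = rng.standard_normal((n_samples, n_features))
true_w = np.zeros(n_features)
true_w[:5] = 1.0                          # only a few features actually matter
y = X @ true_w + 0.1 * rng.standard_normal(n_samples)

# lstsq returns the minimum-norm solution among the infinitely many
# weight vectors that solve the underdetermined system exactly.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"training residual: {np.linalg.norm(X @ w_hat - y):.2e}")   # essentially zero
print(f"parameter error  : {np.linalg.norm(w_hat - true_w):.2f}")  # far from zero
```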
In recent years, the concept of overparameterization has been revisited in the context of deep learning, where networks with far more parameters than training examples are routinely trained to near-zero training error and nonetheless generalize well. This observation, often discussed under the name of double descent, has prompted a re-examination of the classical view that increasing model complexity beyond the point of interpolation must degrade test performance.
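One way to observe this behaviour empirically without training a neural network is to sweep the number of random features in a minimum-norm least-squares fit past the number of training samples. The sketch below (the data-generating process, the random ReLU feature map, and all sizes are assumptions for illustration) prints train and test error as the model crosses the interpolation threshold, where the test error typically spikes and then falls again in the heavily overparameterized regime:

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, d = 40, 500, 20

# Ground-truth linear model in d-dimensional inputs, plus label noise.
w_star = rng.standard_normal(d)
X_tr = rng.standard_normal((n_train, d))
X_te = rng.standard_normal((n_test, d))
y_tr = X_tr @ w_star + 0.5 * rng.standard_normal(n_train)
y_te = X_te @ w_star

def random_relu_features(X, W):
    """Project inputs through fixed random weights and a ReLU nonlinearity."""
    return np.maximum(X @ W, 0.0)

# Sweep the number of random features through the interpolation threshold
# (p == n_train) and record train/test error of the minimum-norm
# least-squares fit in feature space.
for p in [5, 10, 20, 40, 80, 160, 640]:
    W = rng.standard_normal((d, p)) / np.sqrt(d)
    Phi_tr = random_relu_features(X_tr, W)
    Phi_te = random_relu_features(X_te, W)
    beta, *_ = np.linalg.lstsq(Phi_tr, y_tr, rcond=None)
    tr = np.mean((Phi_tr @ beta - y_tr) ** 2)
    te = np.mean((Phi_te @ beta - y_te) ** 2)
    print(f"p={p:4d}  train MSE={tr:8.4f}  test MSE={te:8.4f}")
```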