Undertraining
Undertraining is a term used in machine learning to describe a situation in which a model has not learned sufficiently from the training data, leading to poor performance on both the training set and unseen data. It is closely related to underfitting, which is the broader concept of a model being unable to capture the underlying patterns of the data. Undertraining can arise from limited data, insufficient training time, or model limitations that prevent learning.
Causes and signs include too few training epochs, overly aggressive regularization, insufficient model capacity, and poor data quality or quantity. The characteristic sign is high error on the training set itself: a model that cannot fit the data it has already seen will also perform poorly on data it has not.
Remedies involve increasing model capacity (more layers or units), reducing regularization, increasing training time, tuning the learning rate and other hyperparameters, and improving the quality or quantity of the training data.
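The remedy of increasing training time can be illustrated with a minimal sketch. The example below (all names, the toy dataset, and the epoch counts are illustrative, not from any particular library) fits a one-variable linear model with stochastic gradient descent. Stopping after a single epoch leaves the model undertrained, with high error on both the training set and held-out data; training to near convergence reduces both errors.

```python
import random

random.seed(0)
# toy regression data: y = 2x + 1 plus small noise
data = [(i / 10, 2 * (i / 10) + 1 + random.gauss(0, 0.1)) for i in range(60)]
random.shuffle(data)
train_set, test_set = data[:40], data[40:]

def mse(w, b, samples):
    """Mean squared error of the linear model y = w*x + b."""
    return sum((w * x + b - y) ** 2 for x, y in samples) / len(samples)

def fit(epochs, lr=0.01):
    """Plain stochastic gradient descent on a 1-D linear model."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in train_set:
            err = (w * x + b) - y   # prediction error on one sample
            w -= lr * err * x       # gradient step for the weight
            b -= lr * err           # gradient step for the bias
    return w, b

w_u, b_u = fit(epochs=1)    # undertrained: stopped far too early
w_f, b_f = fit(epochs=500)  # trained to near convergence

# the undertrained model has high error on BOTH the training and test sets
print("undertrained:", mse(w_u, b_u, train_set), mse(w_u, b_u, test_set))
print("converged:   ", mse(w_f, b_f, train_set), mse(w_f, b_f, test_set))
```

Note that error is high on both splits, which distinguishes undertraining from overfitting, where training error is low but test error is high.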
Notes on terminology: usage varies, and many practitioners prefer underfitting to describe the core problem. The term undertraining is sometimes reserved for cases caused specifically by insufficient training time or data, while underfitting also covers limitations of the model itself.