undertrained

Undertrained is a term used to describe models or systems that have not been trained sufficiently to learn the patterns in their data, resulting in suboptimal performance. In machine learning, undertraining often leads to underfitting: the model cannot capture the underlying structure, producing simple or biased predictions on both training and new data.
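
Underfitting is easiest to see with a toy example. The sketch below (plain Python, with invented data, not tied to any particular library) compares a model with too little capacity, which predicts only a constant, against a least-squares line fit on the same data; the low-capacity model's training error stays high no matter how long it is "trained":

```python
# Minimal sketch, assuming made-up data y = 2x for illustration.
# A constant-only model cannot capture the trend, so even its
# training error stays high: the hallmark of underfitting.
xs = [0, 1, 2, 3, 4, 5]
ys = [2 * x for x in xs]

# Constant model: the best constant prediction is the mean of the targets.
mean_pred = sum(ys) / len(ys)
train_mse = sum((y - mean_pred) ** 2 for y in ys) / len(ys)

# A model with enough capacity (slope + intercept, least squares)
# drives training error to ~0 on the same data.
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
fit_mse = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys)) / n

print(round(train_mse, 2))  # high: the constant model underfits
print(round(fit_mse, 2))    # ~0.0 once capacity matches the data
```

The point is that underfitting shows up on the training data itself, not only on new data, which is what distinguishes it from overfitting.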

Causes of undertraining include insufficient data, too few training iterations or epochs, an inappropriate learning rate, excessive regularization, and limited model capacity. Poor data quality, noisy labels, or unrepresentative training sets can also hinder learning and leave the model undertrained even after long training.

Indicators of undertraining include high training error, low or stagnant validation accuracy, and predictions that fail to differentiate between inputs. The consequence is poor generalization and consistently weak performance across tasks, especially on unseen data.

Mitigation strategies involve increasing training data or data quality, allowing more training time, adjusting model capacity (for example, adding layers or units), and tuning hyperparameters such as the learning rate and batch size. Reducing regularization if it is too strong can help, as can data augmentation, transfer learning, and cross-validation to guide parameter selection. Monitoring learning curves is important to detect undertraining early and distinguish it from other issues such as overfitting.

Outside machine learning, the term can describe people or systems that have not received sufficient training, leading to lower readiness or competence in their roles.

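
Returning to the machine-learning sense, the learning-curve monitoring described above can be sketched in a few lines. The example below is a hypothetical illustration (toy data and a single-parameter model, both invented here): the same gradient-descent loop is run for too few epochs and then to convergence, and the recorded loss curve makes the undertrained run easy to spot because its final training error is still high:

```python
# Hedged sketch: record training loss each epoch so a stalled or
# truncated run (undertraining) is visible in the learning curve.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [3.0 * x for x in xs]  # toy data with true slope 3

def train(epochs, lr=0.05):
    """Gradient descent on MSE for the one-parameter model y_hat = w * x."""
    w = 0.0
    curve = []
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
        mse = sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
        curve.append(mse)  # the learning curve: training loss per epoch
    return w, curve

w_short, curve_short = train(epochs=3)    # stopped too early: undertrained
w_long, curve_long = train(epochs=200)    # trained to convergence

print(round(curve_short[-1], 3))  # final training error still high
print(round(curve_long[-1], 6))   # near zero after enough epochs
```

In practice the same idea applies with a validation curve alongside the training curve: both high and flat suggests undertraining, while a widening gap between them suggests overfitting.
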
See also: underfitting, learning curves, data augmentation, and transfer learning.