Out-of-distribution
Out-of-distribution (OOD) refers to inputs that do not lie within the distribution of data used to train a machine learning model. In supervised learning, the training data are assumed to be representative of the situations the model will encounter at deployment; OOD inputs violate this assumption, potentially causing unreliable predictions, miscalibrated uncertainty estimates, and degraded performance. OOD phenomena arise from distributional shifts such as covariate shift, concept drift, domain shift, or the appearance of novel classes not seen during training.
Detection and mitigation strategies aim to identify OOD samples and reduce their impact. Common approaches include thresholding the model's predictive confidence (for example, the maximum softmax probability), distance-based scores computed in feature space (such as the Mahalanobis distance to class-conditional statistics), energy-based scores, density or generative modelling of the training distribution, and training-time techniques such as outlier exposure. A simple confidence-based detector is sketched below.
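The following is a minimal sketch of maximum-softmax-probability (MSP) scoring, assuming a trained PyTorch classifier; the model, threshold value, and function names are illustrative rather than part of any specific library.

```python
import torch
import torch.nn.functional as F

def msp_score(logits: torch.Tensor) -> torch.Tensor:
    """Maximum softmax probability per sample (higher = more in-distribution)."""
    return F.softmax(logits, dim=-1).max(dim=-1).values

def flag_ood(model: torch.nn.Module, x: torch.Tensor, threshold: float = 0.5) -> torch.Tensor:
    """Flag inputs whose MSP falls below a chosen threshold as OOD.

    The threshold is an assumption here; in practice it is tuned on held-out
    in-distribution data (for example, to accept 95% of in-distribution samples).
    """
    model.eval()
    with torch.no_grad():
        logits = model(x)
    return msp_score(logits) < threshold
```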
Evaluation of OOD performance typically involves testing on in-distribution data alongside separate out-of-distribution datasets, using metrics such as the area under the ROC curve (AUROC), the area under the precision-recall curve (AUPR), and the false positive rate at 95% true positive rate (FPR@95).
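A sketch of this evaluation protocol, assuming per-sample detector scores (such as those produced by the MSP scorer above) are already available for an in-distribution test set and an OOD test set; the function name and label convention are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

def evaluate_ood(scores_id: np.ndarray, scores_ood: np.ndarray) -> dict:
    """Treat in- vs. out-of-distribution as binary detection and report AUROC and FPR@95."""
    # Label in-distribution samples 1 and OOD samples 0; higher score = more in-distribution.
    labels = np.concatenate([np.ones_like(scores_id), np.zeros_like(scores_ood)])
    scores = np.concatenate([scores_id, scores_ood])

    auroc = roc_auc_score(labels, scores)

    # False positive rate at the threshold where 95% of in-distribution samples are accepted.
    fpr, tpr, _ = roc_curve(labels, scores)
    fpr_at_95tpr = float(np.interp(0.95, tpr, fpr))

    return {"auroc": float(auroc), "fpr_at_95tpr": fpr_at_95tpr}
```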
Challenges persist in defining what constitutes OOD in a given context and in balancing detection sensitivity against maintaining in-distribution performance.