Discrepancy-based
Discrepancy-based is a descriptor used in statistics and machine learning for methods that quantify and minimize a discrepancy — a measure of difference — between two or more distributions, datasets, or sets of predictive outcomes. The central idea is to align a model, dataset, or decision boundary with a reference distribution by reducing a discrepancy metric, with the aim of improving generalization, fairness, or performance.
Common discrepancy measures include Maximum Mean Discrepancy (MMD), Wasserstein distance, Kullback–Leibler divergence, and other statistical distances.
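As an illustration, the squared Maximum Mean Discrepancy between two samples can be estimated from kernel evaluations alone. The sketch below uses a Gaussian (RBF) kernel and the biased V-statistic estimator; the function names, the bandwidth parameter gamma, and the synthetic data are illustrative choices, not part of any standard API.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian kernel
    # k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=1.0):
    """Biased estimate of the squared MMD between samples X and Y."""
    k_xx = rbf_kernel(X, X, gamma).mean()
    k_yy = rbf_kernel(Y, Y, gamma).mean()
    k_xy = rbf_kernel(X, Y, gamma).mean()
    return k_xx + k_yy - 2 * k_xy

rng = np.random.default_rng(0)
# Two samples from the same distribution vs. one shifted by 3 units.
same = mmd2(rng.normal(0, 1, (200, 2)), rng.normal(0, 1, (200, 2)))
shifted = mmd2(rng.normal(0, 1, (200, 2)), rng.normal(3, 1, (200, 2)))
print(same, shifted)  # the shifted pair yields a much larger discrepancy
```

In a discrepancy-based training loop, a quantity like `mmd2` would serve as a differentiable loss term, pushing model representations of two domains toward each other.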
Applications span domain adaptation, where a model trained on a source domain is adapted to a target domain by minimizing the discrepancy between source and target feature distributions; two-sample hypothesis testing, where a discrepancy statistic is used to decide whether two samples come from the same distribution; and generative modeling, where a generator is trained to reduce the discrepancy between generated and real data.
Advantages include a flexible, principled way to handle distributional differences without relying solely on labels, and broad applicability across tasks and data types.
See also: statistical distance, domain adaptation, distribution shift, optimal transport, hypothesis testing.