Importance weighting
Importance weighting is a technique used in statistics and machine learning to correct for differences between the distribution that generated the data and the distribution with respect to which predictions or expectations are desired. It works by weighting each observed example by an importance weight that reflects how likely the example would be under the target distribution relative to the sampling distribution. Concretely, if expectations are desired under a target distribution P with density p, but samples are drawn from a proposal distribution Q with density q, the weight for a sample x is w(x) = p(x)/q(x), which requires q(x) > 0 wherever p(x) > 0. For any function f, the identity E_P[f(X)] = E_Q[w(X) f(X)] then holds, so E_P[f(X)] can be estimated by the sample average (1/n) sum_i w(x_i) f(x_i) over draws x_1, ..., x_n from Q.
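As a minimal sketch of this estimator (the Gaussian target, proposal, and test function below are illustrative assumptions, not taken from the source), the identity can be checked numerically:

```python
# Basic importance-weighted estimate of E_P[f(X)] using samples from Q.
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Illustrative choice: target P = N(1, 1), proposal Q = N(0, 2).
# Q must put mass everywhere P does for the weights to be well defined.
mu_p, sigma_p = 1.0, 1.0
mu_q, sigma_q = 0.0, 2.0

n = 100_000
x = rng.normal(mu_q, sigma_q, size=n)   # samples drawn from Q, not P
w = normal_pdf(x, mu_p, sigma_p) / normal_pdf(x, mu_q, sigma_q)  # w(x) = p(x)/q(x)

f = x ** 2                               # f(x) = x^2, so E_P[f] = mu_p^2 + sigma_p^2 = 2
estimate = np.mean(w * f)                # (1/n) sum_i w(x_i) f(x_i)
print(estimate)                          # approximately 2.0
```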
Applications include correcting covariate shift in supervised learning, where training and test inputs come from different distributions; there, each training example is reweighted by the density ratio p_test(x)/p_train(x) inside the training loss, as sketched below.
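The following sketch shows covariate-shift correction under strong simplifying assumptions: the Gaussian input densities, the linear model, and the sine-shaped target are all hypothetical, and in practice the density ratio usually has to be estimated rather than computed exactly.

```python
# Covariate-shift correction: reweight training examples by
# w(x) = p_test(x) / p_train(x) inside a weighted least-squares loss.
import numpy as np

rng = np.random.default_rng(1)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Assumed setup: training inputs from N(0, 1), test inputs around N(2, 0.5).
x_train = rng.normal(0.0, 1.0, size=2000)
y_train = np.sin(x_train) + 0.1 * rng.normal(size=x_train.size)

w = normal_pdf(x_train, 2.0, 0.5) / normal_pdf(x_train, 0.0, 1.0)

# Weighted least squares for a linear fit y ~ a*x + b: each example's
# squared error is scaled by its importance weight (sqrt-weight trick).
A = np.stack([x_train, np.ones_like(x_train)], axis=1)
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(A * sw[:, None], y_train * sw, rcond=None)
print(coef)  # slope and intercept tuned for the test region around x = 2
```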
Techniques to manage variance and degeneracy of the weights include self-normalized importance weighting, where the weights are normalized to sum to one and the estimate becomes sum_i w(x_i) f(x_i) / sum_i w(x_i); this introduces a small bias but often reduces variance substantially. Truncating (clipping) extreme weights and monitoring the effective sample size are common complementary safeguards, as illustrated below.
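A minimal sketch of the self-normalized estimator and the effective-sample-size diagnostic follows; the distributions reuse the illustrative Gaussians assumed above and are not from the source.

```python
# Self-normalized importance weighting and an effective-sample-size check.
import numpy as np

rng = np.random.default_rng(2)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = rng.normal(0.0, 2.0, size=50_000)                   # samples from Q = N(0, 2)
w = normal_pdf(x, 1.0, 1.0) / normal_pdf(x, 0.0, 2.0)   # unnormalized weights p/q

f = x ** 2
self_normalized = np.sum(w * f) / np.sum(w)  # sum_i w_i f_i / sum_i w_i
print(self_normalized)                        # approximately E_P[f] = 2.0

# Effective sample size: close to n when weights are nearly uniform,
# close to 1 when a few weights dominate (a sign of weight degeneracy).
ess = np.sum(w) ** 2 / np.sum(w ** 2)
print(ess)
```

The self-normalized form is also the natural choice when p and q are known only up to normalizing constants, since any constant factor cancels in the ratio.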
Related concepts include Monte Carlo integration, covariate shift, off-policy evaluation, and domain adaptation. The approach is closely related to importance sampling in Monte Carlo methods and underlies many modern corrections for distribution shift.