f-divergences
f-divergences (also written F-divergences) are a family of statistical distances between probability measures. Let (X, F) be a measurable space and let P and Q be probability measures on it, with P absolutely continuous with respect to Q. Let f: (0, ∞) → R be a convex function satisfying f(1) = 0. The f-divergence of P from Q is defined by D_f(P || Q) = ∫ f(dP/dQ) dQ = E_Q[f(dP/dQ)], where dP/dQ denotes the Radon–Nikodym derivative. In the discrete case, with probability mass functions p and q, this becomes D_f(P || Q) = ∑_x q(x) f(p(x)/q(x)). When P is not absolutely continuous with respect to Q, the usual convention sets D_f(P || Q) = ∞; this is forced whenever f grows superlinearly (as for f(t) = t log t), although choices of f with a finite slope at infinity admit a finite extension.
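As a concrete illustration of the discrete formula, the sketch below computes D_f(P || Q) for distributions given as probability arrays. It is a minimal sketch, not a library routine: the name f_divergence and the treatment of zero-probability points (returning ∞ when P is not absolutely continuous with respect to Q, and restricting the sum to the support of Q) are choices made for this example.

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(P || Q) = sum_x q(x) * f(p(x) / q(x)) for discrete P, Q.

    p, q: arrays of probabilities summing to 1.
    f: vectorized convex function with f(1) = 0, defined at 0
       by its right-hand limit (e.g. 0 * log 0 = 0 for KL).
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.any((q == 0) & (p > 0)):
        return np.inf            # P not absolutely continuous w.r.t. Q
    mask = q > 0                 # restrict the sum to the support of Q
    return float(np.sum(q[mask] * f(p[mask] / q[mask])))
```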
Common choices of f yield well-known divergences. For example, f(t) = t log t gives the Kullback–Leibler divergence D_KL(P || Q); f(t) = −log t gives the reverse Kullback–Leibler divergence D_KL(Q || P); f(t) = (t − 1)² gives the Pearson χ² divergence; f(t) = ½|t − 1| gives the total variation distance; and f(t) = (√t − 1)² gives the squared Hellinger distance (up to a conventional factor of ½).
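Continuing the sketch above, these choices of f can be plugged in directly; the generator names below are illustrative, and f_kl is written so that its value at 0 equals the limit 0 · log 0 = 0.

```python
# Standard generator functions, vectorized and defined at t = 0 by limits.
f_kl   = lambda t: np.where(t > 0, t * np.log(np.where(t > 0, t, 1.0)), 0.0)
f_chi2 = lambda t: (t - 1.0) ** 2           # Pearson chi-squared
f_tv   = lambda t: 0.5 * np.abs(t - 1.0)    # total variation
f_hell = lambda t: (np.sqrt(t) - 1.0) ** 2  # squared Hellinger (no 1/2 factor)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, f_kl))   # KL(P || Q), in nats
print(f_divergence(p, q, f_tv))   # total variation distance, here 0.1
```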
Key properties include non-negativity and identity of indiscernibles: D_f(P || Q) ≥ 0, with equality if and only if P = Q (the "only if" direction requires f to be strictly convex at t = 1). Non-negativity follows from Jensen's inequality: E_Q[f(dP/dQ)] ≥ f(E_Q[dP/dQ]) = f(1) = 0. f-divergences also satisfy a data-processing inequality: applying the same Markov kernel (stochastic map) to both P and Q cannot increase D_f.
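The first two properties are easy to check numerically with the sketch above; the Dirichlet sampling, seed, and support size here are arbitrary choices made for the test.

```python
rng = np.random.default_rng(0)
for _ in range(5):
    p = rng.dirichlet(np.ones(4))   # random distribution on 4 points
    q = rng.dirichlet(np.ones(4))
    assert f_divergence(p, q, f_kl) >= 0.0          # non-negativity
    assert abs(f_divergence(p, p, f_kl)) < 1e-12    # D_f(P || P) = 0
```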
Applications span statistics, information theory, and machine learning. f-divergences are used to quantify discrepancies between models, to construct robust estimation and hypothesis-testing procedures, and to define training objectives for generative models.