Tail lower bounds
Tail lower bounds are inequalities that establish a nontrivial lower bound on the probability that a random variable deviates from its typical value by at least a specified amount. They are used to guarantee that rare events have at least a minimum likelihood, in contrast to the more familiar upper tail bounds, which restrict how unlikely such events can be. In formal terms, for a nonnegative random variable X and a threshold t, a tail lower bound asserts that P(X ≥ t) ≥ g(t) for some function g(t) that is positive on a region of interest. Analogous statements may be made for the lower tail, P(X ≤ t) ≥ h(t), for thresholds t below a central value.
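As a concrete illustration (a standard fact stated here only as an example), a random variable that is bounded above admits a simple tail lower bound of this form, sometimes called a reverse Markov inequality; a sketch of the statement and its one-line derivation follows.

```latex
% Reverse Markov inequality: a simple instance of a tail lower bound
% of the form P(X >= t) >= g(t).
% Assume 0 <= X <= b almost surely and t < E[X].  Since
%   E[X] <= t*P(X <= t) + b*P(X > t) = t + (b - t)*P(X > t),
% rearranging gives the bound below.
\[
  \Pr(X > t) \;\ge\; \frac{\mathbb{E}[X] - t}{b - t},
  \qquad 0 \le X \le b \ \text{a.s.}, \quad t < \mathbb{E}[X].
\]
```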
Common tools for deriving tail lower bounds include moment-based inequalities such as the Paley–Zygmund inequality, which, for a nonnegative random variable Z with finite, nonzero second moment and any θ ∈ [0, 1], states that P(Z > θE[Z]) ≥ (1 − θ)² E[Z]² / E[Z²].
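The sketch below checks the Paley–Zygmund bound numerically by Monte Carlo; the choice of the Exponential(1) distribution, the parameter θ = 0.5, and the sample size are illustrative assumptions rather than part of the text above, and the moments are estimated from the same samples.

```python
# Illustrative Monte Carlo check of the Paley–Zygmund lower bound.
# The Exponential(1) distribution is an assumed example: E[Z] = 1, E[Z^2] = 2.
import random

def paley_zygmund_check(theta=0.5, n_samples=200_000, seed=0):
    rng = random.Random(seed)
    # Draw samples Z ~ Exponential(1).
    z = [rng.expovariate(1.0) for _ in range(n_samples)]
    # Estimate the first and second moments empirically.
    mean = sum(z) / n_samples
    second_moment = sum(x * x for x in z) / n_samples
    # Empirical tail probability P(Z > theta * E[Z]).
    tail = sum(x > theta * mean for x in z) / n_samples
    # Paley–Zygmund lower bound: (1 - theta)^2 * E[Z]^2 / E[Z^2].
    bound = (1 - theta) ** 2 * mean ** 2 / second_moment
    return tail, bound

if __name__ == "__main__":
    tail, bound = paley_zygmund_check()
    print(f"empirical P(Z > 0.5*E[Z]) ≈ {tail:.3f}, Paley–Zygmund bound ≈ {bound:.3f}")
```

For Exponential(1) with θ = 0.5 the true tail probability is e^(−0.5) ≈ 0.61, while the bound evaluates to (0.5)² · 1 / 2 = 0.125, illustrating that the inequality gives a guaranteed but typically loose lower bound.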
Applications of tail lower bounds appear in risk assessment, reliability engineering, and the analysis of randomized algorithms, where they certify that an event of interest, such as a large loss, a component failure, or a successful run of a procedure, occurs with at least a prescribed probability.
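As a sketch of the randomized-algorithms use case: if a tail lower bound certifies that a single run succeeds with probability at least p, then k independent runs all fail with probability at most (1 − p)^k. The helper below, whose name and numbers are purely illustrative, computes this standard amplification bound.

```python
# Sketch: using a tail lower bound in the analysis of a randomized algorithm.
# If one run succeeds with probability at least p (for example, p = g(t) from
# a tail lower bound), then k independent runs all fail with probability at
# most (1 - p)**k.  The function name and the sample numbers are hypothetical.
def failure_probability_after_repeats(p_success_lower_bound: float, k: int) -> float:
    """Upper bound on the probability that all k independent runs fail."""
    return (1.0 - p_success_lower_bound) ** k

if __name__ == "__main__":
    p = 0.125  # e.g. the Paley–Zygmund bound from the example above
    for k in (1, 10, 50):
        print(k, failure_probability_after_repeats(p, k))
```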
See also: tail bounds, Chernoff bounds, Hoeffding inequalities, Paley–Zygmund inequality, large deviations.