Low-discrepancy sequence
Low-discrepancy refers to a property of certain deterministic sequences designed to fill the unit hypercube more uniformly than typical random samples. Such sequences underpin quasi-Monte Carlo methods for approximating integrals and for sampling in multiple dimensions. The goal is to minimize discrepancy, a quantitative measure of how far the empirical distribution of the sample points deviates from the uniform distribution.
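As an illustrative sketch (not drawn from the text above), the snippet below assumes SciPy's scipy.stats.qmc module is available; it compares a Sobol-based quasi-Monte Carlo estimate of a simple integral over the unit square with a plain Monte Carlo estimate of the same quantity.

```python
# Minimal quasi-Monte Carlo integration sketch (assumes SciPy >= 1.7).
import numpy as np
from scipy.stats import qmc

def f(x):
    # Smooth test integrand on the unit square; its exact integral is 1.
    return np.prod(1.5 * np.sqrt(x), axis=1)

dims, n = 2, 1 << 12                               # Sobol works best with power-of-two sample counts
sobol = qmc.Sobol(d=dims, scramble=True, seed=0)
x_qmc = sobol.random(n)                            # low-discrepancy points in [0, 1)^dims
x_mc = np.random.default_rng(0).random((n, dims))  # pseudorandom points for comparison

print("QMC estimate:", f(x_qmc).mean())            # typically closer to the true value 1
print("MC  estimate:", f(x_mc).mean())
```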
Discrepancy is commonly measured by the star discrepancy, D*_N, which captures the largest difference between the fraction of the first N points that fall inside an axis-aligned box anchored at the origin and the volume of that box, taken over all such boxes.
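Written out (this is the standard definition, stated here in notation consistent with D*_N above), the star discrepancy of the first N points x_1, ..., x_N in [0,1)^s is:

```latex
D_N^{*} \;=\; \sup_{(a_1,\dots,a_s) \in [0,1]^s}
\left|
  \frac{\#\{\, n \le N : \mathbf{x}_n \in [0,a_1) \times \cdots \times [0,a_s) \,\}}{N}
  \;-\; \prod_{j=1}^{s} a_j
\right|
```

By the Koksma-Hlawka inequality, the quasi-Monte Carlo integration error is bounded by the star discrepancy times the variation of the integrand, and classical low-discrepancy constructions achieve D*_N on the order of (log N)^s / N.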
Notable examples of low-discrepancy sequences include Halton sequences, Sobol sequences, and Niederreiter sequences, as well as the Faure sequence and the one-dimensional van der Corput sequence on which the Halton construction is built.
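As a concrete illustration of one such construction (a from-scratch sketch, not tied to any particular library), the Halton sequence applies the van der Corput radical inverse in a distinct coprime base per dimension:

```python
# From-scratch Halton sequence via the radical-inverse (van der Corput) construction.

def radical_inverse(n, base):
    """Reflect the base-`base` digits of n about the radix point: ...d2 d1 d0 -> 0.d0 d1 d2..."""
    inv, denom = 0.0, 1.0
    while n > 0:
        n, digit = divmod(n, base)
        denom *= base
        inv += digit / denom
    return inv

def halton(n_points, bases=(2, 3)):
    """First n_points of the Halton sequence, one coprime base per dimension."""
    return [[radical_inverse(i, b) for b in bases] for i in range(1, n_points + 1)]

# The first few 2-D Halton points: (1/2, 1/3), (1/4, 2/3), (3/4, 1/9), ...
for point in halton(4):
    print(point)
```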
Applications span numerical integration in finance, physics, and engineering; computer graphics for global illumination and rendering; and the design of computer experiments, where space-filling point sets give better coverage of the input space than purely random designs.