Rényi
Alfréd Rényi (14 August 1921 – 1 February 1970) was a Hungarian mathematician who made foundational contributions to probability theory and information theory. He is best known for introducing Rényi entropy and Rényi divergence, parameterized families of information measures that generalize Shannon entropy and the Kullback–Leibler divergence; the order parameter controls how strongly events of high or low probability are weighted. The Rényi entropy of order alpha (alpha > 0, alpha ≠ 1) of a discrete distribution P is defined as H_alpha(P) = (1/(1 - alpha)) log sum_i p_i^alpha. As alpha approaches 1 it recovers Shannon entropy; small alpha weights rare events more heavily, while large alpha is dominated by the most probable outcomes. The Rényi divergence D_alpha(P||Q) = (1/(alpha - 1)) log sum_i p_i^alpha q_i^(1 - alpha) likewise generalizes the KL divergence, which it recovers as alpha approaches 1.
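The definitions above can be checked numerically. The following is a minimal sketch in Python (the function names are illustrative, not from any standard library); it computes the Rényi entropy and divergence directly from the formulas and shows that values near alpha = 1 approach the Shannon entropy.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1),
    H_alpha(P) = (1/(1 - alpha)) * log(sum_i p_i^alpha), natural log."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha,
    D_alpha(P||Q) = (1/(alpha - 1)) * log(sum_i p_i^alpha * q_i^(1 - alpha))."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1.0)

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum_i p_i log p_i (the alpha -> 1 limit)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]

# Order-2 (collision) entropy: -log(sum p_i^2)
print(renyi_entropy(p, 2.0))

# As alpha -> 1, the Rényi entropy converges to the Shannon entropy.
print(abs(renyi_entropy(p, 1.0001) - shannon_entropy(p)))
```

A convenient sanity check of the two formulas together: against the uniform distribution on n outcomes, D_alpha(P||U) = log n − H_alpha(P), since the q_i^(1−alpha) factor pulls out as n^(alpha−1).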
Rényi also contributed to probability theory and stochastic processes, and with Paul Erdős he introduced the Erdős–Rényi model of random graphs, a cornerstone of random graph theory.
He studied at Eötvös Loránd University and spent much of his career in Hungary; his ideas influenced later work in information theory, statistics, and combinatorics.