Energy-based
Energy-based, in the context of machine learning and statistics, refers to energy-based modeling (EBM). EBMs are a class of probabilistic models that define a scalar energy function E(x) over data configurations x, with lower energy indicating more plausible or likely configurations. The probability of a configuration is proportional to the exponential of the negative energy, P(x) ∝ exp(-E(x)/T), where T is a temperature parameter and Z = ∑_x exp(-E(x)/T) (or an integral in continuous spaces) is the normalization constant, also called the partition function. In many practical problems Z is intractable, so training and inference rely on approximate methods.
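For a small discrete space, the relationship between energies and probabilities can be computed exactly. The following minimal sketch (the function name and example energies are illustrative, not from any specific system) normalizes a list of energies into a Boltzmann distribution:

```python
import math

def boltzmann_probs(energies, T=1.0):
    """Convert scalar energies into normalized probabilities
    P(x) = exp(-E(x)/T) / Z, where Z is the partition function."""
    weights = [math.exp(-e / T) for e in energies]
    Z = sum(weights)                      # partition function
    return [w / Z for w in weights]

# Lower energy -> higher probability; the probabilities sum to 1.
probs = boltzmann_probs([0.0, 1.0, 2.0], T=1.0)
```

Explicit enumeration like this is only feasible for tiny configuration spaces; it is the intractability of Z for realistic spaces that motivates the approximate methods discussed below.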
In EBMs the energy function can be parameterized by neural networks or other flexible function approximators. This gives the framework considerable flexibility: any function that maps a configuration to a single scalar can serve as an energy, without the architectural constraints that explicitly normalized probabilistic models impose.
Training EBMs typically involves methods that bypass exact computation of Z. Common techniques include contrastive divergence, score matching, and noise-contrastive estimation, as well as sampling-based approximations of the log-likelihood gradient using Markov chain Monte Carlo (MCMC).
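Contrastive divergence can be illustrated on a tiny binary restricted Boltzmann machine. The sketch below implements one CD-1 update (positive phase from the data, negative phase from a single Gibbs reconstruction); the layer sizes, learning rate, and training pattern are illustrative assumptions:

```python
import math, random

random.seed(0)
sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))

# Tiny binary RBM; sizes and learning rate are illustrative.
n_v, n_h, lr = 4, 3, 0.1
W = [[random.gauss(0, 0.1) for _ in range(n_h)] for _ in range(n_v)]
b = [0.0] * n_v   # visible biases
c = [0.0] * n_h   # hidden biases

def hidden_probs(v):
    return [sigmoid(c[j] + sum(v[i] * W[i][j] for i in range(n_v)))
            for j in range(n_h)]

def visible_probs(h):
    return [sigmoid(b[i] + sum(h[j] * W[i][j] for j in range(n_h)))
            for i in range(n_v)]

def sample(probs):
    return [1.0 if random.random() < p else 0.0 for p in probs]

def cd1_step(v0):
    """One CD-1 update: the negative phase uses a single Gibbs
    reconstruction instead of an exact sample from the model."""
    ph0 = hidden_probs(v0)           # positive phase
    h0 = sample(ph0)
    v1 = sample(visible_probs(h0))   # reconstruction
    ph1 = hidden_probs(v1)           # negative phase
    for i in range(n_v):
        for j in range(n_h):
            W[i][j] += lr * (v0[i] * ph0[j] - v1[i] * ph1[j])
    for i in range(n_v):
        b[i] += lr * (v0[i] - v1[i])
    for j in range(n_h):
        c[j] += lr * (ph0[j] - ph1[j])

data = [1.0, 0.0, 1.0, 0.0]   # a single illustrative training pattern
for _ in range(50):
    cd1_step(data)
```

After a few dozen updates, reconstructing the training pattern through the hidden layer assigns higher probability to its active units, even though Z was never computed.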
Key examples in the history of EBMs include Hopfield networks, Boltzmann machines, and Restricted Boltzmann Machines. Modern deep EBMs parameterize the energy with deep neural networks and typically draw approximate samples with gradient-based MCMC methods such as Langevin dynamics.
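The Langevin dynamics used by modern deep EBMs can be sketched in one dimension. The example below samples approximately from P(x) ∝ exp(-E(x)) for an illustrative double-well energy (the energy, step size, and step count are assumptions chosen for demonstration, and the unadjusted update carries discretization bias):

```python
import math, random

random.seed(0)

# Illustrative 1-D energy with low-energy modes at x = +1 and x = -1.
def E(x):
    return (x * x - 1.0) ** 2

def dE(x):
    return 4.0 * x * (x * x - 1.0)

def langevin_sample(steps=2000, eps=0.01):
    """Unadjusted Langevin dynamics: follow the negative energy
    gradient plus Gaussian noise, approximating P(x) ~ exp(-E(x))
    without ever computing the partition function Z."""
    x = random.gauss(0.0, 1.0)
    for _ in range(steps):
        noise = random.gauss(0.0, 1.0)
        x = x - 0.5 * eps * dE(x) + math.sqrt(eps) * noise
    return x

samples = [langevin_sample() for _ in range(20)]
# Samples concentrate near the low-energy regions around x = +/-1.
```

Only the gradient of the energy appears in the update, which is why this family of samplers pairs naturally with neural-network energies trained by backpropagation.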