RBM
RBM stands for Restricted Boltzmann Machine, a generative stochastic neural network that learns a probability distribution over its inputs. It consists of two layers: a visible layer representing observed data and a hidden layer that captures dependencies. The network is a bipartite graph with no connections within a layer, and the units are typically binary stochastic neurons. The joint distribution over visible and hidden units is defined by an energy function E(v,h) = -a^T v - b^T h - v^T W h, where W is the weight matrix and a, b are biases. Because there are no intra-layer connections, the conditional distributions p(h|v) and p(v|h) are tractable, enabling efficient Gibbs sampling.
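The tractable conditionals described above can be sketched directly: because the graph is bipartite, p(h_j=1|v) = sigmoid(b_j + v^T W_{:,j}) and p(v_i=1|h) = sigmoid(a_i + W_{i,:} h), and each layer can be sampled in parallel. Below is a minimal NumPy sketch of one Gibbs step (v -> h -> v); the dimensions, initialization, and variable names are illustrative, not a reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_h_given_v(v, W, b):
    """p(h_j = 1 | v) = sigmoid(b_j + v^T W[:, j]); sample all hidden units in parallel."""
    p = sigmoid(b + v @ W)
    return p, (rng.random(p.shape) < p).astype(float)

def sample_v_given_h(h, W, a):
    """p(v_i = 1 | h) = sigmoid(a_i + W[i, :] h); sample all visible units in parallel."""
    p = sigmoid(a + h @ W.T)
    return p, (rng.random(p.shape) < p).astype(float)

# Toy sizes: 6 visible units, 3 hidden units (illustrative only)
n_v, n_h = 6, 3
W = 0.1 * rng.standard_normal((n_v, n_h))  # weight matrix
a = np.zeros(n_v)                          # visible biases
b = np.zeros(n_h)                          # hidden biases

v0 = rng.integers(0, 2, size=n_v).astype(float)  # a binary visible vector
ph0, h0 = sample_h_given_v(v0, W, b)
pv1, v1 = sample_v_given_h(h0, W, a)  # one full Gibbs step: v -> h -> v
```

Because no units within a layer are connected, each conditional factorizes over units, which is why a single matrix product plus an elementwise sigmoid suffices per layer.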
RBMs are trained to maximize the likelihood of the training data. Exact maximum likelihood is intractable because the gradient requires an expectation under the model distribution, whose partition function sums over exponentially many configurations. In practice, training uses Contrastive Divergence (CD-k), which approximates this expectation with k steps of Gibbs sampling initialized at the data, commonly with k = 1.
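The CD-1 update contrasts statistics computed on the data (positive phase) with statistics from a one-step reconstruction (negative phase). A minimal NumPy sketch of a single update for one binary input vector follows; the learning rate, sizes, and input are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, a, b, lr=0.1):
    """One CD-1 parameter update for a single binary visible vector v0."""
    # Positive phase: hidden probabilities and a sample, given the data
    ph0 = sigmoid(b + v0 @ W)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to a visible reconstruction
    pv1 = sigmoid(a + h0 @ W.T)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(b + v1 @ W)
    # Gradient approximation: <v h>_data - <v h>_reconstruction
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    a += lr * (v0 - v1)
    b += lr * (ph0 - ph1)
    return W, a, b

n_v, n_h = 6, 3
W = 0.01 * rng.standard_normal((n_v, n_h))
a = np.zeros(n_v)
b = np.zeros(n_h)
v0 = np.array([1.0, 0.0, 1.0, 1.0, 0.0, 0.0])
W, a, b = cd1_update(v0, W, a, b)
```

In practice updates are averaged over mini-batches and probabilities (rather than samples) are often used in the final statistics to reduce variance, but the contrast between data-driven and reconstruction-driven correlations is the core of the method.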
Applications encompass unsupervised feature learning, dimensionality reduction, collaborative filtering, and data denoising. In modern practice, RBMs are used far less often, having been largely superseded by variational autoencoders and other deep generative models, though they remain historically significant as the building blocks of deep belief networks and as an early driver of the deep learning revival.
Limitations include relatively slow training for large datasets, sensitivity to hyperparameters such as the learning rate and number of Gibbs steps, and scalability challenges compared with modern architectures trained end-to-end by backpropagation.