minibatches
Minibatches are subsets of a training dataset used to update model parameters during an iteration of optimization. They lie between full-batch updates and single-sample updates, and are central to mini-batch gradient descent. By processing several examples at once, minibatches enable vectorized computation and more stable gradient estimates than stochastic updates, while avoiding the memory burden of a full batch.
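As a rough sketch (with a hypothetical toy dataset, learning rate, and batch size), the loop below performs mini-batch gradient descent for linear regression in NumPy, estimating the gradient from one minibatch at a time:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy regression data: y = X @ w_true + noise
X = rng.normal(size=(1024, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=1024)

w = np.zeros(5)      # model parameters
lr = 0.1             # learning rate (illustrative choice)
batch_size = 64      # minibatch size (illustrative choice)

for epoch in range(20):
    # Shuffle once per epoch, then walk through the data in minibatches.
    order = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        xb, yb = X[idx], y[idx]
        # Gradient of the mean squared error, estimated on this minibatch only.
        grad = 2.0 * xb.T @ (xb @ w - yb) / len(xb)
        w -= lr * grad
```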
Typical minibatch sizes range from small (16–32) to larger (256–512), with common defaults such as 32, 64, or 128.
In practice, data are shuffled and divided into minibatches each epoch. In many frameworks you specify a batch size and a data loader handles the shuffling and batching for you.
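As one illustration, PyTorch's DataLoader takes a batch_size argument and can re-shuffle the data every epoch; the tensor shapes and batch size used here are arbitrary:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Wrap tensors in a dataset; the loader shuffles each epoch and
# yields minibatches of the requested size.
X = torch.randn(1024, 5)
y = torch.randn(1024)
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

for epoch in range(3):
    for xb, yb in loader:   # xb is one minibatch of 64 examples
        ...                 # forward pass, loss, backward pass, optimizer step
```

If the dataset size is not divisible by the batch size, the final minibatch of each epoch is simply smaller (or can be dropped, depending on the loader's settings).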
Minibatches are widely used in deep learning and other machine learning settings. They enable efficient parallel computation on hardware such as GPUs and other accelerators.
See also: stochastic gradient descent, batch gradient descent, batch normalization, data loader.
---