minibatch
Minibatch refers to a method in machine learning where the training data is divided into small batches used to compute gradient estimates. It sits between full-batch gradient descent, which uses the entire dataset to compute a gradient, and stochastic gradient descent, which uses a single example. Minibatch gradient descent uses a batch of examples per iteration, balancing computational efficiency with gradient variance.
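The difference between the three regimes is easiest to see in how the gradient estimate is computed. Below is a minimal sketch (not from the text) using least-squares regression on synthetic data; the array shapes, batch size of 64, and the grad helper are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: compare full-batch, minibatch, and single-example
# gradient estimates for a least-squares loss on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))            # 1000 examples, 5 features
true_w = rng.normal(size=5)
y = X @ true_w + 0.1 * rng.normal(size=1000)
w = np.zeros(5)                           # current parameters

def grad(Xb, yb, w):
    """Gradient of mean squared error over the batch (Xb, yb)."""
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(yb)

full_grad = grad(X, y, w)                 # full batch: exact gradient, one pass over all data
idx = rng.choice(len(X), size=64, replace=False)
mini_grad = grad(X[idx], y[idx], w)       # minibatch of 64: noisier but much cheaper estimate
i = rng.integers(len(X))
sgd_grad = grad(X[i:i+1], y[i:i+1], w)    # single example: cheapest, highest variance
```

All three are unbiased estimates of the same full-dataset gradient; they differ only in variance and in the cost of computing one update.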
Typical minibatch sizes range from 16 to 256 examples, with 32, 64, and 128 commonly used. The choice is usually driven by memory limits and hardware throughput; powers of two are popular because they map cleanly onto GPU memory and vectorized operations.
Training with minibatches generally proceeds by shuffling the training data, partitioning it into batches, and iterating over the batches, computing a gradient estimate and updating the parameters on each one. A full pass over all batches is called an epoch, and the data is typically reshuffled at the start of each epoch, as in the sketch below.
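The following is a minimal, self-contained sketch of that loop for least-squares linear regression; the data, learning rate, batch size, and epoch count are illustrative assumptions rather than recommended settings.

```python
import numpy as np

# Minimal sketch of a minibatch training loop: shuffle, partition into
# batches, and update the parameters on each batch.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=1000)

w = np.zeros(5)
batch_size, lr, num_epochs = 64, 0.01, 10
n = len(X)

for epoch in range(num_epochs):
    perm = rng.permutation(n)                      # reshuffle the data each epoch
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]       # indices of one minibatch
        Xb, yb = X[idx], y[idx]
        g = 2.0 * Xb.T @ (Xb @ w - yb) / len(yb)   # mean-squared-error gradient on the batch
        w -= lr * g                                # parameter update
```

In practice the gradient and update steps are handled by a framework's autodiff and optimizer, but the shuffle-partition-iterate structure is the same.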
Trade-offs include the impact of batch size on convergence and generalization. Very large batches require more memory per step and yield lower-variance gradient estimates, but they have been observed to converge to solutions that generalize less well unless the learning rate and other hyperparameters are adjusted accordingly. Very small batches give noisier gradients and make poorer use of parallel hardware, though the added noise can sometimes help escape poor local minima.