Momentum-based
Momentum-based is a term used to describe methods and models that incorporate a momentum component to guide iterative updates. The idea borrows from physics, where momentum represents inertia; in computational contexts, momentum is a running aggregate of past updates or gradients that influences the current step. In optimization and machine learning, momentum-based methods maintain a velocity vector that blends information from recent gradients. A typical form uses v_t = beta*v_{t-1} + (1-beta)*g_t, followed by the parameter update theta_t = theta_{t-1} - alpha*v_t, where alpha is the learning rate and beta in [0,1) controls how long past gradients are remembered. This mechanism smooths noisy gradients, accelerates progress along consistent descent directions, and damps oscillations perpendicular to those directions.
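A minimal sketch in Python of this update rule on a toy quadratic objective; the names here (momentum_descent, grad, theta0) are illustrative and do not come from any particular library:

    import numpy as np

    def momentum_descent(grad, theta0, alpha=0.1, beta=0.9, steps=300):
        # Gradient descent with momentum, using the EMA form from the text:
        #   v_t     = beta*v_{t-1} + (1-beta)*g_t
        #   theta_t = theta_{t-1} - alpha*v_t
        theta = np.asarray(theta0, dtype=float)
        v = np.zeros_like(theta)           # velocity starts at rest
        for _ in range(steps):
            g = grad(theta)                # current gradient g_t
            v = beta * v + (1 - beta) * g  # blend past velocity with new gradient
            theta = theta - alpha * v      # step along the smoothed direction
        return theta

    # Toy objective: f(theta) = 0.5 * theta^T A theta with an ill-conditioned A;
    # momentum damps the oscillation along the steep axis.
    A = np.diag([10.0, 1.0])
    print(momentum_descent(lambda th: A @ th, theta0=[5.0, 5.0]))  # near the origin

With beta = 0 this reduces to plain gradient descent; larger beta gives the velocity a longer memory of past gradients.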
Momentum-based approaches are widely used in neural network training, convex optimization, and signal processing. They can accelerate convergence on ill-conditioned problems and reduce sensitivity to gradient noise.
Common momentum-based algorithms include gradient descent with momentum and Nesterov accelerated gradient, with adaptive variants such as Adam that combine a momentum-style gradient average with per-parameter step sizes; a sketch of the Nesterov variant follows.
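A minimal sketch of Nesterov accelerated gradient, in one common formulation that evaluates the gradient at a lookahead point; as above, the names and the toy objective are illustrative assumptions:

    import numpy as np

    def nesterov_descent(grad, theta0, alpha=0.1, mu=0.9, steps=200):
        # Nesterov accelerated gradient: take the gradient at the
        # "lookahead" point theta + mu*v before applying the update.
        theta = np.asarray(theta0, dtype=float)
        v = np.zeros_like(theta)
        for _ in range(steps):
            g = grad(theta + mu * v)  # gradient at the anticipated position
            v = mu * v - alpha * g    # accumulate the corrected velocity
            theta = theta + v
        return theta

    A = np.diag([10.0, 1.0])
    print(nesterov_descent(lambda th: A @ th, theta0=[5.0, 5.0]))  # near the origin

The lookahead evaluation is what distinguishes Nesterov's method from classical momentum: the gradient partially anticipates where the velocity is about to carry the parameters.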
See also: momentum (physics), optimization, stochastic gradient descent, Nesterov momentum, Adam.