Gradientiiven
Gradientiiven is a theoretical framework in optimization and applied mathematics that describes a family of iterative methods for finding minima of differentiable functions. The central idea is to modify the traditional gradient descent update by making the step not only a function of the gradient but also of an internal gradient-influence signal that reflects local geometry and gradient variability. The aim is to improve stability and convergence in non-convex, noisy, or ill-conditioned landscapes.
In a gradientiiven method, the update at iteration t takes the form x_{t+1} = x_t - α_t G_t, where α_t is the step size and G_t is a search direction that combines the raw gradient ∇f(x_t) with the gradient-influence signal.
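The framework leaves the gradient-influence signal abstract, so any concrete code must pick a particular instantiation. As an illustrative sketch only, the signal below is modeled as an exponential moving average of squared gradient components (tracking gradient variability, as the description suggests), which then rescales the step; the function name and parameter choices are assumptions, not part of the framework.

```python
import numpy as np

def gradientiiven_step(x, grad_fn, v, alpha=0.1, beta=0.9, eps=1e-8):
    """One illustrative gradientiiven update.

    v is a hypothetical choice of gradient-influence signal: an
    exponential moving average of squared gradient components.
    """
    g = grad_fn(x)
    v = beta * v + (1 - beta) * g ** 2      # signal tracks gradient variability
    G = g / (np.sqrt(v) + eps)              # direction modulated by the signal
    return x - alpha * G, v

# Minimize f(x) = x^2 starting from x = 5.0.
x, v = np.array([5.0]), np.zeros(1)
for _ in range(200):
    x, v = gradientiiven_step(x, lambda z: 2 * z, v)
# x is driven close to the minimum at 0
```

With this choice the scaling damps steps in directions whose gradients fluctuate strongly, which is one way to realize the stability goal stated above.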
Common variants include gradientiiven with adaptive scaling, gradientiiven with momentum, and stochastic gradientiiven for large-scale data.
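A momentum variant can be sketched by accumulating the influence-modulated direction into a velocity term. This formulation is hypothetical (the source does not specify the variant's equations); it reuses the same assumed moving-average signal as above.

```python
import numpy as np

def gradientiiven_momentum_step(x, grad_fn, m, v,
                                alpha=0.05, beta1=0.9, beta2=0.99, eps=1e-8):
    """Illustrative gradientiiven-with-momentum update (assumed form).

    m accumulates the influence-scaled direction; v is the assumed
    gradient-influence signal (EMA of squared gradient components).
    """
    g = grad_fn(x)
    v = beta2 * v + (1 - beta2) * g ** 2      # influence signal
    m = beta1 * m + g / (np.sqrt(v) + eps)    # momentum on the modulated direction
    return x - alpha * m, m, v

# One step on f(x) = x^2 from x = 5.0.
x1, m, v = gradientiiven_momentum_step(
    np.array([5.0]), lambda z: 2 * z, np.zeros(1), np.zeros(1))
```

A stochastic version would simply replace grad_fn with a mini-batch gradient estimate; the update rule itself is unchanged.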
The concept has been explored in abstract optimization theory and in machine learning research as a generalization of classical gradient descent.