strictsteep
Strictsteep is a term used in numerical optimization to denote a conservative variant of the steepest descent method that imposes a strict decrease of the objective function at every iteration. In this framework, the search direction is the steepest descent direction p_k = -∇f(x_k). A backtracking line search is applied along this direction to locate a step size α_k that yields a strict decrease: f(x_k + α_k p_k) < f(x_k). If the line search fails to find any α_k within the allowed range, the algorithm may terminate with stagnation or switch to a different update rule.
The line search typically uses a reduction factor β in (0,1), shrinking a trial step as α ← βα, together with a decrease condition similar to the Armijo sufficient-decrease rule f(x_k + α p_k) ≤ f(x_k) + c α ∇f(x_k)ᵀ p_k, where c in (0,1) is a small constant. Because p_k = -∇f(x_k), the right-hand side lies strictly below f(x_k), so any accepted step also satisfies the strict-decrease requirement.
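A minimal sketch of such an iteration, assuming NumPy and an Armijo-style backtracking acceptance test; the function name strictsteep, the parameter names, and all defaults below are illustrative choices rather than a reference implementation:

import numpy as np

def strictsteep(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                max_iter=500, max_backtracks=50, tol=1e-8):
    """Steepest descent that insists on a strict (Armijo-style) decrease
    at every iteration; parameters are illustrative, not standardized."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:      # gradient small enough: stop
            return x
        p = -g                            # steepest descent direction
        alpha, fx = alpha0, f(x)
        # Backtracking: shrink alpha by beta until the sufficient-decrease
        # condition f(x + alpha p) <= f(x) + c * alpha * g.T p holds.
        for _ in range(max_backtracks):
            if f(x + alpha * p) <= fx + c * alpha * (g @ p):
                break
            alpha *= beta
        else:
            # No acceptable step within the allowed range: terminate with
            # stagnation (a different update rule could be tried instead).
            return x
        x = x + alpha * p
    return x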
Convergence properties of strictsteep follow those of gradient-based methods under standard assumptions: if f is continuously differentiable, bounded below, and has a Lipschitz-continuous gradient, then backtracking along p_k = -∇f(x_k) yields a sufficient decrease at every iteration, the per-step decreases are summable, and consequently ∇f(x_k) → 0, so every limit point of the iterates is a stationary point of f.
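As an illustration, the sketch above can be exercised on a convex quadratic, where these assumptions hold and the unique stationary point is the minimizer; the matrix and vector here are arbitrary test data:

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x    # gradient A @ x - b is Lipschitz continuous
grad = lambda x: A @ x - b
x_star = strictsteep(f, grad, x0=np.zeros(2))
print(np.linalg.norm(grad(x_star)))      # small residual: iterates approach A^{-1} b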
Strictsteep is related to other gradient methods that employ line searches, such as gradient descent with Armijo backtracking and steepest descent under the Wolfe conditions; the feature emphasized by the term is the requirement of a strict decrease at every iteration, together with explicit handling of the case in which no acceptable step exists.