SteepestDescentMethode
SteepestDescentMethode, commonly known as the steepest descent or gradient descent method, is a classical algorithm for unconstrained optimization of differentiable functions. The central idea is to move at each iteration in the direction in which the objective function decreases most rapidly, which under the Euclidean norm is the negative gradient.
Given a differentiable function f: R^n → R and a current point x_k, the descent direction is p_k = -∇f(x_k), and the next iterate is x_{k+1} = x_k + α_k p_k, where α_k > 0 is the step size.
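A minimal sketch of this iteration in Python (the function name steepest_descent, the fixed step size alpha, and the tolerance are illustrative assumptions, not the API of any particular library):

```python
import numpy as np

def steepest_descent(grad_f, x0, alpha=0.1, tol=1e-8, max_iter=1000):
    """Minimize f by repeatedly stepping along the negative gradient.

    grad_f: callable returning the gradient of f at a point.
    alpha:  fixed step size, assumed small enough for convergence.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        p = -grad_f(x)                # descent direction p_k = -∇f(x_k)
        if np.linalg.norm(p) < tol:   # gradient (nearly) zero: stationary point
            break
        x = x + alpha * p             # update x_{k+1} = x_k + α_k p_k
    return x

# Usage: minimize f(x) = (x_0 - 3)^2 + (x_1 + 1)^2
grad = lambda x: np.array([2 * (x[0] - 3), 2 * (x[1] + 1)])
print(steepest_descent(grad, [0.0, 0.0]))  # approaches [3, -1]
```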
For quadratic functions f(x) = (1/2) x^T A x - b^T x with a symmetric positive definite Hessian A, a closed-form expression determines the optimal step size: the gradient is g_k = A x_k - b, and exact line search yields α_k = (g_k^T g_k) / (g_k^T A g_k).
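A sketch of the quadratic case under these assumptions (A symmetric positive definite, f specified by A and b; the function name is again illustrative):

```python
import numpy as np

def steepest_descent_quadratic(A, b, x0, tol=1e-10, max_iter=1000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A,
    using the closed-form optimal step size at each iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = A @ x - b                    # gradient ∇f(x_k) = A x_k - b
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))  # exact line search: α_k = gᵀg / gᵀAg
        x = x - alpha * g                # step along the negative gradient
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x = steepest_descent_quadratic(A, b, np.zeros(2))
print(np.allclose(A @ x, b))             # True: the minimizer solves A x = b
```

Note that minimizing this quadratic is equivalent to solving the linear system A x = b, which is why steepest descent is also discussed as an iterative linear solver.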
Variants and enhancements include line search strategies such as Armijo backtracking (sketched below), fixed or adaptive step sizes, and preconditioning to improve convergence, which for plain steepest descent can be slow when the Hessian is ill-conditioned.
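As an example of a line search strategy, a minimal sketch of Armijo backtracking (the parameters alpha0, rho, and c are common default choices, assumed here for illustration):

```python
import numpy as np

def backtracking_step(f, grad_f, x, alpha0=1.0, rho=0.5, c=1e-4):
    """Take one steepest descent step, shrinking the step size until the
    Armijo sufficient-decrease condition holds."""
    g = grad_f(x)
    alpha = alpha0
    # Sufficient decrease: f(x - αg) ≤ f(x) - c·α·‖g‖²
    while f(x - alpha * g) > f(x) - c * alpha * (g @ g):
        alpha *= rho
    return x - alpha * g

f = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2
grad = lambda x: np.array([2 * (x[0] - 3), 2 * (x[1] + 1)])
x = np.zeros(2)
for _ in range(20):
    x = backtracking_step(f, grad, x)
print(x)  # approaches [3, -1]
```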