Gradient methods
Gradient methods, also known as gradient-based optimization methods, are a class of iterative algorithms used to find local minima or maxima of a function. They are fundamental in many fields, including machine learning, statistics, and operations research. The core idea is to move in the direction of the steepest descent or ascent of the function, which is determined by its gradient. The gradient is a vector of partial derivatives that indicates the direction and magnitude of the greatest rate of increase of the function at a particular point.
The simplest gradient method is called gradient descent. To minimize a function, gradient descent starts at an initial point and repeatedly takes a step in the direction of the negative gradient at the current point: x_{k+1} = x_k - η ∇f(x_k), where η > 0 is the step size (often called the learning rate). With a suitably small step size, each iteration decreases the function value, and the sequence converges toward a local minimum.
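The update rule above can be sketched in a few lines of Python. The objective f(x, y) = (x - 3)² + (y + 1)², the step size, and the iteration count below are illustrative choices, not part of any particular library's API:

```python
# Minimal gradient descent sketch: minimize f(x, y) = (x - 3)^2 + (y + 1)^2.
# The objective, step size, and iteration count are example choices.

def grad_f(x, y):
    # Gradient of f: partial derivatives with respect to x and y.
    return 2.0 * (x - 3.0), 2.0 * (y + 1.0)

def gradient_descent(x0, y0, step_size=0.1, num_steps=100):
    x, y = x0, y0
    for _ in range(num_steps):
        gx, gy = grad_f(x, y)
        # Step against the gradient: the direction of steepest descent.
        x -= step_size * gx
        y -= step_size * gy
    return x, y

x_min, y_min = gradient_descent(0.0, 0.0)
print(x_min, y_min)  # converges toward the minimizer (3, -1)
```

For this convex quadratic the iterates contract toward the unique minimum at (3, -1); on non-convex functions the same procedure only guarantees convergence to a stationary point, and the step size must be tuned to the problem.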
Variations of these basic methods exist to improve convergence speed and robustness. Examples include stochastic gradient descent (SGD), which estimates the gradient from a randomly chosen subset of the data rather than the full dataset; momentum methods, which accumulate past gradients to damp oscillations; and adaptive methods such as Adam, which scale the step size per parameter.
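As a concrete illustration of the stochastic variant, the following sketch fits the slope of a line through noiseless samples of y = 2x, using the gradient of the squared error at a single randomly drawn sample per step. The data, step size, and iteration count are assumptions made for the example:

```python
import random

# Illustrative stochastic gradient descent: fit w in the model y = w * x.
# Each step uses the gradient of (w*x - y)^2 at one randomly chosen sample.

data = [(float(x), 2.0 * x) for x in range(1, 11)]  # samples of y = 2x

def sgd(w0, step_size=0.01, num_steps=1000, seed=0):
    rng = random.Random(seed)
    w = w0
    for _ in range(num_steps):
        x, y = rng.choice(data)        # cheap single-sample gradient estimate
        grad = 2.0 * (w * x - y) * x   # d/dw of the squared error (w*x - y)^2
        w -= step_size * grad
    return w

w_hat = sgd(0.0)
print(w_hat)  # close to the true slope 2.0
```

Because each step touches only one sample, the per-iteration cost is independent of the dataset size, which is why SGD scales to large machine-learning problems; the price is a noisier path to the minimum.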