Subgradient-based methods
Subgradient-based methods are a class of optimization algorithms used to solve convex optimization problems, particularly when the objective function is not differentiable everywhere. They extend gradient-based methods, which require differentiable objectives, by replacing the gradient with a subgradient, its generalization to non-differentiable convex functions. A subgradient of a convex function f at a point x is any vector g satisfying f(y) ≥ f(x) + gᵀ(y − x) for all y; geometrically, it defines a global linear underestimator of f that touches the function at x.
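To make the definition concrete, here is a minimal sketch in Python (an illustrative example not present in the original text; the function f(x) = |x| and the choice g = 0.5 are assumptions for demonstration). It numerically checks the subgradient inequality at the kink x = 0, where any g in [−1, 1] is a valid subgradient of |x|.

```python
import numpy as np

def f(x):
    # f(x) = |x| is convex but not differentiable at x = 0.
    return abs(x)

x0 = 0.0
g = 0.5  # any g in [-1, 1] is a valid subgradient of |x| at 0

# Check f(y) >= f(x0) + g * (y - x0) on a grid of test points.
ys = np.linspace(-2.0, 2.0, 401)
assert all(f(y) >= f(x0) + g * (y - x0) for y in ys)
print("g = 0.5 satisfies the subgradient inequality for |x| at 0")
```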
The subgradient method iteratively updates the solution by moving in the direction of the negative subgradient: x_{k+1} = x_k − α_k g_k, where g_k is any subgradient of the objective at x_k and α_k is the step size. This update is not guaranteed to decrease the objective at every iteration, so in practice the best point found so far is tracked; with a suitable diminishing step size (for example, α_k proportional to 1/√k), the best value converges to the optimum.
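As a concrete illustration, the sketch below (a minimal example under assumed problem data, not a reference implementation) applies the subgradient method to the non-differentiable problem of minimizing f(x) = ||Ax − b||_1, using the diminishing step size α_k = α_0/√k and tracking the best iterate.

```python
import numpy as np

def subgradient_method(A, b, x0, alpha0=0.1, iters=500):
    """Minimize f(x) = ||Ax - b||_1 with the subgradient method.

    Uses the diminishing step size alpha_k = alpha0 / sqrt(k) and
    tracks the best iterate, since individual steps may increase f.
    """
    f = lambda x: np.sum(np.abs(A @ x - b))
    x = x0.copy()
    x_best, f_best = x.copy(), f(x)
    for k in range(1, iters + 1):
        # A valid subgradient of ||Ax - b||_1 is A^T sign(Ax - b);
        # np.sign returns 0 at kinks, an admissible choice in [-1, 1].
        g = A.T @ np.sign(A @ x - b)
        x = x - (alpha0 / np.sqrt(k)) * g
        if f(x) < f_best:
            x_best, f_best = x.copy(), f(x)
    return x_best, f_best

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ rng.standard_normal(5)  # consistent system, optimum value 0
x_hat, f_hat = subgradient_method(A, b, x0=np.zeros(5))
print(f"best objective value: {f_hat:.4f}")  # should be near 0
```

Tracking the best iterate, as done above, is the standard remedy for the method's non-monotone behavior.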
One of the key advantages of subgradient-based methods is their simplicity and ease of implementation. They require only a way to compute some subgradient at each iterate and a step-size rule, and each iteration is inexpensive, which makes them attractive for large-scale problems. Their main drawback is slow convergence compared with methods that exploit smoothness.
In practice, subgradient-based methods are often used as a fallback when gradient-based methods are not applicable, for example when minimizing piecewise-linear objectives, l1-regularized losses, or the hinge loss used in support vector machines.