Minimization

Minimization is the mathematical process of finding the smallest value of a function and, often, the point at which that value is attained. In optimization, a minimization problem seeks a point x in a domain X that minimizes an objective function f: X → R. The minimum value is min f(x) taken over x in X, and any point x* attaining it is a minimizer.
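
As a minimal numerical sketch (assuming Python with SciPy; the quadratic f(x) = (x - 3)^2 + 1 is an illustrative objective, not one from the text), a minimizer and the minimum value can be found from a starting guess:

    from scipy.optimize import minimize

    # Illustrative objective: f(x) = (x - 3)^2 + 1, minimized at x* = 3 with value 1.
    f = lambda x: (x[0] - 3.0) ** 2 + 1.0

    res = minimize(f, x0=[0.0])  # numerical minimization from an initial guess
    print(res.x)    # minimizer, approximately [3.0]
    print(res.fun)  # minimum value, approximately 1.0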

Minimization problems are often categorized as unconstrained, where x ranges over a space such as R^n, and constrained, where x must satisfy constraints. The minimizer may be global (the smallest value over the entire domain) or local (the smallest value within a neighborhood). Not every function has a minimizer on every domain; existence depends on properties such as continuity, compactness, and coercivity. For example, f(x) = e^x has infimum 0 on R but attains no minimum, whereas a continuous function on a compact domain always attains one.
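
A hedged sketch of the constrained case (again assuming SciPy; the bound x <= 1 is a hypothetical constraint chosen so that the minimizer lands on the boundary):

    from scipy.optimize import minimize

    # Unconstrained, f(x) = (x - 3)^2 has its global minimizer at x = 3.
    f = lambda x: (x[0] - 3.0) ** 2

    # Constrained to x <= 1, the minimizer moves to the boundary point x = 1.
    res = minimize(f, x0=[0.0], bounds=[(None, 1.0)])
    print(res.x)  # approximately [1.0]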

For differentiable functions, a necessary condition for an interior minimizer is that the gradient of f at x* is zero. A second-order necessary condition is that the Hessian of f at x* be positive semidefinite; a positive definite Hessian at a stationary point is sufficient for a strict local minimum. For constrained problems, the method of Lagrange multipliers provides conditions that combine the gradient of f with gradients of the constraints.
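
The following sketch works one Lagrange-multiplier example symbolically (assuming Python with SymPy; the objective x^2 + y^2 and the constraint x + y = 1 are illustrative choices):

    import sympy as sp

    x, y, lam = sp.symbols('x y lam', real=True)
    f = x**2 + y**2          # illustrative objective
    g = x + y - 1            # illustrative constraint, g = 0

    # Stationarity of the Lagrangian L = f - lam*g combines grad f with grad g.
    L = f - lam * g
    sols = sp.solve([sp.diff(L, v) for v in (x, y, lam)], [x, y, lam], dict=True)
    print(sols)  # one solution: x = 1/2, y = 1/2, lam = 1

    # Second-order check: the Hessian of f is positive definite here.
    print(sp.hessian(f, (x, y)))  # Matrix([[2, 0], [0, 2]])

The multiplier value lam = 1 also has the standard sensitivity reading: it is the rate at which the minimum value changes as the constraint level x + y = c is shifted.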

Convexity plays a central role: if f is convex on a convex domain, any local minimum is global; if f is strictly convex, the minimizer, if one exists, is unique. In practice, many minimization tasks are solved numerically by gradient-based methods (gradient descent, Newton's method) or by specialized algorithms for linear or convex problems (linear programming, interior-point methods).
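
A minimal gradient-descent sketch in plain Python (the step size, tolerance, and objective are illustrative assumptions, not prescriptions from the text):

    # Gradient descent on f(x) = (x - 3)^2 + 1, whose gradient is 2*(x - 3).
    def grad_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10_000):
        x = x0
        for _ in range(max_iter):
            g = grad(x)
            if abs(g) < tol:   # stop when the gradient is (nearly) zero
                break
            x -= lr * g        # step against the gradient
        return x

    x_star = grad_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
    print(x_star)  # approximately 3.0

Newton's method would instead divide the gradient by the second derivative (here the constant 2), which reaches the minimizer in a single step for this quadratic.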

Common contexts include science and engineering, economics, and machine learning, where one seeks to minimize loss, cost, or risk. The concept is dual to maximization, since max f = -min(-f), and both sit within the broader field of optimization.
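
To illustrate that duality (SciPy assumed; the concave objective f(x) = 5 - (x - 2)^2 is hypothetical):

    from scipy.optimize import minimize

    # max f(x) = -min(-f(x)): maximize f(x) = 5 - (x - 2)^2 by minimizing its negation.
    res = minimize(lambda x: -(5.0 - (x[0] - 2.0) ** 2), x0=[0.0])
    print(res.x)     # maximizer, approximately [2.0]
    print(-res.fun)  # maximum value, approximately 5.0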
