LinienSuche

LinienSuche (German for "line search") is a family of techniques in optimization for selecting an appropriate step size along a given search direction. In iterative methods such as gradient descent, Newton’s method, or quasi-Newton methods, LinienSuche aims to find a scalar alpha that advances the current estimate x_k to a new point x_k + alpha p_k in a way that yields a sufficient decrease of the objective function f.

The core idea is to transform the multidimensional problem into a univariate subproblem: minimize f(x_k + alpha p_k) with respect to alpha ≥ 0. Practical criteria guide the choice of alpha. The Armijo (sufficient decrease) condition requires a decrease proportional to the directional derivative: f(x_k + alpha p_k) ≤ f(x_k) + c1 alpha grad f(x_k)ᵀ p_k. More stringent Wolfe conditions add a curvature requirement to ensure an adequate reduction in the gradient, with variants such as strong Wolfe. Backtracking line search iteratively reduces alpha until the chosen condition is satisfied.

Common methods for solving the subproblem include backtracking with Armijo or Wolfe conditions, bracketing and univariate minimization techniques (golden-section search, Brent’s method), and, in some cases, exact line search for specific function forms. Exact line search is rarely used in practice due to evaluation costs, especially in high dimensions.

LinienSuche contrasts with trust-region approaches, which control step size via a radius rather than a directional line search.
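The backtracking procedure with the Armijo condition can be sketched as follows; this is a minimal illustration, not a reference implementation, and the constants c1 = 1e-4 and shrink factor rho = 0.5 are conventional illustrative choices:

```python
import numpy as np

def backtracking_armijo(f, grad_f, x, p, alpha0=1.0, c1=1e-4, rho=0.5, max_iter=50):
    """Shrink alpha until f(x + alpha*p) satisfies the Armijo condition."""
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ p  # directional derivative; negative for a descent direction
    for _ in range(max_iter):
        if f(x + alpha * p) <= fx + c1 * alpha * slope:
            return alpha  # sufficient decrease achieved
        alpha *= rho      # backtrack: try a smaller step
    return alpha          # fall back to the smallest trial step

# usage on the quadratic f(x) = ||x||^2 with the steepest-descent direction
f = lambda x: float(x @ x)
grad_f = lambda x: 2.0 * x
x = np.array([1.0, -2.0])
p = -grad_f(x)  # descent direction
alpha = backtracking_armijo(f, grad_f, x, p)
```

Each rejected trial costs one extra evaluation of f, which is the trade-off noted below: robustness in exchange for additional function evaluations per step.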
LinienSuche improves the robustness and convergence reliability of optimization algorithms by ensuring sufficient descent at each iteration, at the cost of additional function and gradient evaluations per step. Applications span numerical optimization, machine learning, and scientific computing.
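Golden-section search, one of the bracketing techniques mentioned above, repeatedly narrows an interval [a, b] that contains a minimizer of the one-dimensional function phi(alpha) = f(x_k + alpha p_k). A minimal sketch, assuming phi is unimodal on the initial bracket (it re-evaluates phi at both interior points each pass for clarity rather than caching):

```python
import math

def golden_section(phi, a, b, tol=1e-8):
    """Locate a minimizer of a unimodal function phi on [a, b]."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0  # reciprocal of the golden ratio, ~0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while (b - a) > tol:
        if phi(c) < phi(d):
            b, d = d, c  # minimizer lies in [a, d]; reuse c as the new d
            c = b - invphi * (b - a)
        else:
            a, c = c, d  # minimizer lies in [c, b]; reuse d as the new c
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

# usage: bracketed minimization of phi(alpha) = (alpha - 0.3)^2 on [0, 1]
alpha_star = golden_section(lambda t: (t - 0.3) ** 2, 0.0, 1.0)
```

Because each pass shrinks the bracket by a constant factor (~0.618), the method needs no derivatives, which is why it appears alongside Brent’s method among derivative-free univariate options.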