Gradients

Gradients are a central concept in calculus describing how a scalar field changes in space. For a differentiable scalar function f: R^n → R, the gradient ∇f is the vector of its partial derivatives. In three-dimensional Cartesian coordinates, ∇f = (∂f/∂x, ∂f/∂y, ∂f/∂z). The gradient points in the direction of steepest ascent of f, and its magnitude gives the maximum rate of increase.
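
As a sketch of this definition, the short Python snippet below uses the SymPy library to assemble a gradient from its partial derivatives; the sample function f(x, y, z) = x^2·y + sin(z) is an arbitrary choice for illustration.

    import sympy as sp

    # Symbols and an arbitrary sample scalar function (illustrative choice).
    x, y, z = sp.symbols('x y z')
    f = x**2 * y + sp.sin(z)

    # The gradient is the vector of partial derivatives (df/dx, df/dy, df/dz).
    grad_f = [sp.diff(f, var) for var in (x, y, z)]
    print(grad_f)  # [2*x*y, x**2, cos(z)]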

Geometrically, the gradient is perpendicular to the level sets, or contour surfaces, of the function: the directional derivative D_v f = ∇f · v is zero in every direction v tangent to a level set, so the gradient can have no component along the set and must be orthogonal to it. This makes the gradient useful for identifying the direction in which a function increases most rapidly away from a given level set.
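
A quick numerical illustration of this orthogonality in Python, assuming f(x, y) = x^2 + y^2, whose level sets are circles centered at the origin; the sample point is arbitrary:

    import numpy as np

    def grad_f(x, y):
        # Analytic gradient of f(x, y) = x**2 + y**2.
        return np.array([2 * x, 2 * y])

    # Pick a point; the level set through it is a circle around the origin.
    px, py = 3.0, 4.0
    tangent = np.array([-py, px])  # direction tangent to that circle at (px, py)

    # The gradient has zero component along the tangent direction.
    print(np.dot(grad_f(px, py), tangent))  # 0.0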

An example: if f(x,y) = x^2 + y^2, then ∇f(x,y) = (2x, 2y). At the point (1, -3), the gradient is (2, -6), indicating that f increases most rapidly in that direction.
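
A minimal check of this arithmetic in Python, using the function and point from the example above:

    # f(x, y) = x**2 + y**2 and its analytic gradient.
    def grad_f(x, y):
        return (2 * x, 2 * y)

    g = grad_f(1, -3)
    print(g)                           # (2, -6)
    print((g[0]**2 + g[1]**2) ** 0.5)  # ~6.32, the maximum rate of increase at (1, -3)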

Computational use: gradients are fundamental to gradient-based optimization. Gradient descent and its variants iteratively move in the direction opposite to ∇f in order to minimize f. In machine learning, backpropagation computes gradients of the loss with respect to network parameters, which are then updated by an optimization algorithm.
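
A minimal gradient-descent sketch in Python, again using f(x, y) = x^2 + y^2; the step size and iteration count are arbitrary choices for illustration:

    import numpy as np

    def grad_f(p):
        # Gradient of f(x, y) = x**2 + y**2 at the point p = (x, y).
        return 2 * p

    p = np.array([1.0, -3.0])  # starting point, taken from the example above
    step = 0.1                 # learning rate (arbitrary choice)

    for _ in range(100):
        p = p - step * grad_f(p)  # move opposite to the gradient to decrease f

    print(p)  # very close to the minimizer (0, 0)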

In practice, gradients can be approximated numerically by finite differences when an analytical form is unavailable.
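
A sketch of a central finite-difference approximation in Python, compared against the analytic gradient (2, -6) of f(x, y) = x^2 + y^2 at (1, -3); the step size h is an arbitrary small value:

    import numpy as np

    def numerical_gradient(f, p, h=1e-6):
        # Central differences: (f(p + h*e_i) - f(p - h*e_i)) / (2*h) for each coordinate.
        p = np.asarray(p, dtype=float)
        grad = np.zeros_like(p)
        for i in range(p.size):
            e = np.zeros_like(p)
            e[i] = h
            grad[i] = (f(p + e) - f(p - e)) / (2 * h)
        return grad

    f = lambda p: p[0]**2 + p[1]**2
    print(numerical_gradient(f, [1.0, -3.0]))  # approximately [ 2. -6.]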

Physically, gradients arise in fields such as the electric or gravitational potential, where the negative gradient of the potential gives the corresponding force field. The concept extends to any differentiable scalar field in any number of dimensions, with ∇ denoting the gradient operator in the appropriate coordinate system.
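
As a sketch of this physical relationship in Python (with an arbitrary point-mass potential and the constants set to 1, purely for illustration), the force can be recovered as the negative gradient of the potential:

    import numpy as np

    def potential(r):
        # Point-mass potential V(r) = -1/|r| (G*M set to 1 for illustration).
        return -1.0 / np.linalg.norm(r)

    def force(r, h=1e-6):
        # Force field F = -grad V, approximated by central finite differences.
        F = np.zeros_like(r)
        for i in range(r.size):
            e = np.zeros_like(r)
            e[i] = h
            F[i] = -(potential(r + e) - potential(r - e)) / (2 * h)
        return F

    r = np.array([3.0, 4.0, 0.0])
    print(force(r))  # points from r back toward the origin, i.e. an attractive force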