reparameterizations

Reparameterizations are changes in how a quantity, model, or distribution is described by replacing its original parameters with a different set obtained via a bijective transformation. The goal is to express the same object in a new coordinate system that may be easier to analyze, estimate, or compute with. Reparameterizations preserve the underlying structure, but change the apparent complexity of the model or its constraints.

In statistics and machine learning, reparameterization often refers to transforming a parameter vector to a new space that is more convenient for optimization or sampling. When densities are rewritten under a new parameterization, the change of variables formula requires multiplying by the absolute value of the Jacobian determinant. A notable example is the reparameterization trick used in variational autoencoders, where a latent variable z is expressed as z = mu + sigma * epsilon with epsilon ~ N(0, I). This enables gradients to flow with respect to mu and sigma. In Bayesian computation, unconstraining constrained parameters (for example, modeling a positive scale via a log transform or a probability via a logit transform) can improve convergence of MCMC or variational methods.

In geometry and numerical analysis, curves, surfaces, and other objects can be reparameterized by different choices of parameter, yielding expressions that are easier to differentiate or integrate. Reparameterization can also be used to improve conditioning or to reveal separable structure in optimization problems.
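The change-of-variables rule and the log-transform unconstraining mentioned above can be illustrated together. The sketch below (a minimal numpy example, not from any particular library) reparameterizes a positive scale theta as phi = log(theta) and checks numerically that the transformed density, with the Jacobian factor included, still integrates to 1. The choice of an Exponential(1) density for theta is just for illustration.

```python
import numpy as np

# Sketch: reparameterize a positive scale theta > 0 as phi = log(theta).
# Under phi, the density picks up the Jacobian |d theta / d phi| = exp(phi).
# For illustration, take theta ~ Exponential(1), so p_theta(t) = exp(-t), t > 0.

def p_theta(t):
    return np.exp(-t)

def p_phi(phi):
    # change of variables: p_phi(phi) = p_theta(exp(phi)) * |exp(phi)|
    return p_theta(np.exp(phi)) * np.exp(phi)

# Check that the transformed density still integrates to (nearly) 1
# over a wide interval, using a simple Riemann sum.
phi = np.linspace(-15.0, 5.0, 200_001)
dphi = phi[1] - phi[0]
mass = np.sum(p_phi(phi)) * dphi
print(mass)  # close to 1.0
```

Dropping the `exp(phi)` Jacobian factor here would leave a function that is not a valid density, which is exactly the error the change-of-variables formula guards against.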
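The reparameterization trick can also be demonstrated without a deep-learning framework. The hedged sketch below estimates the gradient of E[z^2] with respect to mu for z ~ N(mu, sigma^2), whose analytic value is 2*mu. Writing z = mu + sigma * epsilon makes each sample a differentiable function of mu, so the gradient passes inside the expectation; the specific objective and constants are illustrative only.

```python
import numpy as np

# Sketch of the pathwise (reparameterization) gradient estimator.
# Goal: d/dmu E_{z ~ N(mu, sigma^2)}[z^2], which is 2*mu analytically.
# Writing z = mu + sigma * eps with eps ~ N(0, 1) makes z differentiable
# in mu, so the gradient moves inside the expectation: E[2*z * dz/dmu] = E[2*z].

rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.7
eps = rng.standard_normal(200_000)

z = mu + sigma * eps          # reparameterized samples
grad_est = np.mean(2.0 * z)   # Monte Carlo estimate of d/dmu E[z^2]

print(grad_est)               # close to 2 * mu = 3.0
```

This is the same mechanism that lets variational autoencoders backpropagate through sampling: the randomness is isolated in epsilon, while mu and sigma enter deterministically.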
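For the geometric side, a common concrete case is arc-length reparameterization of a curve. The sketch below (illustrative code, assuming a simple parabola segment as the curve) resamples the curve so that equal parameter steps correspond to equal distances along it, which is often the "more convenient" parameterization for numerical work.

```python
import numpy as np

# Sketch: reparameterize the curve r(t) = (t, t^2), t in [0, 1], by arc length,
# so that equal parameter steps correspond to equal distances along the curve.

t = np.linspace(0.0, 1.0, 10_001)
x, y = t, t**2

# cumulative arc length, approximated by straight-line segments
seg = np.hypot(np.diff(x), np.diff(y))
s = np.concatenate([[0.0], np.cumsum(seg)])

# resample the curve at equally spaced arc-length values
s_uniform = np.linspace(0.0, s[-1], 50)
x_u = np.interp(s_uniform, s, x)
y_u = np.interp(s_uniform, s, y)

# distances between consecutive resampled points are now nearly equal
d = np.hypot(np.diff(x_u), np.diff(y_u))
print(d.std() / d.mean())  # close to 0: spacing is uniform
```

The underlying curve is unchanged; only its description has been swapped, which is the defining property of a reparameterization.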