Lorentztransformer

Lorentz Transformer, sometimes written Lorentztransformer, is a neural network architecture in the transformer family that integrates concepts from Lorentzian geometry into the attention framework. It is intended for data in which spacetime structure or related invariances matter, and it extends the standard attention mechanism with a Lorentzian metric or related spacetime embeddings.
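A minimal sketch of the core idea: attention weights derived from the Minkowski interval between per-token spacetime coordinates instead of a Euclidean dot-product similarity. The function names, the metric signature (−+++), and the use of the absolute interval as a dissimilarity are illustrative assumptions, not details of a specific published implementation.

```python
import numpy as np

def minkowski_interval(a, b):
    """Squared Minkowski interval s^2 = -dt^2 + |dx|^2 (signature -+++).

    a, b: (..., 4) arrays of (t, x, y, z) coordinates.
    """
    d = a - b
    return -d[..., 0] ** 2 + np.sum(d[..., 1:] ** 2, axis=-1)

def lorentzian_attention(coords, values, scale=1.0):
    """Attention over tokens using pairwise Minkowski intervals.

    coords: (n, 4) spacetime coordinates, one row per token
    values: (n, d) value vectors
    Returns (output, weights) where weights is (n, n), rows sum to 1.
    """
    # Pairwise squared intervals via broadcasting: shape (n, n).
    s2 = minkowski_interval(coords[:, None, :], coords[None, :, :])
    # Smaller |interval| -> higher weight; softmax over the key axis.
    logits = -np.abs(s2) / scale
    logits -= logits.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(logits)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ values, w
```

Because the interval from a token to itself is zero, each row's largest weight falls on the token itself, with nearby events (small |s²|, i.e. close to the light cone) weighted next.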

In typical designs, input tokens are augmented with time and space coordinates, and attention weights are computed using a Lorentzian distance or a Minkowski metric rather than a purely Euclidean similarity. Some variants modify the positional encoding to reflect spacetime coordinates, or use Lorentzian distance as a similarity measure to emphasize or de-emphasize relations according to relativistic-like constraints.

Variants of the approach may include Lorentzian attention, Lorentz-space embeddings, or hybrid schemes that combine traditional attention with spacetime-aware components. The overall goal is to capture relations that depend on the ordering and timing of events in a way that respects certain invariances associated with spacetime.

Applications are discussed in contexts such as physics-informed modeling, spacetime event analysis, and domains where precise timing and causal structure matter. Potential benefits include improved modeling of long-range dependencies and robustness to timing perturbations, though these benefits can come with higher computational cost and added complexity in optimization and numerical stability.

See also: Transformer, Attention mechanism, Lorentz invariance, Minkowski space, Relativistic machine learning.