
Moment magnitude

Moment magnitude, abbreviated Mw, is a scale used to describe the size of earthquakes. It is based on the seismic moment M0, a measure of the total amount of slip on the fault, the fault area that ruptured, and the rigidity of the rocks. The seismic moment is defined as M0 = μ A D, where μ is the shear modulus of the rocks, A is the rupture area, and D is the average slip.
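The definition M0 = μ A D can be illustrated with a short Python sketch. The parameter values below are illustrative assumptions for a moderate crustal earthquake, not figures from this article:

```python
# Seismic moment M0 = mu * A * D  (units: Pa * m^2 * m = N*m).
# All numbers are assumed example values, not measurements.
mu = 30e9          # shear modulus of crustal rock, Pa (~30 GPa is typical)
A = 20e3 * 10e3    # rupture area: a 20 km x 10 km fault patch, in m^2
D = 1.5            # average slip over the rupture, in m

M0 = mu * A * D    # seismic moment in newton-meters
print(f"M0 = {M0:.3e} N*m")  # M0 = 9.000e+18 N*m
```

Because M0 multiplies three physical quantities, a larger rupture area, more slip, or stiffer rock each increase the moment proportionally.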

Moment magnitude is related to M0 by the empirical formula Mw = (2/3) log10(M0) − 6.07, with M0 expressed in newton-meters. Because the scale is logarithmic, a tenfold increase in M0 corresponds to an increase of about two-thirds of a unit in Mw; equivalently, a one-unit increase in Mw corresponds to roughly a 32-fold increase in M0. Mw is designed to provide a consistent, non-saturating measure of earthquake size across a wide range of faulting styles and depths.

History and purpose: The moment magnitude scale was developed in the late 20th century as an improvement over older scales such as the Richter local magnitude. It uses the physical source parameter M0 and remains stable for large earthquakes, whereas earlier scales tended to saturate for big events.

Calculation and data: Seismic moments are estimated from seismic waves recorded by networks of seismometers, often through inversion for slip on fault patches. M0 depends on the fault's area, average slip, and rock rigidity, and can be inferred from far-field or near-field waveforms as well as geodetic data.

Interpretation and limitations: Mw correlates with the total energy released and with the size of the rupture, but is not a direct measure of radiated energy. Uncertainties arise from the slip distribution, the rupture geometry, and the chosen mantle properties. For small earthquakes, Mw converges toward values similar to those of other scales; for large events, Mw remains more reliable than older magnitudes.
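
The conversion from seismic moment to moment magnitude, and its logarithmic character, can be sketched in a few lines of Python. The moment value used here is an assumed example, not one taken from the article:

```python
import math

def moment_magnitude(m0_newton_meters):
    """Convert seismic moment M0 (in N*m) to moment magnitude Mw
    via the empirical relation Mw = (2/3) * log10(M0) - 6.07."""
    return (2.0 / 3.0) * math.log10(m0_newton_meters) - 6.07

M0 = 9e18  # assumed example moment, N*m
print(f"Mw(M0)      = {moment_magnitude(M0):.2f}")       # ~6.57
print(f"Mw(10 * M0) = {moment_magnitude(10 * M0):.2f}")  # tenfold M0 -> Mw rises by ~2/3
```

Multiplying M0 by ten raises Mw by exactly 2/3 of a unit, which is the logarithmic scaling described above.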