EWC

Elastic Weight Consolidation (EWC) is a continual learning method designed to mitigate catastrophic forgetting when neural networks learn tasks sequentially. It protects weights that are important to previously learned tasks while allowing less essential weights to adapt to new ones.

It does so by adding a regularization term to the loss for the current task. After learning a task, the method estimates the importance of each parameter using the Fisher Information Matrix. The diagonal elements F_i describe how sensitive performance is to changes in theta_i. The objective becomes L_total = L_current + (lambda/2) sum_i F_i (theta_i - theta_i*)^2, where theta_i* is the parameter's old value and lambda controls the penalty strength.
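
A minimal PyTorch-style sketch of these two steps, under stated assumptions: estimate_diagonal_fisher uses the common empirical approximation (squared gradients of the log-likelihood of the observed labels, with the data loader assumed to yield one example at a time), and ewc_penalty implements the quadratic term above. The model, data loader, and helper names are placeholders for illustration, not part of the original description.

import torch
import torch.nn.functional as F

def estimate_diagonal_fisher(model, data_loader):
    """Empirical diagonal Fisher: average squared gradient of the
    log-likelihood of the observed label, one example at a time."""
    fisher = {name: torch.zeros_like(p) for name, p in model.named_parameters()}
    model.eval()
    count = 0
    for inputs, targets in data_loader:  # assumed batch size 1
        model.zero_grad()
        log_probs = F.log_softmax(model(inputs), dim=1)
        F.nll_loss(log_probs, targets).backward()
        for name, p in model.named_parameters():
            if p.grad is not None:
                fisher[name] += p.grad.detach() ** 2
        count += 1
    return {name: f / max(count, 1) for name, f in fisher.items()}

def ewc_penalty(model, fisher, old_params, lam):
    """(lambda / 2) * sum_i F_i * (theta_i - theta_i*)^2."""
    penalty = 0.0
    for name, p in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

In this sketch, after finishing a task one would store old_params = {name: p.detach().clone() for name, p in model.named_parameters()} together with the Fisher estimate, and then train the next task on task_loss + ewc_penalty(model, fisher, old_params, lam).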

Consequences: Important weights are protected, reducing forgetting while allowing less important weights to adapt. The approach relies on a diagonal FIM approximation for efficiency and may be less effective if tasks are highly dissimilar or many tasks are learned. Variants include Online EWC and methods that relax the diagonal assumption or combine EWC with replay.
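
As one illustration of the Online EWC idea mentioned above, a common formulation keeps a single running Fisher estimate that is decayed by a factor gamma and refreshed after each task, with the anchor reset to the latest weights, rather than storing one penalty per task. A rough sketch, reusing the imports and Fisher estimator from the earlier example; gamma and the function name are illustrative choices, not fixed by the text.

def update_online_ewc_state(model, running_fisher, new_fisher, gamma=0.9):
    """Decay the accumulated Fisher, add the estimate from the task just
    finished, and anchor the penalty at the current weights."""
    if running_fisher is None:  # first task: nothing accumulated yet
        running_fisher = {name: torch.zeros_like(f) for name, f in new_fisher.items()}
    merged = {name: gamma * running_fisher[name] + new_fisher[name]
              for name in new_fisher}
    anchor = {name: p.detach().clone() for name, p in model.named_parameters()}
    return merged, anchor

Training on the next task would then use ewc_penalty(model, merged, anchor, lam) as the single regularizer, which keeps the memory cost constant as the number of tasks grows.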

Applications and context: EWC has been used in image classification, reinforcement learning, and robotics. It is part of a family of regularization-based continual learning methods, alongside Synaptic Intelligence and Memory Aware Synapses, and contrasts with replay-based approaches that reuse past data.
