EWC
Elastic Weight Consolidation (EWC) is a continual learning method designed to mitigate catastrophic forgetting when neural networks learn tasks sequentially. It protects weights that are important to previously learned tasks while allowing less essential weights to adapt to new ones.
It does so by adding a regularization term to the loss for the current task. After learning a task A, EWC estimates how important each weight was to that task using the diagonal of the Fisher information matrix. When the network then trains on a new task B, the objective becomes L(θ) = L_B(θ) + Σ_i (λ/2) F_i (θ_i − θ*_A,i)², where θ*_A are the weights learned for task A, F_i is the Fisher value for weight i, L_B is task B's loss, and λ sets how strongly important weights are anchored to their old values.
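To make the mechanics concrete, here is a minimal PyTorch-style sketch. It is an illustration under assumptions, not a reference implementation: the names model, data_loader, task_a_loader, star_params, and lam are hypothetical, and the Fisher diagonal is approximated with squared gradients at the observed labels (the common empirical-Fisher shortcut).

```python
import torch
import torch.nn.functional as F

def fisher_diagonal(model, data_loader, device="cpu"):
    # Approximate the diagonal of the Fisher information matrix by
    # averaging squared gradients of the log-likelihood over task-A data.
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()
              if p.requires_grad}
    model.eval()
    for inputs, targets in data_loader:
        inputs, targets = inputs.to(device), targets.to(device)
        model.zero_grad()
        log_probs = F.log_softmax(model(inputs), dim=1)
        # Empirical-Fisher shortcut: gradients at the observed labels,
        # rather than labels sampled from the model's own distribution.
        F.nll_loss(log_probs, targets).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / len(data_loader) for n, f in fisher.items()}

def ewc_penalty(model, fisher, star_params, lam=100.0):
    # The EWC regularizer: (lambda / 2) * sum_i F_i * (theta_i - theta*_A,i)^2
    penalty = 0.0
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - star_params[n]) ** 2).sum()
    return 0.5 * lam * penalty
```

After task A, one would snapshot star_params = {n: p.detach().clone() for n, p in model.named_parameters()} and compute fisher = fisher_diagonal(model, task_a_loader); while training on task B, the total loss is then task_b_loss + ewc_penalty(model, fisher, star_params).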
Consequences: Important weights are protected, reducing forgetting, while less important weights remain free to adapt. The approach stores only the previous task's weights and their Fisher values rather than the task's data, so its memory overhead grows with model size, not dataset size.
Applications and context: EWC has been used in image classification, reinforcement learning, and robotics. It is one of the foundational regularization-based methods in continual learning and commonly serves as a baseline against which newer approaches are evaluated.