Description
Neural network models suffer from the phenomenon of catastrophic forgetting: a model can drastically lose its generalization ability on a task after being trained on a new task. This usually means a…
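The loss described here is usually quantified as the drop in task-1 accuracy after training on task 2. As a minimal sketch (this definition is a common convention in the continual-learning literature, not necessarily the exact metric used here):

```python
def forgetting(acc_before: float, acc_after: float) -> float:
    """Forgetting on an old task: accuracy before minus accuracy after
    the model is trained on a new task. Positive values mean the model
    lost performance on the old task."""
    return acc_before - acc_after

# Example: task-1 accuracy drops from 0.92 to 0.55 after task-2 training,
# so the model has forgotten a substantial amount of task 1.
print(forgetting(0.92, 0.55))
```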
Summary
- Forgetting in Deep Learning, problem statement: neural network models suffer from the phenomenon of catastrophic forgetting.
- Figure: percentage change in forgetting with different levels of mixup, compared to the baseline model.
- One observation is that the diagonal has the least forgetting, which corresponds to cases where task 1 and task 2 use the same initial learning rate.
- Train Epoch Experiments: we fix the number of task 2 training epochs and vary the number of task 1 epochs to study how task 1 training length affects forgetting.
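A setup like this can be sketched with a toy two-task experiment. The sketch below is a stand-in, not the authors' setup: it uses a hypothetical pair of conflicting logistic-regression tasks on synthetic Gaussian blobs, trains on task 1 for a varying number of epochs, then trains on task 2 for a fixed number of epochs, and reports the drop in task-1 accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(shift):
    # Toy binary classification: two Gaussian blobs offset by `shift`.
    X = np.vstack([rng.normal(shift, 1.0, (100, 2)),
                   rng.normal(-shift, 1.0, (100, 2))])
    y = np.concatenate([np.ones(100), np.zeros(100)])
    return X, y

def train(w, X, y, epochs, lr=0.1):
    # Logistic-regression gradient descent, continuing from weights `w`.
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(((X @ w > 0) == (y == 1)).mean())

# Task 2 reverses task 1's labels, so sequential training induces forgetting.
task1, task2 = make_task(2.0), make_task(-2.0)

for t1_epochs in (1, 10, 50):              # vary task-1 training length
    w = train(np.zeros(2), *task1, epochs=t1_epochs)
    acc_before = accuracy(w, *task1)
    w = train(w, *task2, epochs=20)        # task-2 epochs held fixed
    acc_after = accuracy(w, *task1)
    print(t1_epochs, round(acc_before - acc_after, 3))
```

The design choice mirrors the experiment description: only the task-1 epoch count changes across runs, so any change in the printed forgetting values can be attributed to how long the model trained on task 1.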