A phenomenon where neural networks rapidly forget previously learned information when trained on new tasks.
Detailed Explanation
Catastrophic Forgetting occurs when a neural network rapidly loses knowledge of previously learned tasks after being trained on new ones. The problem arises because gradient updates optimize the network's weights for the new data, often overwriting representations that were critical for earlier tasks. It is a significant obstacle to building models capable of continual learning across multiple domains without discarding what they have already learned.
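The effect is easy to reproduce. Below is a minimal, self-contained sketch in PyTorch that trains a small MLP on one synthetic task and then on a second; the synthetic tasks, the architecture, and the hyperparameters are illustrative assumptions rather than a prescribed setup. Accuracy on the first task typically collapses after training on the second.

```python
# Minimal sketch of catastrophic forgetting (illustrative assumptions:
# synthetic Gaussian tasks, a small MLP, full-batch Adam training).
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(center, axis):
    """Two Gaussian clusters around `center`, separated along `axis`.
    Placing the tasks in disjoint input regions means a joint solution
    exists, but training on one region leaves the other unconstrained."""
    shift = torch.zeros(2)
    shift[axis] = 3.0
    x = torch.cat([torch.randn(500, 2) + center - shift,
                   torch.randn(500, 2) + center + shift])
    y = torch.cat([torch.zeros(500), torch.ones(500)]).long()
    return x, y

def train(model, x, y, epochs=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
xa, ya = make_task(torch.tensor([0.0, 0.0]), axis=0)    # Task A
xb, yb = make_task(torch.tensor([10.0, 10.0]), axis=1)  # Task B

train(model, xa, ya)
print(f"Task A accuracy after training on A: {accuracy(model, xa, ya):.2f}")

train(model, xb, yb)  # sequential training on B, no access to Task A data
print(f"Task A accuracy after training on B: {accuracy(model, xa, ya):.2f}")
```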
Use Cases
• Implementing continual learning models that retain performance on old tasks while learning new ones; such models rely on techniques that explicitly counteract catastrophic forgetting, for example by rehearsing stored examples from earlier tasks (see the replay sketch after this list).
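As one illustration of such a technique, the sketch below extends the previous example with experience replay, a common mitigation in which a small buffer of stored Task A examples is mixed into every Task B update. The buffer size and the equal loss weighting here are arbitrary choices for illustration, not recommended values.

```python
# Experience-replay sketch (continues the example above; buffer size
# and equal loss weighting are illustrative assumptions).
def train_with_replay(model, x_new, y_new, x_buf, y_buf, epochs=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        # Joint loss over the new task and the replayed old examples keeps
        # the weights in a region that still fits Task A.
        loss = loss_fn(model(x_new), y_new) + loss_fn(model(x_buf), y_buf)
        loss.backward()
        opt.step()

replay_idx = torch.randperm(len(xa))[:50]  # keep 50 Task A examples

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
train(model, xa, ya)
train_with_replay(model, xb, yb, xa[replay_idx], ya[replay_idx])
print(f"Task A accuracy with replay: {accuracy(model, xa, ya):.2f}")
print(f"Task B accuracy with replay: {accuracy(model, xb, yb):.2f}")
```

Replay is only one family of mitigations; regularization-based methods such as Elastic Weight Consolidation instead penalize changes to weights judged important for earlier tasks.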