When fine-tuning on a narrow task overwrites general knowledge or capabilities the model had before.
"After we fine-tuned the model too narrowly, it forgot how to handle basic grammar—classic catastrophic forgetting."
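The effect can be shown on a toy scale: a minimal NumPy sketch (a hypothetical one-parameter linear model, not any particular framework) where gradient descent on a conflicting fine-tuning task erases performance on the original task.

```python
import numpy as np

def mse(w, X, y):
    # mean squared error of the one-parameter model y_hat = w * x
    return float(np.mean((X * w - y) ** 2))

def train(w, X, y, lr=0.1, steps=200):
    # plain gradient descent on the MSE loss
    for _ in range(steps):
        grad = np.mean(2 * (X * w - y) * X)
        w -= lr * grad
    return w

X = np.array([1.0, 2.0, 3.0])
y_general = 2.0 * X    # "general" task the model learns first
y_narrow = -2.0 * X    # conflicting narrow fine-tuning task

w = train(0.0, X, y_general)              # pretrain on the general task
loss_before = mse(w, X, y_general)        # near zero: task learned

w = train(w, X, y_narrow)                 # fine-tune on the narrow task only
loss_after = mse(w, X, y_general)         # large: general skill forgotten
```

Because the narrow task's gradients pull the single weight away from its pretrained value, `loss_after` ends up far larger than `loss_before`; mixing some general-task data back into fine-tuning (rehearsal) is one common mitigation.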