Overfitting means the model memorizes training examples rather than learning general patterns: training loss keeps dropping while validation loss rises.
Signs of overfitting:
- Model reproduces training examples verbatim
- Performance degrades on new inputs
- Validation loss diverges from training loss
Overfitting is your biggest enemy in fine-tuning. More training data, fewer epochs, and regularization (e.g. weight decay or dropout) all help prevent it.
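One common way to act on the train/validation divergence described above is patience-based early stopping: stop training once validation loss has failed to improve for a few consecutive epochs. A minimal sketch (the function name, `patience` value, and loss numbers are illustrative, not from any particular framework):

```python
def early_stop_epoch(val_losses, patience=2):
    """Return the epoch (0-indexed) at which to stop, or None.

    Stops once validation loss has not improved for `patience`
    consecutive epochs -- catching the divergence before the
    model memorizes more of the training set.
    """
    best = float("inf")
    bad_epochs = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            bad_epochs = 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                return epoch
    return None

# Validation loss improves, then diverges from epoch 3 onward:
val_losses = [0.90, 0.72, 0.65, 0.70, 0.78, 0.85]
print(early_stop_epoch(val_losses))  # → 4
```

Frameworks such as Hugging Face Transformers and PyTorch Lightning ship equivalent early-stopping callbacks, so in practice you configure the patience rather than write this loop yourself.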