Match hardware to your task:
- QLoRA 7B: RTX 3090/4090 (24GB) - roughly $1000-2000 to buy
- LoRA 13B: A100 40GB - ~$1/hour cloud
- Full fine-tune 7B: 2x A100 80GB - ~$4/hour cloud
- Full fine-tune 70B: 8x A100 80GB - ~$16/hour cloud
Cloud is often more economical than buying unless you train constantly.
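A quick break-even calculation makes the buy-vs-rent tradeoff concrete. The sketch below is illustrative, using the ballpark figures above (a ~$1500 consumer GPU vs. a ~$1/hour cloud instance); your actual prices, power costs, and resale value will shift the answer.

```python
def breakeven_hours(purchase_cost: float, cloud_rate_per_hour: float) -> float:
    """Hours of training at which buying the hardware costs the same as renting."""
    return purchase_cost / cloud_rate_per_hour

# Example with the ballpark figures above: ~$1500 RTX 4090 vs. ~$1/hour A100.
hours = breakeven_hours(1500, 1.0)
print(f"Break-even at {hours:.0f} GPU-hours")  # Break-even at 1500 GPU-hours
```

At ~1500 GPU-hours of training per year the purchase pays for itself within a year; occasional fine-tuners rarely get close to that, which is why cloud usually wins.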