You don't have to apply LoRA to every weight matrix in the model. Common targets:
- Query and Value projections (q_proj, v_proj): Most common, good default
- All attention projections (q, k, v, o): Better quality, more parameters
- Attention + FFN: Maximum adaptation, highest parameter count
Start with q_proj and v_proj. Add more target modules if you need stronger adaptation.
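To make the parameter-count tradeoff concrete, here is a minimal sketch that tallies the trainable parameters each targeting choice adds. The model dimensions (hidden size 4096, FFN inner size 16384, 32 layers) and rank r = 8 are illustrative assumptions, roughly a 7B-class transformer, not values from the text; LoRA adds r * (d_in + d_out) parameters per adapted matrix.

```python
# Hypothetical 7B-class dimensions (assumptions for illustration):
# hidden size d = 4096, FFN inner size = 16384, 32 layers, LoRA rank r = 8.
D, FFN, LAYERS, R = 4096, 16384, 32, 8

def lora_params(shapes, r=R, layers=LAYERS):
    """Trainable params LoRA adds: r * (d_in + d_out) per adapted matrix."""
    return layers * sum(r * (d_in + d_out) for d_in, d_out in shapes)

attn = (D, D)                    # q/k/v/o projections are d x d here
ffn = [(D, FFN), (FFN, D)]       # up- and down-projection matrices

qv = lora_params([attn] * 2)              # q_proj + v_proj only
attn_all = lora_params([attn] * 4)        # q, k, v, o
attn_ffn = lora_params([attn] * 4 + ffn)  # attention + FFN

print(f"q,v only:        {qv:>12,}")        # → 4,194,304
print(f"all attention:   {attn_all:>12,}")  # → 8,388,608
print(f"attention + FFN: {attn_ffn:>12,}")  # → 18,874,368
```

Under these assumed dimensions, each step up the list roughly doubles the adapter parameter count, which is why q_proj and v_proj remain the cheap, sensible starting point.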