AWQ (Activation-aware Weight Quantization) protects important weights from quantization damage.
It identifies the small fraction of weight channels that matter most for output quality, using activation magnitudes rather than weight magnitudes as the saliency signal. Instead of keeping those weights in higher precision (which would require mixed-precision kernels), AWQ scales the salient channels up before quantization and folds the inverse scale into the activations, so the whole matrix stays at a uniform low bit-width while the important channels suffer less relative rounding error.
AWQ often outperforms GPTQ at the same bit-width, but results vary by model and workload, so try both and compare on your specific task.
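The core idea can be illustrated with a toy numpy sketch. This is a simplified heuristic, not the real AWQ algorithm: actual AWQ searches for the scaling exponent on calibration data and quantizes in groups, whereas here the scales come directly from average activation magnitudes, clipped to a fixed range (the function names and the `s_clip` parameter are illustrative inventions).

```python
import numpy as np

def quantize(W, n_bits=4):
    # Plain symmetric per-row (per-output-channel) quantization baseline.
    qmax = 2 ** (n_bits - 1) - 1
    scale = np.abs(W).max(axis=1, keepdims=True) / qmax
    return np.round(W / scale) * scale

def awq_style_quantize(W, act_scale, n_bits=4, s_clip=(1.0, 4.0)):
    # act_scale[j]: average activation magnitude feeding input channel j.
    # Channels with large activations are scaled up before quantization,
    # shrinking their relative rounding error; dividing the quantized
    # weights by s is equivalent to folding 1/s into the activations.
    s = np.clip(act_scale / act_scale.mean(), *s_clip)
    return quantize(W * s, n_bits) / s

# Channel 1 has small weights but dominates the activations.
W = np.array([[1.0, 0.11, -0.5],
              [-1.0, 0.13, 0.4]])
act = np.array([1.0, 10.0, 1.0])

# Activation-weighted reconstruction error, with and without scaling.
err_base = (np.abs(W - quantize(W)) * act).sum()
err_awq = (np.abs(W - awq_style_quantize(W, act)) * act).sum()
print(err_awq < err_base)  # scaling the salient channel reduces the error
```

The point of the toy example: at 4 bits the tiny weights in the high-activation channel round badly under plain quantization, while pre-scaling them moves them to a better part of the quantization grid without touching the bit-width.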