Big O notation gives an upper bound on an algorithm's growth rate; applied to the worst case, it bounds the worst-case runtime. If an algorithm's worst-case runtime is O(n), then for all sufficiently large inputs it takes at most some constant times n steps, no matter which input of that size you give it.
This is not an average. This is not a precise count. It is a mathematical guarantee about the growth pattern as n gets large: f(n) is O(g(n)) when f(n) <= c * g(n) for some fixed constant c and all n beyond some threshold n0.
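To make the "constant times n" idea concrete, consider a hypothetical cost function f(n) = 3n + 5 (the numbers are ours, chosen for illustration). It is O(n) because f(n) <= 4n once n >= 5, and that claim is easy to spot-check:

```python
def f(n):
    # Hypothetical step count: 3 steps per element plus 5 steps of setup.
    return 3 * n + 5

# f is O(n) with witnesses c = 4 and threshold n0 = 5:
# 3n + 5 <= 4n  rearranges to  n >= 5.
c, n0 = 4, 5
assert all(f(n) <= c * n for n in range(n0, 10_000))
```

The constant and threshold are not unique: c = 5 with n0 = 3 works just as well. Big O only asks that some such pair exists.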
When you see O(n²), you know that doubling the input size can roughly quadruple the runtime, since (2n)² = 4n².
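The O(n) versus O(n²) contrast can be sketched with two toy functions (the names are ours, not from any library). The first examines each element at most once; the second examines every pair, so doubling the input roughly quadruples its work:

```python
def contains(items, target):
    # O(n): in the worst case (target absent) we examine every element once.
    for x in items:
        if x == target:
            return True
    return False

def count_equal_pairs(items):
    # O(n^2): the nested loops compare every pair (i, j) with i < j,
    # about n * (n - 1) / 2 comparisons in total.
    count = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                count += 1
    return count
```

For a list of 1,000 items, `contains` does at most 1,000 comparisons while `count_equal_pairs` does about 500,000; at 2,000 items those figures become 2,000 and about 2,000,000, matching the doubling-versus-quadrupling pattern.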