Big O notation describes how an algorithm's runtime grows as its input size increases. We call this input size n.
For an array, n is the length. For a number, n might be the value itself or the number of digits. For a graph, n could be vertices or edges.
The specific value does not matter. We care about the relationship: when n doubles, does the time double, quadruple, or barely change?
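A minimal sketch of this idea: count the basic operations each growth rate performs, then compare the counts when n doubles. The function names here are illustrative, not from any library.

```python
import math

def linear_ops(n):
    """O(n): one operation per element, e.g. a simple scan."""
    return n

def quadratic_ops(n):
    """O(n^2): one operation per pair of elements, e.g. a naive duplicate check."""
    return n * n

def log_ops(n):
    """O(log n): halve the remaining input each step, e.g. binary search."""
    return math.ceil(math.log2(n)) if n > 1 else 1

# Doubling n doubles the linear count, quadruples the quadratic count,
# and adds only a single step to the logarithmic count.
for f in (linear_ops, quadratic_ops, log_ops):
    small, big = f(1000), f(2000)
    print(f"{f.__name__}: n=1000 -> {small} ops, n=2000 -> {big} ops "
          f"(growth factor {big / small:.2f})")
```

Running this shows the three relationships side by side: the linear count grows by a factor of 2, the quadratic count by a factor of 4, and the logarithmic count barely moves.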