That O() is the so-called Landau symbol, it's a way to express the performance complexity of an algorithm. For example, the complexity of a binary search in a sorted array is O(log N), where N is the number of elements in the array. This means that the complexity grows logarithmically with the number of elements. O(1) means that the complexity is constant no matter how many elements there are.
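To make the O(log N) case concrete, here is a minimal binary search sketch in C++ (the function name and signature are mine, purely for illustration). Each iteration halves the remaining range, which is exactly where the log N comes from:

```cpp
#include <cstdio>
#include <vector>

// Binary search on a *sorted* array: each step halves the remaining
// range, so the number of steps grows with log2(N), i.e. O(log N).
// Returns the index of `target`, or -1 if it is not present.
int binary_search(const std::vector<int>& sorted, int target) {
    int lo = 0, hi = static_cast<int>(sorted.size());
    while (lo < hi) {
        int mid = lo + (hi - lo) / 2;   // midpoint of the current range
        if (sorted[mid] == target) return mid;
        if (sorted[mid] < target)  lo = mid + 1;  // discard lower half
        else                       hi = mid;      // discard upper half
    }
    return -1;  // not found
}

int main() {
    std::vector<int> v = {1, 3, 5, 8, 13, 21};
    std::printf("index of 8: %d\n", binary_search(v, 8));  // prints 3
}
```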
When talking about data structures and algorithms, an easy way to compare them is to look at their Landau symbols and, based on that, choose the appropriate algorithm and data structure for the job.

A few examples:
Accessing something randomly from a hash table is O(1) in the average case because it's just an access to a known index determined by the computed hash. Searching for an element in an array is O(N) in the average case because you first have to crawl the array to find it; the best case is O(1) if the searched element happens to be right at the beginning, and the worst case is O(N) if the element is at the end (because then you first have to go through every element in the array). However, if you access a known position in an array through its index, it suddenly also becomes O(1), because then you don't have to search for the element first.
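Roughly, in C++ terms, with std::unordered_map standing in for the hash table and std::vector for the array (variable names and values are just illustrative):

```cpp
#include <algorithm>
#include <string>
#include <unordered_map>
#include <vector>

int main() {
    std::unordered_map<std::string, int> table = {{"foo", 1}, {"bar", 2}};
    std::vector<int> array = {10, 20, 30, 40};

    // Hash table lookup: O(1) average -- the computed hash tells us
    // directly which bucket to look in.
    auto it = table.find("bar");

    // Searching an array for a *value*: O(N) average -- crawl until found.
    auto pos = std::find(array.begin(), array.end(), 30);

    // Accessing an array by *index*: O(1) -- no search, just an offset.
    int third = array[2];

    (void)it; (void)pos; (void)third;  // silence unused-variable warnings
}
```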
Same with deletion and insertion: a hash table needs O(1) for this in the average case (read: no overflow buckets due to hash collisions, but even then the bias is strongly towards O(1) because the overflow buckets are usually very limited). An array on the other hand needs O(N) on average for this, because you have to shift the elements after a deletion/before an insertion to avoid either gaps or overwriting an existing element. But yeah, depending on the position this can also become O(1), for example when appending at the end.
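A quick sketch of the same idea for insertion and deletion (again std::unordered_map vs. std::vector, names and values purely illustrative):

```cpp
#include <string>
#include <unordered_map>
#include <vector>

int main() {
    std::unordered_map<std::string, int> table;  // hash table
    std::vector<int> array = {1, 2, 3, 4, 5};

    // Hash table: O(1) average for insert and erase -- the hash jumps
    // straight to the right bucket, nothing else has to move.
    table["new_key"] = 1;
    table.erase("new_key");

    // Array: inserting or erasing in the middle is O(N) on average,
    // because every element behind that position must be shifted
    // to open or close the gap.
    array.insert(array.begin() + 2, 42);  // shifts 3, 4, 5 one slot right
    array.erase(array.begin());           // shifts everything one slot left

    // Inserting/erasing at the *end* is the O(1) special case.
    array.push_back(7);
    array.pop_back();
}
```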


Shitlord by trade and passion. Graphics programmer at Laminar Research.
I write blog posts at feresignum.com