Big O notation is a way to describe how slowly or quickly a function will run as its input size changes.
Consider the following code snippet:
def item_in_list(to_check, the_list):
    for item in the_list:
        if to_check == item:
            return True
    return False
The complexity of this function is O(n).
What does O(n) mean?
First we need to know what Big-O means.
O(n)
The O stands for "order of magnitude", a mathematical term for dealing with different classes of numbers, for example the difference between 10 and 1,000.
The O() in turn is called the "order function".
When dealing with orders of magnitude we are dealing with approximations.
O(n) is an approximation of the worst case scenario: the function has to run n times (the input size) to get the wanted result (a return statement in our case). It's an approximation because we don't know the size of n, or how many times the function will actually have to execute before it returns, but we do know that in the worst case it runs n times.
You can represent O(n) on a graph, where n becomes the x axis (the different input sizes) and y represents the execution time.
Plotting our function this way gives a straight diagonal line: the running time grows by the same amount for each added unit of n.
This means it might run fast for n=1 but will be unbearably slow for n=1,000,000.
This is the worst case scenario.
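To make the worst case concrete, here is a small sketch (the comparison counter is an illustration, not part of the original function) showing that searching for an item that is absent from the list takes exactly n comparisons:

```python
def item_in_list_counted(to_check, the_list):
    """Same linear search as item_in_list, but also reports how
    many comparisons were made before returning."""
    comparisons = 0
    for item in the_list:
        comparisons += 1
        if to_check == item:
            return True, comparisons
    return False, comparisons

# Worst case: the item is absent, so every one of the n elements is checked.
found, steps = item_in_list_counted(-1, list(range(1000)))
print(found, steps)  # False 1000

# Best case: the item is first, so only one comparison happens.
print(item_in_list_counted(0, list(range(1000))))  # (True, 1)
```

Big O describes the worst case, so even though the best case here is a single comparison, the function is still O(n).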
O(1)
Given the following function:
def is_none(item):
    return item is None
It doesn't matter what size of item you pass it; it will always execute in the same amount of time.
Pass it a million items or one item, and it will return right away.
This means that the complexity here is O(1), which is also called "constant time".
It's called constant time because if we plot this one on a graph, like we did with O(n), we get a flat horizontal line: the time is constant.
The y axis always stays the same, no matter the size of x, or n if you will.
Constant time, or O(1), is considered the best case scenario for a function.
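Another everyday example of (average-case) constant time, offered here as an illustration beyond the original text, is looking up a key in a Python dict. The lookup time does not grow with the number of stored keys:

```python
# A one-entry dict and a million-entry dict.
small = {0: "a"}
large = {i: "x" for i in range(1_000_000)}

# Both membership checks take roughly the same time despite the
# huge size difference, because dict lookups are hash-based.
print(0 in small)        # True
print(999_999 in large)  # True
print(-1 in large)       # False
```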
O(n^2)
Consider the following function:
def all_combinations(the_list):
    results = []
    for item in the_list:
        for inner_item in the_list:
            results.append((item, inner_item))
    return results
The above function loops once over each list item (n), and within that loop it loops over the whole list again to pair every item with the current one.
So if you pass in [1, 2, 3] you will get: [(1, 1), (1, 2), (1, 3), (2, 1), (2, 2), ... (3, 3)]
This means that you ran n times, and for every item in n you ran n more times, so you ran n*n, or n^2, times.
If you plot O(n^2) on a graph you get a curve that climbs ever more steeply, with performance even worse than O(n).
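A quick check (a sketch, not part of the original text) that the output size, and therefore the work done, really grows as n^2: doubling the input size quadruples the number of pairs produced.

```python
def all_combinations(the_list):
    results = []
    for item in the_list:
        for inner_item in the_list:
            results.append((item, inner_item))
    return results

print(len(all_combinations([1, 2, 3])))           # 9, i.e. 3^2
print(len(all_combinations([1, 2, 3, 4, 5, 6])))  # 36, i.e. 6^2
```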
Worst-case complexity of some sorting algorithms:
Selection sort: O(n^2)
Bubble sort: O(n^2)
Merge sort: O(n log n)
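As an illustration of the O(n log n) class (a sketch not taken from the original text), here is a minimal merge sort. The list is split in half about log n times, and each level of splitting does n work merging the halves back together:

```python
def merge_sort(items):
    """Sort a list in O(n log n): split into halves (log n levels),
    then merge each pair of sorted halves (n work per level)."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```

O(n log n) sits between O(n) and O(n^2) on the graph: slower than a single pass, but far better than the nested-loop sorts above as n grows.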