Orders of Growth Quick Review
When solving a problem, there are usually many different ways to approach it, but some are far more efficient than others. Imagine writing a function that takes in two numbers and adds them together; we could use a simple add function (or just +), or we could do a complicated twenty-step recursive process and then add them. Clearly the first way is better, and here it seems obvious which one you would want, but sometimes it’s not so clear. That’s why we need orders of growth!
Order of Growth = How a function’s runtime grows as its input gets larger
Big Theta = A simple function that your runtime grows proportionally to as the input gets very large (a tight bound on the order of growth)
Simplifying To Big Theta
When writing the big theta runtime for a function, we drop all lower terms and constants. You can think about this as being very similar to how we don’t care about lower terms and constants when we take limits in calculus.
3n^3 + 2n^2 + n → Θ(n^3)
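If you want to see why this is reasonable, here’s a small illustrative check (nothing you’d need on an exam, it just shows the idea): as n grows, the full expression behaves more and more like a constant multiple of n^3, and big theta drops that constant too.

for n in [10, 100, 1000, 10000]:
    full = 3 * n**3 + 2 * n**2 + n
    print(n, full / n**3)   # the ratio settles toward 3: the lower terms fade away,
                            # and the leftover constant 3 is dropped in big theta anyway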
How to Figure Out Runtimes
The runtime of a function is always dependent on the amount of “work” being done; the more work, the longer a function will take to run. In our addition example from earlier, adding the two numbers directly takes only one step (one “unit of work”), whereas the twenty-step process takes twenty units of work. Whenever you are trying to figure out a runtime, it can be helpful to think of it in terms of work.
Every line of a function does some amount of work, so every line affects the function’s runtime in some way. Many lines in a function, like simple operations or variable assignments, are constant time operations, and so have little impact on the big theta runtime. However, there are a few things that usually do have a significant impact on a function’s runtime (there’s a small example of all three right after this list):
Loops
Recursive calls
Calls to other functions
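To make those three concrete, here is a small made-up function (not one we’ll analyze later; the name and inputs are purely for illustration) with the significant lines pointed out in comments:

def example(lst, n):
    total = 0                        # simple assignment: constant time
    for x in lst:                    # a loop: runs once per element of lst
        total += x                   # constant time per iteration
    total += sum(lst)                # a call to another function: costs whatever sum costs
    if n > 0:
        total += example(lst, n - 1) # a recursive call: adds the cost of a whole other call
    return total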
Now how do you actually use this information to figure out a function’s big theta runtime? By following these three steps:
Figure out the runtime of each line of code that is not part of a loop or recursive call
Figure out the runtime of all loops
Figure out the runtime of all lines that are recursive calls
When doing step 1, go through each non-loop, non-recursive line of code and write its big theta runtime beside it; most simple operations (like adding two numbers, taking the max of two numbers, or assigning a variable) run in constant time, but for any calls to other functions you need to check the runtime of that function.
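For instance (this is just an illustrative snippet, not part of the example we analyze below), the first few lines are constant time, but the last one calls another function, so its cost depends on what that function does:

a, b = 3, 5
c = a + b              # Θ(1): adding two numbers
d = max(a, b)          # Θ(1): max of two numbers
lst = list(range(10))
e = sum(lst)           # a call to another function: sum walks the whole list,
                       # so this line costs Θ(len(lst)), not Θ(1)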
When doing step 2 and looking at loops, there are two important things to figure out about the loop: how many iterations of the loop there are, and the runtime of each iteration of the loop. If you can figure those two things out, then multiplying them together gives you the overall runtime of the loop (intuitively, this is just because we’re finding (number of iterations) x (amount of time each iteration takes)).
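As a quick hypothetical illustration of the (number of iterations) x (time per iteration) idea:

def pairs(n):
    count = 0
    for i in range(n):      # the outer loop runs n iterations...
        for j in range(n):  # ...and each one runs this inner loop, which does
            count += 1      # n constant-time steps, so each outer iteration is Θ(n)
    return count            # overall: n iterations x Θ(n) each = Θ(n^2)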
When doing step 3 and looking at recursive calls, you should already know the runtime of the rest of the function. A recursive call itself is just a function call, and the act of calling a function is a constant time operation (though running that function’s body may take longer than constant time). This means that the work done by one particular call to your recursive function, not counting the further recursive calls it makes, is just equal to the runtime of the rest of the function’s body. So if we have a recursive function f where everything except the recursive call runs in Θ(n^2) time, then the call f(n-1) does Θ((n-1)^2) work of its own, the call f(n/2) does Θ((n/2)^2) work of its own, and so on.
With recursive functions, there are likely to be many recursive calls with many different inputs. The best thing to do is to draw out a tree of the recursive calls, and use what was just explained in the paragraph above to figure out how much work each individual recursive call does. Then, you can sum up the runtime of each of the nodes in your tree to figure out the overall runtime of the recursive function.
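Here’s a tiny made-up warm-up before the real example: everything except the recursive call is constant time, so each node in the call tree does Θ(1) work, and the calls form a chain countdown(n), countdown(n-1), …, countdown(0). That’s n + 1 nodes doing Θ(1) work each, which sums to Θ(n) overall.

def countdown(n):
    if n == 0:                   # Θ(1) base case check
        return 0
    return 1 + countdown(n - 1)  # Θ(1) of work here, plus one recursive call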
Still super confused? Let’s walk through an example.
def oog(n):
    if n == 0:
        return 0
    total, counter = 0, n
    while counter > 0:
        total += 1
        counter -= 1
    return total + oog(n // 2)
Okay, so we can see that we have a loop and a recursive call…yikes. Let’s start with step 1: figure out the runtime of each line of code that is not part of a loop or recursive call. I’ll mark them with comments in the function below.
def oog(n):
    if n == 0:                 # Θ(1): a simple conditional check
        return 0               # Θ(1): a simple return
    total, counter = 0, n      # Θ(1): a simple assignment
    while counter > 0:
        total += 1
        counter -= 1
    return total + oog(n // 2)
So we can see that we have three lines that are not part of any loop and do not contain any recursive calls. Since they are all simple operations (a conditional check, a return statement, and a variable assignment), each of them will take constant time. Great! We can pretty much just ignore them now, as they will not really factor into our calculations anymore :)
def oog(n):
    if n == 0:
        return 0
    total, counter = 0, n
    while counter > 0:         # the loop we need to analyze in step 2
        total += 1
        counter -= 1
    return total + oog(n // 2)
Alright, so now let’s move on to step 2 and start thinking about our loop. We can see that the loop is going to run n iterations because counter starts out with value n, and each iteration, counter is decreased by 1. We can also see that the body of the loop takes constant time, since we have two simple operations (addition and subtraction). This means that the overall runtime of our loop is n x 1, which gives us Θ(n).
def oog(n):
    if n == 0:
        return 0
    total, counter = 0, n
    while counter > 0:
        total += 1
        counter -= 1
    return total + oog(n // 2)   # the recursive call for step 3
Now let’s move on to step 3 and focus on the return statement, which contains a recursive call. Let’s draw out a tree of the calls. Since oog makes only one recursive call per invocation, the “tree” is really a chain: every call has exactly one child. Each time we recursively call oog, we divide n by 2 (with floor division), and we keep doing so until we hit our base case, which is when n = 0. We also know that, for an input of size n, the rest of the function (excluding the recursive call) takes Θ(n) work (which we calculated above!). This means that the call oog(n // 2) does Θ(n/2) work of its own, the call oog(n // 4) does Θ(n/4) work of its own, and so on.
Now we can sum up the work done by every call in the chain: n + n/2 + n/4 + … + 1. In terms of the number of levels, we have about log base 2 of n of them, because each level divides n by 2 (if you’re not convinced of this, try it out with some small examples, like 8 or 16). Be careful, though: it’s tempting to multiply (log n levels) x (n work per level) and answer Θ(n log n), but only the very first call does n work; each level below it does half as much as the one above. The sum n + n/2 + n/4 + … + 1 is a geometric series that adds up to less than 2n, so summing the work across all of the levels gives an overall runtime of Θ(n).
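If you want to double-check that conclusion, here’s a small, purely illustrative sanity check (the helper name oog_work is made up for this demonstration): it counts the total number of loop iterations oog performs and compares that total to n. The ratio stays near 2 instead of growing like log n, which is exactly what the geometric series predicts.

def oog_work(n):
    if n == 0:
        return 0
    return n + oog_work(n // 2)   # n loop iterations at this call, plus the rest

for n in [8, 64, 1024, 10**6]:
    print(n, oog_work(n), oog_work(n) / n)   # the ratio hovers around 2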
If you’re still confused, I recommend practicing! It takes a while to build a good intuition for these problems, and only practice can help with that.