Edit: Big-O notation is supposed to give you a sense of how an algorithm will scale as the size of the dataset it works on increases. If an algorithm takes 1 second to process one piece of data and 10 seconds to process 10 pieces, it may be linear. O(n) represents linear -- graph out y = x to understand that. An algorithm that takes 1 second for one item but 100 seconds for 10 items might be on the order of n^2: as the data set increases, the running time grows much faster than the data does.

In your assignment, the first several problems already provide you the complexity of different components of some algorithm (the actual algorithm is unimportant, as the analysis has already been done for you). The second part of your assignment gives you a chance to come up with those complexity functions yourself.

Big O is not concerned with the ACTUAL running time, but rather with the overall speed of growth as the data set increases. The mentality here is that no matter how fast you make your code, the growth rate of the curve will always look the same. It is really used to compare algorithms to one another rather than to provide any real indication of how long a particular algorithm will take.
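If you want to see that scaling for yourself, here is a minimal Python sketch (the function names are just mine for illustration) that counts operations instead of timing them, so you can watch how the count changes when N goes up by 10x:

    def linear_work(n):
        # one unit of work per item -> O(n)
        ops = 0
        for _ in range(n):
            ops += 1
        return ops

    def quadratic_work(n):
        # one unit of work per PAIR of items -> O(n^2)
        ops = 0
        for _ in range(n):
            for _ in range(n):
                ops += 1
        return ops

    for n in (10, 100, 1000):
        print(n, linear_work(n), quadratic_work(n))
    # 10x more data -> 10x more work for linear, but 100x more for
    # quadratic, which matches the 10-second vs 100-second intuition above.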
In Big-O notation, the largest "order" is the winner. Until you get used to identifying order by sight, a good way to handle these problems is to:
1) split the function into a sum of parts
2) for each part, replace any constant coefficients with 1
3) keep ONLY the part that grows the fastest on a graph
4) repeat steps 1-3 until you are unable to split the function up any more
(A quick numeric sanity check of step 3 is sketched right after this list.)
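If you're ever unsure which part grows fastest, you can check numerically. Here's a minimal Python sketch (the helper name dominant_term is made up) that evaluates each part at a big value of N and reports the winner:

    def dominant_term(parts, n=10**6):
        # Evaluate each part at a big N and return the label of the largest.
        # This is a heuristic check, not a proof, but for polynomial and
        # log terms it picks out the Big-O winner.
        return max(parts, key=lambda label: parts[label](n))

    # Example: problem a) below, N^2 + 3N, after step 2 drops the 3
    parts = {"N^2": lambda n: n**2, "N": lambda n: n}
    print(dominant_term(parts))   # prints N^2, so the answer is O(N^2)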
a) Apply the procedure I laid out:
1. split into N^2 and 3N
2. throw away the 3. N^2 and N remain
3. N^2 grows fastest. Keep it.
4. We can't split up N^2 so we are done. The answer is O(N^2)
b)
1. 3N^2 and N
2. N^2 and N
3. N^2 grows fastest
4. Done. It is O(N^2)
c)
1. N^5, 100N^3, 245
2. N^5, N^3, 1
3. N^5 grows fastest
4. Done, it is O(N^5)
d)
1. 3N*log2(N) and N^2
2. N*log2(N) and N^2 (I'm not sure whether that term is log2(N) or log(2N), but it doesn't matter: log(2N) = log(2) + log(N), which is the same order)
3. log2(N) grows slower than N, so N*log2(N) grows slower than N^2; N^2 grows fastest
4. Done, it is O(N^2)
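If step 3 feels hand-wavy, a quick numeric check (Python, just for illustration) settles it:

    import math

    for n in (10, 1000, 100000):
        print(n, n * math.log2(n), n ** 2)
    # N*log2(N) falls further and further behind N^2 as N grows,
    # so N^2 dominates.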
you can do e)
f)
1. (N * (N - 1)) / 2 -- this is just one part
2. the 1/2 is a constant coefficient, so drop it: N * (N - 1)
3. keep it all (there's nothing else to compare) and multiply it out to N^2 - N, then repeat the steps:
1. N^2, -N
2. N^2, N
3. N^2 grows fastest
4. Done... O(N^2)
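That (N * (N - 1)) / 2 formula is exactly what you get from a nested loop whose inner loop starts just past the outer index -- a very common pattern. Here's a minimal sketch (Python, my own illustration) that counts the iterations and checks the formula:

    def triangular_count(n):
        # inner loop runs (n-1) + (n-2) + ... + 1 times in total
        count = 0
        for i in range(n):
            for j in range(i + 1, n):
                count += 1
        return count

    for n in (5, 10, 100):
        assert triangular_count(n) == n * (n - 1) // 2
        print(n, triangular_count(n))
    # The count matches N*(N-1)/2 exactly, and grows like N^2.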
44. I'll give you the formulae... you can solve the order. Notice that, to be absolutely complete, I can represent EVERY line in the order equation. Loops multiply... everything else adds. (The sketch just below shows how formulas like a) and b) map onto real loops.)
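To see what "loops multiply, everything else adds" means line by line, here is a sketch (Python, hypothetical loops of my own) with each line's cost noted in a comment:

    def single_loop(n):        # order equation: 1 + N * (1)
        x = 0                  # 1
        for _ in range(n):     # the loop multiplies what's inside by N
            x += 1             # 1
        return x               # O(N) overall

    def nested_loop(n):        # order equation: 1 + N * (N * (1))
        x = 0                  # 1
        for _ in range(n):     # outer loop multiplies by N
            for _ in range(n): # inner loop multiplies by N again
                x += 1         # 1
        return x               # O(N^2) overall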
a) 1 + N * (1)
b) 1 + N * (N * (1))
c) 1 + 1 + ...
ok... this can be tricky... you have to investigate the algorithm to see how long it will take to run. Worst case, you can keep a counter inside the loops, vary N, and see how the counter relates to it. Looking at this one, I can tell that it divides the number over and over until it reduces to 1, and that takes log2(N) steps.

Not all algorithms are open to analysis. Whether a program halts is "decidable" when you can always determine the answer. It is only "partially decidable" (also called semi-decidable) when you can confirm halting if it happens, but can never rule out that it runs forever. And in general, the halting problem is "undecidable": no procedure can answer it for every program. The further down that list you go, the worse off you are -- but you are probably not there in your studies and will not encounter such problems.
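Here's that counter trick on this algorithm -- a minimal sketch (Python, my own illustration of the halving idea):

    def halving_steps(n):
        # counts how many times n can be halved before it reduces to 1
        steps = 0
        while n > 1:
            n //= 2
            steps += 1
        return steps

    for n in (8, 1024, 10**6):
        print(n, halving_steps(n))
    # 8 -> 3, 1024 -> 10, 10**6 -> 19 (log2 of 10**6 is about 19.9):
    # the counter tracks log2(N), so the loop body runs log2(N) times.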
so... 1 + 1 + log2(N) * (1 + 1)
Not even tricky...
1. 1, 1, log2(N), log2(N)
2. same (no constant coefficients to drop)
3. log2(N) grows the fastest (pick either one, they are the same)
4. Can't break up log2(N), so we're done. The answer is O(log2(N)). (And since log bases only differ by a constant factor, that is the same as O(log N).)