Notes for August 12, 2005
Complexity
We are interested in how problems scale; that is, how much
slower do programs become as the input size gets larger?
Here are the complexity classes we discussed (notes courtesy of Stuart Reges).
- O(n): Linear algorithms like linear search that we used to
implement indexOf in the original IntList class.
- O(log n): Logarithmic algorithms like binary search that we used
in SortedIntList.
- O(1): Constant time algorithms whose execution time does not
depend on n (e.g., elementData[i]).
- O(n^2): Quadratic algorithms like the removeAll method
from section that had a loop that called remove, which itself had a loop
to shift array elements to the left.
- O(n^3): Cubic algorithms that would typically involve a
triply-nested loop like matrix multiplication.
- O(n log n): This falls between the O(n) and O(n^2)
algorithms. Merge sort is one such algorithm.
- O(2^n): Exponential algorithms. These algorithms are so
slow that they are only practical for small inputs.
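As a concrete contrast between the O(n) and O(log n) classes above, here is a minimal sketch of linear search versus binary search over a sorted array. The class and method names are illustrative, not the actual code from the IntList or SortedIntList classes:

```java
// Sketch contrasting O(n) linear search with O(log n) binary search.
// Assumes the array is sorted in ascending order.
public class SearchDemo {
    // O(n): examine each element in turn until a match is found.
    public static int linearSearch(int[] data, int target) {
        for (int i = 0; i < data.length; i++) {
            if (data[i] == target) {
                return i;
            }
        }
        return -1;  // not found
    }

    // O(log n): halve the search range on each iteration.
    public static int binarySearch(int[] data, int target) {
        int lo = 0;
        int hi = data.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            if (data[mid] == target) {
                return mid;
            } else if (data[mid] < target) {
                lo = mid + 1;   // target must be in the upper half
            } else {
                hi = mid - 1;   // target must be in the lower half
            }
        }
        return -1;  // not found
    }
}
```

Doubling the array size adds one more iteration to binary search but doubles the work of linear search, which is the practical difference between the two classes.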
Maximum Subsequence Sum
Below is a table showing the relationship between the algorithm and the
actual number of steps it takes depending on the array size.
Algorithm | Number of steps       | Order of growth
----------|-----------------------|----------------
1         | (n^3 + 3n^2 + 2n) / 6 | O(n^3)
2         | (n^2 + n) / 2         | O(n^2)
3         | n                     | O(n)
It is easy to see that the third algorithm takes exactly n
steps. To see how to compute the number of steps for the second
algorithm, notice that start ranges from 0 to
(list.length-1). For each value of start,
stop ranges from start to (list.length-1). So
for each value of start, we can see how many times the
lineCount variable is updated:
start | Number of times lineCount is updated
------|-------------------------------------
0     | n
1     | n-1
2     | n-2
...   | ...
n-1   | 1
The right column is the series n, n-1, ..., 1, which is just the
sum of the integers from 1 to n. That sum is
(n * (n + 1)) / 2, or (n^2 + n) / 2.
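The nested start/stop loop structure described above can be sketched as follows. The running-sum detail is my assumption about how algorithm 2 works, and the step counter stands in for the lineCount variable mentioned in the notes:

```java
// Sketch of the O(n^2) algorithm: try every (start, stop) pair,
// keeping a running sum as stop advances. stepCount mirrors the
// notes' lineCount variable; this is a reconstruction, not the
// course's exact code.
public class QuadraticMaxSum {
    public static int stepCount = 0;  // total inner-loop iterations

    public static int maxSum(int[] list) {
        stepCount = 0;
        int best = 0;
        for (int start = 0; start < list.length; start++) {
            int sum = 0;
            for (int stop = start; stop < list.length; stop++) {
                sum += list[stop];  // sum of list[start..stop]
                stepCount++;        // the "lineCount" update
                if (sum > best) {
                    best = sum;
                }
            }
        }
        return best;  // stepCount ends at (n^2 + n) / 2
    }
}
```

For an array of length 6, the inner line runs 6 + 5 + 4 + 3 + 2 + 1 = 21 times, matching (n^2 + n) / 2.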