Notes for July 6, 2005

The readings cover interfaces, stacks, and queues thoroughly, so these notes only discuss wrapper classes and complexity. Notes on casting will appear with the notes on inheritance in the next lecture.

Wrapper Classes

The Stack interface lets you push any type of object onto the stack. However, as you may recall from the reading, an int is not an object (reference) type. So how can you push an int onto the stack? The answer is: you can't--at least, not directly. You need an object representation of that int, and that is what wrapper classes are for. In java.lang, there is a wrapper class for each of the primitives. They are easy to pick out, because their names mirror the names of the primitives. The wrapper class for integers is Integer (duh). Its description also states: "The Integer class wraps a value of the primitive type int in an object." (my emphasis)

To add a particular int x onto a stack s, you would write something like:

s.push(new Integer(x));

To retrieve the value from an Integer object number, you can use the intValue() method:

int x = number.intValue();
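
Putting those two pieces together, here is a minimal, self-contained sketch. It uses java.util.Stack only so that there is something concrete to push onto; the Stack interface from the readings looks the same from the caller's side. Getting the Integer back out of the stack requires a cast, which is deferred to the next lecture, so the sketch unwraps a local variable instead.

import java.util.Stack;

public class WrapperDemo {
    public static void main(String[] args) {
        Stack s = new Stack();
        int x = 42;

        // An int is not an object, so wrap it in an Integer first.
        Integer wrapped = new Integer(x);
        s.push(wrapped);

        // intValue() recovers the primitive value from the wrapper.
        int y = wrapped.intValue();
        System.out.println(y);   // prints 42
    }
}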

Complexity Theory

Complexity theory concerns itself with the question of how programs scale, i.e., what happens to the program when given larger inputs? There are two dimensions that a programmer is interested in: time (how long the algorithm takes to execute) and space (how much memory the algorithm consumes while executing). For this class, we are only interested in time. How long will it take to run a particular program? Will the program grind to a halt with twice as much input? What if the input is quadrupled?

This section will cover the very basics of complexity theory; we'll go into more depth at the end of the quarter. For now, all you need to know is the difference between O(1) (pronounced "big oh of 1" or just "oh of 1") and O(n). The expression inside the parentheses can be any function of the size of the input, n; others include O(log n), O(n^2), O(2^n), and so on. (Yes, 1 is a valid function of n. It is a constant function.) In a sense, the expression states how many "steps" it will take to perform a particular operation. To say that a method m takes O(n) time means that if there are n pieces of data, it will take m on the order of n steps to complete.

Does that mean that every method that is O(1) takes only one step to complete? The answer is no. Notice the phrase "on the order of" in the last sentence of the previous paragraph. We are only interested in orders of growth, i.e., how the program scales with really large data. It does not matter whether it takes one step or a million steps--in the grand scheme of things, they amount to the same thing. If there were ten billion pieces of data (n = 10,000,000,000), a million steps is a drop in the bucket. For an O(1) method, the number of steps is constant, regardless of the size of the data.

However, if the method in question is O(n), then the more data there is, the longer it will take. The running time grows proportionally with the size of the data: processing one data set will take half as long as processing another data set twice its size.
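
To make the difference concrete, here are two hypothetical methods over an int array (the method names and the array parameter are made up for illustration). The first does the same tiny amount of work no matter how long the array is, so it is O(1); the second touches every element, so it is O(n), and its running time doubles when the array doubles.

// O(1): grabbing the first element takes the same number of steps
// whether the array holds ten elements or ten billion.
public static int first(int[] data) {
    return data[0];
}

// O(n): summing visits every element once, so twice as much
// data means twice as many steps.
public static int sum(int[] data) {
    int total = 0;
    for (int i = 0; i < data.length; i++) {
        total += data[i];
    }
    return total;
}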

As an example, consider the add method in the vanilla IntList class. That method is O(n). However, when we add back-pointers, it becomes O(1). Why?
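
Here is a rough sketch of the idea. The class and field names below are made up for illustration, not the actual IntList code from the readings, and the assumption is that the back pointer is a reference to the last node in the list. Without it, add must walk the entire list to find the end, which takes on the order of n steps; with it, add jumps straight to the last node and appends in a constant number of steps.

// Sketch only -- names are illustrative, not the readings' actual code.
class IntNode {
    int value;
    IntNode next;
}

class IntListSketch {
    private IntNode front;   // first node, or null if the list is empty
    private IntNode back;    // last node, or null if the list is empty

    // O(n): with only a front pointer, finding the end means
    // walking past every existing node.
    public void addWithoutBackPointer(int value) {
        IntNode node = new IntNode();
        node.value = value;
        if (front == null) {
            front = node;
        } else {
            IntNode current = front;
            while (current.next != null) {   // about n steps for n elements
                current = current.next;
            }
            current.next = node;
        }
    }

    // O(1): the back pointer already knows where the end is,
    // so appending is a fixed handful of steps.
    public void addWithBackPointer(int value) {
        IntNode node = new IntNode();
        node.value = value;
        if (back == null) {
            front = node;
        } else {
            back.next = node;
        }
        back = node;
    }
}

A real list class would have only one add method; the two versions appear side by side here purely to contrast the number of steps each one takes.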

Now refer back to the readings and see if you can understand why some operations are O(n) and why some are O(1).