Rules of the Road
While the definitions of the asymptotic operators can be a bit tough to use directly, they actually lead to some of the
simplest math ever. You can drop all multiplicative and additive constants, as well as all other “small parts” of your
function, which simplifies things a lot.
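If you want to see what this buys you in practice, here's a quick sketch (a toy example of my own, not from the text): the smaller terms of f(n) = 3n² + 5n + 7 stop mattering as n grows, which is exactly why we can boil the whole thing down to Θ(n²).

# A made-up function with constants and "small parts":
# f(n) = 3n**2 + 5n + 7 is Theta(n**2), because the ratio f(n)/n**2
# settles down to the constant 3, and constants are what we drop.
def f(n):
    return 3*n**2 + 5*n + 7

for n in (10, 100, 1000, 10000):
    print(n, f(n) / n**2)   # 3.57, 3.0507, 3.005007, 3.00050007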
As a first step in juggling these asymptotic expressions, let’s take a look at some typical asymptotic classes, or
orders. Table 2-1 lists some of these, along with their names and some typical algorithms with these asymptotic
running times, also sometimes called running-time complexities. (If your math is a little rusty, you could take a look at
the sidebar “A Quick Math Refresher” later in the chapter.) An important feature of this table is that the complexities
have been ordered so that each row dominates the previous one: If f is found higher in the table than g, then f is O(g).⁵
Table 2-1. Common Examples of Asymptotic Running Times

Complexity   Name          Examples, Comments
Θ(1)         Constant      Hash table lookup and modification (see “Black Box” sidebar on dict).
Θ(lg n)      Logarithmic   Binary search (see Chapter 6). Logarithm base unimportant.⁷
Θ(n)         Linear        Iterating over a list.
Θ(n lg n)    Loglinear     Optimal sorting of arbitrary values (see Chapter 6). Same as Θ(lg n!).
Θ(n²)        Quadratic     Comparing n objects to each other (see Chapter 3).
Θ(n³)        Cubic         Floyd and Warshall’s algorithms (see Chapters 8 and 9).
O(nᵏ)        Polynomial    k nested for loops over n (if k is a positive integer). For any constant k > 0.
Ω(kⁿ)        Exponential   Producing every subset of n items (k = 2; see Chapter 3). Any k > 1.
Θ(n!)        Factorial     Producing every ordering of n values.
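Several of these rows translate directly into a few lines of Python. The following sketch is my own (the function names are made up, not from the text), but each function is a straightforward instance of the complexity it's labeled with:

from itertools import permutations

def linear_scan(seq):       # Theta(n): iterating over a list
    return [x for x in seq]

def all_pairs(seq):         # Theta(n**2): comparing n objects to each other
    seq = list(seq)
    return [(a, b) for i, a in enumerate(seq) for b in seq[i+1:]]

def all_subsets(seq):       # Omega(2**n): every subset of n items
    if not seq:
        return [[]]
    rest = all_subsets(seq[1:])
    return rest + [[seq[0]] + sub for sub in rest]

def all_orderings(seq):     # Theta(n!): every ordering of n values
    return [list(p) for p in permutations(seq)]

print(len(all_pairs(range(4))))       # 6, which is 4*3/2
print(len(all_subsets([1, 2, 3])))    # 8, which is 2**3
print(len(all_orderings("abc")))      # 6, which is 3!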
Note ■ Actually, the relationship is even stricter: f is o(g), where the “Little Oh” is a stricter version of “Big Oh.”
Intuitively, instead of “doesn’t grow faster than,” it means “grows slower than.” Formally, it states that f(n)/g(n) converges
to zero as n grows to infinity. You don’t really need to worry about this, though.
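If you're curious anyway, the convergence is easy to watch. A quick illustration of my own, with f(n) = lg n and g(n) = n:

import math

# lg n is o(n): the ratio lg(n)/n keeps shrinking toward zero.
for n in (10, 100, 1000, 10000):
    print(n, math.log2(n) / n)   # roughly 0.33, 0.066, 0.010, 0.0013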
Any polynomial (that is, with any power k > 0, even a fractional one) dominates any logarithm (that is, with any
base), and any exponential (with any base k > 1) dominates any polynomial (see Exercises 2-5 and 2-6). Actually, all
logarithms are asymptotically equivalent—they differ only by constant factors (see Exercise 2-4). Polynomials and
exponentials, however, have different asymptotic growth depending on their exponents or bases, respectively. So, n⁵ grows faster than n⁴, and 5ⁿ grows faster than 4ⁿ.
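The reason all logarithms collapse into one class is the change-of-base rule, log_b(n) = log_a(n)/log_a(b): switching bases just multiplies the function by a fixed constant. A quick check of my own:

import math

# log10(n) is always the same constant multiple of lg(n), namely 1/lg(10),
# so Theta(lg n) and Theta(log10 n) describe the same class.
for n in (10, 1000, 10**6, 10**9):
    print(n, math.log10(n) / math.log2(n))   # always about 0.30103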
The table primarily uses Θ notation, but the terms polynomial and exponential are a bit special, because of the role they play in separating tractable (“solvable”) problems from intractable (“unsolvable”) ones, as discussed in Chapter 11. Basically, an algorithm with a polynomial running time is considered feasible, while an exponential one is generally useless. Although this isn’t entirely true in practice (Θ(n¹⁰⁰) is no more practically useful than Θ(2ⁿ)), it is, in many cases, a useful distinction.⁶
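To put some numbers behind that parenthetical remark, here's a small sketch of my own. Python's arbitrary-precision integers let us find exactly where 2ⁿ overtakes n¹⁰⁰, and the answer shows why both are hopeless long before it matters:

# Find the first n (past the dip at small n) where 2**n exceeds n**100.
n = 2
while 2**n <= n**100:
    n += 1
print(n)                 # 997: the first n where 2**n wins
print(len(str(2**n)))    # 301 digits; n**100 has about 300 by then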
Because of this division, any running time in O(nᵏ), for any k > 0,
⁵ For the “Cubic” and “Polynomial” rows, this holds only when k ≥ 3.
⁶ Interestingly, once a problem is shown to have a polynomial solution, an efficient polynomial solution can quite often be found as well.
⁷ I’m using lg rather than log here, but either one is fine.