Objectives of this lecture
Introduction
Definition
f(n) is O(g(n))
[read f(n) is Big-O of g(n)]
if
there exists a constant c such that
|f(n)| ≤ c|g(n)| for all sufficiently large positive integers n
Under these conditions, we say that "f(n) has order at most g(n)" or "f(n) grows no more rapidly than g(n)".
Examples
1. If f(n) = 100n and we take c = 100, then we have
f(n) ≤ cn for all n ≥ 0.
Thus, f(n) is O(n).
2. If f(n) = 4n + 200 and we take c = 5, then we obtain
f(n) ≤ cn for all n ≥ 200.
Again, f(n) is O(n).
3. Let f(n) = n^2, and suppose we try to show that f(n) is O(n). Doing so would mean finding a constant c such that
n^2 ≤ cn for all sufficiently large n.
Dividing both sides by n, we would need
n ≤ c for all sufficiently large n.
Clearly this cannot hold, since c is a fixed constant while n grows without bound.
Thus, n^2 is not O(n).
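As a quick sanity check, the sketch below (my own addition, not part of the original notes) evaluates the inequality f(n) ≤ c·g(n) numerically for the three examples above; the function name is_big_o, the upper test limit, and the starting points are assumptions chosen to match the text.

    # Minimal sketch: check f(n) <= c*g(n) over a finite range of n.
    # The functions, constants c, and cutoffs mirror the three examples above.
    def is_big_o(f, g, c, n_start, n_end=10_000):
        """Return True if f(n) <= c*g(n) for every n in [n_start, n_end)."""
        return all(f(n) <= c * g(n) for n in range(n_start, n_end))

    print(is_big_o(lambda n: 100 * n,     lambda n: n, c=100, n_start=1))    # True  (example 1)
    print(is_big_o(lambda n: 4 * n + 200, lambda n: n, c=5,   n_start=200))  # True  (example 2)
    print(is_big_o(lambda n: n * n,       lambda n: n, c=100, n_start=200))  # False (example 3: n^2 outgrows cn)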
When analyzing algorithms, we normally take the worst case, which from the previous lecture we found to be n comparisons for sequential search.
That is, the performance of sequential search can be expressed in terms of n as f(n) = n.
It follows immediately that f(n) is O(n).
Even if we take the average case, f(n) = (n + 1)/2, we have
f(n) ≤ n for all n ≥ 1.
Thus f(n) is O(n).
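For reference, here is a minimal sequential search sketch in Python (my own illustration; the lecture does not prescribe an implementation). In the worst case, when the target is absent or sits in the last position, the loop examines all n items, which is where f(n) = n comes from.

    def sequential_search(items, target):
        # Scan left to right; the worst case (target absent or last) examines all n items.
        for i, value in enumerate(items):
            if value == target:
                return i        # found: return its index
        return -1               # not found after n comparisons

For example, sequential_search([7, 3, 9, 4], 4) inspects all four items before returning 3.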
For binary search, the worst case we found in the previous lecture was about log n comparisons.
Thus the running time of binary search is O(log n).
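Likewise, a minimal iterative binary search sketch (again my own illustration, assuming the input list is sorted). Each pass halves the remaining range, so at most about log2(n) passes are needed.

    def binary_search(sorted_items, target):
        # Each pass halves the search range [lo, hi], giving O(log n) passes.
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid          # found
            elif sorted_items[mid] < target:
                lo = mid + 1        # target can only be in the right half
            else:
                hi = mid - 1        # target can only be in the left half
        return -1                   # not found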
General Observations
If f(n) is a polynomial in n with degree r, then f(n) is O(n^r), but f(n) is not O(n^s) for any s less than r. For example, 3n^3 + 5n + 7 is O(n^3) but not O(n^2).
Any logarithm of n grows more slowly (as n increases) than any positive power of n.
Thus, log n is O(n^k) for any k > 0, but n^k is never O(log n) for k > 0.
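A quick numerical illustration of this (the values and the choice k = 0.5 are my own, not from the notes): the ratio log(n) / n^k shrinks toward 0 as n grows.

    import math

    # log n grows more slowly than any positive power of n; the ratio below
    # tends to 0 as n grows. k = 0.5 is an arbitrary illustrative choice.
    k = 0.5
    for n in (10, 10**3, 10**6, 10**9):
        print(f"n = {n:>10}    log(n)/n^{k} = {math.log(n) / n**k:.6f}")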
Common Orders
O(1) : computing time is constant (not dependent on n)
O(n) : computing time grows linearly with n
O(n^2) : quadratic
O(n^3) : cubic
O(2^n) : exponential
O(log n) : logarithmic
    n    | log n | n log n | n^2       | n^3           | 2^n
    -----+-------+---------+-----------+---------------+--------------
       1 |  0.00 |       0 |         1 |             1 | 2
      10 |  3.32 |      33 |       100 |         1,000 | 1,024
     100 |  6.64 |     664 |    10,000 |     1,000,000 | 1.268x10^30
    1000 |  9.97 |   9,966 | 1,000,000 |          10^9 | 1.072x10^301

(Logarithms are base 2.)
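The short script below (my own sketch, not part of the lecture) reproduces the growth-rate values in the table; it assumes base-2 logarithms, which matches the tabulated log n column.

    import math

    # Reproduce the growth-rate table: n, log n, n log n, n^2, n^3, 2^n.
    print(f"{'n':>6} {'log n':>8} {'n log n':>10} {'n^2':>12} {'n^3':>16} {'2^n':>14}")
    for n in (1, 10, 100, 1000):
        log_n = math.log2(n)                 # the table uses base-2 logarithms
        print(f"{n:>6} {log_n:>8.2f} {n * log_n:>10.0f} "
              f"{n**2:>12} {n**3:>16,} {float(2**n):>14.3e}")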