The document discusses asymptotic analysis and Big-O, Ω, and Θ notation for analyzing how algorithms scale with input size. It defines asymptotic analysis as describing the running time of an algorithm as the input size grows without bound, gives examples of using asymptotic notation to classify functions by their growth rates, and compares the orders of common functions such as n, n², n³, and log n.
2. Asymptotic notation
• Asymptotic efficiency of algorithms describes how the running time
of an algorithm increases with the size of the input in the limit, as
the size of the input increases without bound.
• The asymptotic running time of an algorithm is defined in terms of
functions whose domain is the set of natural numbers
N = {0, 1, 2, . . .},
i.e., running time is defined only on integer input sizes.
3. Asymptotic Analysis
• To compare two algorithms with running times f(n)
and g(n), we need a rough measure that
characterizes how fast each function grows.
• Hint: use rate of growth
4. Rate of Growth
• Consider the example of buying elephants and goldfish:
Cost: cost_of_elephants + cost_of_goldfish
Cost ~ cost_of_elephants (approximation)
• The low-order terms in a function are relatively insignificant for
large n:
n⁴ + 100n² + 10n + 50 ~ n⁴
i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same rate
of growth
5. Types of Analysis
• Worst case
– Provides an upper bound on running time
– An absolute guarantee that the algorithm would not run longer, no matter
what the inputs are
• Best case
– Provides a lower bound on running time
– Input is the one for which the algorithm runs the fastest
• Average case
– Provides a prediction about the running time
– Assumes that the input is random
(Diagram: best case gives a lower bound, worst case an upper bound on the running time.)
7. Big-O Notation
• We say fA(n) = 30n+8 is order n, or O(n):
it is, at most, roughly proportional to n.
• fB(n) = n²+1 is order n², or O(n²):
it is, at most, roughly proportional to n².
• In general, any function of order n² grows
faster than any function of order n.
8. Visualizing Orders of Growth
• On a graph, as you go to the right, the faster-growing
function eventually becomes larger.
(Graph: value of function vs. increasing n; fB(n) = n²+1 eventually exceeds fA(n) = 30n+8.)
10. Understanding Big-O notation
• f(n) = 2n² + n
• We want 2n² + n ≤ c · g(n) with g(n) = n²
• Try c = 3: 2n² + n ≤ 3n²
• n ≤ 3n² − 2n² = n²
Dividing both sides by n:
• 1 ≤ n
• n ≥ 1, so the inequality holds with n₀ = 1 and c = 3
12. More Examples
• Show that 30n+8 is O(n).
– Find c, n₀ such that 30n+8 ≤ cn for all n > n₀.
• 30n+8 ≤ c·g(n)
• 30n+8 ≤ cn; let c = 31
• 30n+8 ≤ 31n
• 8 ≤ n, i.e., n ≥ 8, so take n₀ = 8. Then
cn = 31n = 30n + n ≥ 30n+8, so 30n+8 ≤ cn for all n ≥ n₀ = 8
13. Big-O example, graphically
• Note 30n+8 isn’t less than n anywhere (n > 0).
• It isn’t even less than 31n everywhere.
• But it is less than 31n everywhere to the right of n = 8.
(Graph: value of function vs. increasing n; for n > n₀ = 8, cn = 31n lies above 30n+8, which is O(n).)
14. Examples
– n² = O(n²): n² ≤ cn² holds for any c ≥ 1; take c = 1 and n₀ = 1 (since n² = 1·n²).
– 1000n² + 1000n = O(n²): 1000n² + 1000n ≤ 1000n² + n² = 1001n² for n ≥ 1000; take c = 1001 and n₀ = 1000.
15. More Examples …
• n⁴ + 100n² + 10n + 50 is O(n⁴)
• 10n³ + 2n² is O(n³)
• n³ − n² is O(n³)
• constants
– 10 is O(1)
– 1273 is O(1)
16. What “upper bound” means in Big-O
• Suppose, for the same function f(n) = 2n² + n, we want 2n² + n ≤ c · n².
• Select c = 9:
• 2n² + n ≤ 9n²
• n ≤ 9n² − 2n² = 7n²
Dividing both sides by n:
• 1 ≤ 7n, which holds for all n ≥ 1
• So c = 9 with n₀ = 1 also satisfies the definition; it is simply a looser constant than c = 3.
• Big-O only requires SOME pair (c, n₀) for which the relation holds for all
n ≥ n₀; a smaller c gives a tighter description, but any valid pair proves the bound.
17. Bounding by higher-order functions
• The same is true for higher-order functions. For example:
• 2n² = O(n³): 2n² ≤ cn³ ⇔ 2 ≤ cn; take c = 1 and n₀ = 2.
• n = O(n²): n ≤ cn² ⇔ 1 ≤ cn; take c = 1 and n₀ = 1.
18. No Uniqueness
• There is no unique pair of values n₀ and c for proving an
asymptotic bound.
• Prove that 100n + 5 = O(n²):
– 100n + 5 ≤ 100n + n = 101n ≤ 101n²
for all n ≥ 5,
so n₀ = 5 and c = 101 is a solution.
– 100n + 5 ≤ 100n + 5n = 105n ≤ 105n²
for all n ≥ 1,
so n₀ = 1 and c = 105 is also a solution.
We must find SOME constants c and n₀ that satisfy the asymptotic-notation relation.
20. Back to Our Example
Algorithm 1                  Cost
arr[0] = 0;                  c1
arr[1] = 0;                  c1
arr[2] = 0;                  c1
...
arr[N-1] = 0;                c1
-----------
c1 + c1 + ... + c1 = c1 × N

Algorithm 2                  Cost
for(i=0; i<N; i++)           c2
  arr[i] = 0;                c1
-------------
(N+1) × c2 + N × c1 = (c2 + c1) × N + c2
• Both algorithms are of the same order: O(N)
21. Example (cont’d)
Algorithm 3                    Cost
sum = 0;                       c1
for(i=0; i<N; i++)             c2
  for(j=0; j<N; j++)           c2
    sum += arr[i][j];          c3
------------
c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N² = O(N²)
22. But, remember!
• When comparing two functions, we take the smallest n₀ for which
the relation holds, because once it holds it continues to hold for
all larger values.
• Comparing a goldfish with a shark is not meaningful, but you can
say that a shark is bigger than a dolphin.
23. • 1 < lg n < √n < n < n lg n < n² < n³ < … < 2ⁿ < 3ⁿ < … < nⁿ
• For n², all the functions on its left serve as lower bounds (for Big-Omega we
select the greatest one).
• All the functions on the right of n² serve as upper bounds (for Big-O we select
the least one).
• And what is n² itself? It is the tight (Θ) bound.
24. Asymptotic notations (cont.)
• Ω-notation

Ω(g(n)) is the set of functions
with larger or same order of
growth as g(n)
25. Understanding Big-Ω notation
• f(n) = Ω(g(n)) means f(n) ≥ c · g(n) for all n ≥ n₀
• f(n) = 2n² + n
• We want 2n² + n ≥ c · n²
• Try c = 2: 2n² + n ≥ 2n²
• n ≥ 2n² − 2n² = 0
• n ≥ 0 always holds, so n₀ = 0 and c = 2
26. Understanding Big-Ω notation
• Check 2n² + n ≥ 2n² at the boundary n₀ = 0 and beyond:
• For n = 0:
2(0)² + 0 ≥ 2(0)²
0 ≥ 0 ✓
• For n = 2:
2(2)² + 2 ≥ 2(2)²
8 + 2 ≥ 8
10 ≥ 8 ✓
27. Examples
– 5n² = Ω(n): ∃ c, n₀ such that 0 ≤ cn ≤ 5n²;
cn ≤ 5n² holds with c = 1 and n₀ = 1.
– n = Ω(2n), n³ = Ω(n²), n = Ω(log n)
In Big-Omega, the left function grows at least as fast as the right one.
33. Complexities of Iterative functions
• Consider the following loop:
for (i=1; i<=n; i++)
  for (j=1; j<=n; j++)
• i=1 runs once; the test i<=n runs n+1 times; i++ runs n times.
• For each outer iteration, j=1 runs once, the test j<=n runs n+1
times, and j++ runs n times.
Total = O(n²)
34. Complexities of Iterative functions
• Consider the following loop:
for (i=1; i<=n; i++)
  for (j=1; j<=n*n; j++)
• i=1 runs once; the test i<=n runs n+1 times; i++ runs n times.
• For each outer iteration, the inner loop runs n² times.
Total = O(n³)
35. Complexities of Iterative functions
• Consider the following loop:
for (i=1; i<=n; i++)
  for (j=i; j<=n; j++)
• For outer iteration i, the inner loop runs n − i + 1 times.
Inner loop complexity: Σᵢ₌₁ⁿ (n − i + 1) = n + (n−1) + … + 1 = n(n+1)/2
Total = O(n²)
36. Logarithms and properties
• In algorithm analysis we often use the notation “log n”
without specifying the base.
• lg n = log₂ n (binary logarithm); ln n = logₑ n (natural logarithm)
• lgᵏ n = (lg n)ᵏ; lg lg n = lg(lg n)
• Properties (for x, y > 0):
log(xy) = log x + log y
log(x/y) = log x − log y
log xʸ = y log x
a^(log_b x) = x^(log_b a)
log_b x = (log_a x) / (log_a b)
37. More Examples
• For each of the following pairs of functions, either f(n) is
O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine
which relationship is correct.
– f(n) = log n²; g(n) = log n + 5 → f(n) = Θ(g(n))
– f(n) = n; g(n) = log n² → f(n) = Ω(g(n))
– f(n) = log log n; g(n) = log n → f(n) = O(g(n))
– f(n) = n; g(n) = log² n → f(n) = Ω(g(n))
– f(n) = n log n + n; g(n) = log n → f(n) = Ω(g(n))
– f(n) = 10; g(n) = log 10 → f(n) = Θ(g(n))
– f(n) = 2ⁿ; g(n) = 10n² → f(n) = Ω(g(n))
– f(n) = 2ⁿ; g(n) = 3ⁿ → f(n) = O(g(n))