This deck explains:
1. What is asymptotic analysis?
2. Why do we need it?
3. Examples of notation
4. What are the various kinds of asymptotic analysis?
5. How to compute Big-O notation
6. Big-O examples
2. What is an Algorithm
An algorithm is a set of well-defined instructions, in sequence, to solve a problem.
An algorithm must have:
1. Precise inputs & outputs
2. Clear and unambiguous steps & instructions
3. Well-documented assumptions & constraints
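As a small illustration of these three requirements (a hypothetical example, not from the slides; the class and method names are my own), here is an algorithm with a precise input and output, unambiguous steps, and a documented assumption:

```java
public class MaxFinder {
    // Input: a non-empty int array (documented assumption).
    // Output: the largest element in the array.
    public static int max(int[] a) {
        int best = a[0];                  // step 1: start with the first element
        for (int i = 1; i < a.length; i++) {
            if (a[i] > best) best = a[i]; // step 2: keep the larger value
        }
        return best;                      // step 3: return the result
    }

    public static void main(String[] args) {
        System.out.println(max(new int[]{1, 5, 10, 15})); // prints 15
    }
}
```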
3. What is a Data Structure
• A data structure is a particular way of organizing data.
• The structure helps to use and process the data efficiently.
4. Runtime Performance – Efficiency – Complexity
Running an algorithm consumes two resources, the time taken and the memory consumed; together these define the efficiency of the algorithm.
• How much time does the algorithm take to run?
• How much memory does the algorithm take to run?
• Can we compute the efficiency of an algorithm without running it?
• Can we represent the efficiency of an algorithm mathematically?
5. What is Asymptotic Notation
A mathematical notation to represent the efficiency of an algorithm.
It defines the correlation between the input value(s) and the efficiency of the algorithm.
Asymptotic notation is machine independent: it does not depend on the CPU or memory of the system where the algorithm executes.
6. Why is asymptotic analysis required?
• Mathematically represent the runtime performance of an algorithm
• Show the correlation between input value(s) and runtime
• Compare two algorithms
• Coding interviews: interviewers expect candidates to compute the efficiency of the logic they write
7. Types of Asymptotic Analysis: Big Oh – Worst Case
Notation: O(n), where n is the input size.
Shows only the upper bound, i.e. the worst time or space taken.
Pros:
• Easy to derive
• Most commonly used
Cons:
• Can be pessimistic; it may not reflect typical runs
Example: searching the array [1, 5, 10, 15] for 15 (or for 100, which is absent) examines every element, which is the worst case.
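The worst case in the search example above can be made concrete in Java (a sketch; the class and method names are my own):

```java
public class LinearSearch {
    // Returns the index of target, or -1 if absent. Worst case: target is
    // the last element or not present, so all n elements are examined -> O(n).
    public static int search(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {1, 5, 10, 15};
        System.out.println(search(a, 15));  // last element: worst case, 4 comparisons
        System.out.println(search(a, 100)); // absent: also worst case, returns -1
    }
}
```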
8. Types of Asymptotic Analysis: Big Omega – Best Case
Notation: Ω(n), where n is the input size.
Shows only the lower bound, i.e. the most optimal efficiency of the algorithm.
Pros:
• Helps represent the most efficient execution
Cons:
• Does not cover all the cases of efficiency
Example: searching the array [1, 5, 10, 15] for 1 finds it at the first position, which is the best case.
9. Types of Asymptotic Analysis: Big Theta – Average Case
Notation: θ(n), where n is the input size.
Shows the average or realistic efficiency, taken over all possible inputs (often approximated as (best + worst) / 2).
Pros:
• Helps express the typical, realistic efficiency
Cons:
• Harder to compute
Example: searching the array [1, 5, 10, 15] for an arbitrary element (say 5 or 15) examines about half the array on average.
10. Big-Oh Examples
O(1): Constant time.
• The input size does not affect (increase/decrease) the efficiency of the algorithm.
• Typically a single statement or operation.
• Example: find whether a number is even or odd.
algorithm(x) {
    return (x * 500 + 2);
}
O(n): Linear time.
• The efficiency of the algorithm increases/decreases linearly with the input size.
• Typically a loop that iterates n times.
• Example: search for an element in an array.
algorithm(x) {
    for (int i = 0; i < x; i++) {
        // Assume O(1) logic
    }
}
algorithm(x[]) {
    for (int i = 0; i < x.length; i++) {
        // Assume O(1) logic
    }
}
O(log n): Logarithmic time.
• The efficiency of the algorithm increases/decreases logarithmically with the input size.
• Typically a loop that iterates far fewer than n times.
• This happens when the loop iterator is multiplied or divided by a constant c > 1 each iteration.
• Example: binary search, which halves the remaining range on every step.
algorithm(x) {
    for (int i = 1; i < x; i = i * c) { // i must start above 0, or it never grows
        // Assume O(1) logic
    }
}
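A concrete instance of the multiply/divide pattern is binary search (a sketch in Java; names are my own, and it assumes a sorted array):

```java
public class BinarySearch {
    // Each iteration halves the remaining range, so at most about
    // log2(n) + 1 iterations run -> O(log n). Assumes a is sorted ascending.
    public static int search(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // midpoint of the current range
            if (a[mid] == target) return mid;
            if (a[mid] < target) lo = mid + 1; // discard the lower half
            else hi = mid - 1;                 // discard the upper half
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        System.out.println(search(new int[]{1, 5, 10, 15}, 10)); // prints 2
    }
}
```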
O(n^c): Polynomial time.
• The efficiency of the algorithm grows polynomially with the input size.
• Typically multiple nested loops that together iterate n^c times, where c > 1.
• Examples: matrix multiplication, bubble sort.
algorithm(x) {
    for (int i = 0; i < x; i++) {
        for (int j = 0; j < x; j++) {
            // Assume O(1) logic
        }
    }
}
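Bubble sort, named on the slide, is a classic O(n^2) case: two nested loops over the input (a sketch; names are my own):

```java
public class BubbleSort {
    // The outer loop runs n-1 times and the inner loop up to n-1 times,
    // so roughly n * n comparisons in the worst case -> O(n^2).
    public static void sort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            for (int j = 0; j < a.length - 1 - i; j++) {
                if (a[j] > a[j + 1]) {
                    int tmp = a[j];      // swap adjacent out-of-order pair
                    a[j] = a[j + 1];
                    a[j + 1] = tmp;
                }
            }
        }
    }
}
```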
11. Big-Oh Examples
O(2^n): Exponential time.
• If the input is 2, the number of operations is 2 * 2 = 4. If the input is 4, the number of operations is 2 * 2 * 2 * 2 = 16.
• Example: find all subsets of a & b ➔ '', 'a', 'b', 'ab'.
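Subset generation shows why the count is 2^n: each element is either in or out of a subset, so every element doubles the number of subsets (a sketch; names are my own):

```java
import java.util.ArrayList;
import java.util.List;

public class Subsets {
    // Starts with the empty subset, then for each character doubles the
    // list by appending that character to every existing subset -> 2^n results.
    public static List<String> subsets(String s) {
        List<String> result = new ArrayList<>();
        result.add(""); // the empty subset
        for (char c : s.toCharArray()) {
            int size = result.size();
            for (int i = 0; i < size; i++) {
                result.add(result.get(i) + c); // "in" copies of each subset
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(subsets("ab")); // prints [, a, b, ab]
    }
}
```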
O(n!): Factorial time.
• If the input is 3, the number of operations is 3 * 2 * 1 = 6. If the input is 4, the number of operations is 4 * 3 * 2 * 1 = 24.
• Example: find all permutations of a string. abc ➔ abc, acb, bca, bac, cab, cba.
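The permutation example can be sketched recursively: n choices for the first position, n-1 for the second, and so on, giving n! results (names are my own):

```java
import java.util.ArrayList;
import java.util.List;

public class Permutations {
    // Builds every ordering of the characters of s: n! permutations total.
    public static List<String> of(String s) {
        List<String> result = new ArrayList<>();
        permute("", s, result);
        return result;
    }

    private static void permute(String prefix, String rest, List<String> out) {
        if (rest.isEmpty()) {          // nothing left to place
            out.add(prefix);
            return;
        }
        for (int i = 0; i < rest.length(); i++) {
            // choose rest[i] next, recurse on the remaining characters
            permute(prefix + rest.charAt(i),
                    rest.substring(0, i) + rest.substring(i + 1), out);
        }
    }

    public static void main(String[] args) {
        System.out.println(of("abc").size()); // prints 6
    }
}
```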
O(n^2): Quadratic time.
• The logic executes n * n times.
algorithm(x) {
    for (int i = 0; i < x; i++) {
        for (int j = 0; j < x; j++) {
            // Assume O(1) logic
        }
    }
}
O(n log n): Linearithmic time.
• The outer loop iterates n times; the inner loop iterates log n times.
algorithm(x) {
    for (int i = 0; i < x; i++) {
        for (int j = 1; j < x; j = j * c) { // j must start above 0, or it never grows
            // Assume O(1) logic
        }
    }
}
12. Order of Growth
Comparing how an increase in input size affects the efficiency:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)
(lowest growth on the left, highest on the right)
13. Compute Big Oh
Rule #1: Add/multiply the complexity of each logical block.
• O(n) + O(n^2) ➔ O(n + n^2)  // two loops in sequence
• O(2) + O(n) ➔ O(n + 2)  // two statements followed by a loop
• O(n) * O(3) ➔ O(3n)  // loop containing three statements
• O(n) * O(n) ➔ O(n^2)  // loop inside a loop
Rule #2: Ignore constants. Constants do NOT matter.
• O(3n) ➔ O(n)
• O(n + 5) ➔ O(n)
• O(n * (n/2)) ➔ O(n^2 / 2) ➔ O(n^2)
• O(n/2) ➔ O(n)
Rule #3: Ignore lower-order terms. As values get large, lower-order terms do NOT matter.
• O(n + n^2) ➔ O(n^2); the O(n) term is dropped.
• If the if-block is O(log n) and the else-block is O(n), the overall complexity is O(n); O(log n) is dropped.
Rule #4: An algorithm with two or more independent inputs must be expressed in terms of all of them, e.g. O(m + n) or O(m * n).
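Rules #1 and #3 can be checked by counting operations directly (a sketch; the class and numbers are my own): a single loop followed by a nested loop performs n + n^2 operations, and for large n the n term is negligible next to n^2.

```java
public class ComplexityRules {
    // Counts the operations of a method with one O(n) loop followed by
    // one O(n^2) nested loop: Rule #1 adds them to n + n^2, and Rule #3
    // then drops the lower-order n term, leaving O(n^2).
    public static long countOps(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) ops++;      // O(n) block
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) ops++;  // O(n^2) block
        return ops;                              // exactly n + n^2
    }

    public static void main(String[] args) {
        System.out.println(countOps(10));   // prints 110: the n^2 part dominates
        System.out.println(countOps(1000)); // prints 1001000
    }
}
```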
14. Big-Oh for Recursion
Method #1:
• Each function call has complexity O(1).
• For n calls there are n operations, so the efficiency is O(n).
Method #2:
• Draw the recursion graph for some sample values.
• Formula: (complexity of each call) * (number of branches per call) ^ (height of the call stack)
Call stack for factorial(5): factorial(5) ➔ factorial(4) ➔ factorial(3) ➔ factorial(2) ➔ factorial(1)
O(1) * 1^n ➔ O(n)
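The factorial example above can be sketched in Java to show the single-branch recursion (names are my own):

```java
public class Factorial {
    // One recursive call per invocation, and the stack goes n levels deep:
    // O(1) work per call * 1^n branches -> O(n) total.
    public static long factorial(int n) {
        if (n <= 1) return 1;         // base case ends the chain of calls
        return n * factorial(n - 1);  // exactly one branch per call
    }

    public static void main(String[] args) {
        System.out.println(factorial(5)); // prints 120
    }
}
```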
15. Big-Oh for Recursion
Method #1:
• Each function call has complexity O(1).
• Each call creates 2 more calls; those 2 calls create 4 more calls, and so on.
• So for input n the efficiency is O(2^n).
Method #2:
• Formula: (complexity of each call) * (number of branches per call) ^ (height of the call stack)
• O(1) * 2^n ➔ O(2^n)
Recursion tree for Fibonacci of 5: fib(5) branches into fib(4) and fib(3); each of those branches again, down to fib(1) and fib(0) at the leaves.
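The two-branch recursion above is the naive Fibonacci. A sketch with a call counter (names are my own) makes the exponential blow-up visible:

```java
public class Fib {
    static long calls = 0; // counts how many times fib() runs

    // Two branches per call, stack height about n:
    // O(1) * 2^n -> exponential number of calls.
    public static long fib(int n) {
        calls++;
        if (n < 2) return n;            // base cases: fib(0)=0, fib(1)=1
        return fib(n - 1) + fib(n - 2); // two branches per call
    }

    public static void main(String[] args) {
        System.out.println(fib(5));  // prints 5
        System.out.println(calls);   // prints 15: already 15 calls for n = 5
    }
}
```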
16. Big-Oh for Recursion
Method #2:
• Assume each level of the recursion does O(n) work in total (e.g. processing all 8 elements when n = 8).
• The input is halved at every level, so the height of the call stack is log2(n).
• (work per level) * (height of the call stack) ➔ O(n) * log2(n) ➔ O(n log n)
n:           2    4    8    16
log2(n):     1    2    3    4
n * log2(n): 2    8    24   64
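Merge sort is the standard example of this halving pattern (a sketch; names are my own): each level merges all n elements, and there are log2(n) levels, giving O(n log n).

```java
import java.util.Arrays;

public class MergeSort {
    // Splitting halves the input, so the recursion is log2(n) levels deep;
    // merging at each level touches all n elements -> O(n log n) overall.
    public static int[] sort(int[] a) {
        if (a.length <= 1) return a; // base case: already sorted
        int mid = a.length / 2;
        int[] left = sort(Arrays.copyOfRange(a, 0, mid));
        int[] right = sort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);   // O(n) work at this level
    }

    private static int[] merge(int[] l, int[] r) {
        int[] out = new int[l.length + r.length];
        int i = 0, j = 0, k = 0;
        while (i < l.length && j < r.length)       // take the smaller head
            out[k++] = (l[i] <= r[j]) ? l[i++] : r[j++];
        while (i < l.length) out[k++] = l[i++];    // drain leftovers
        while (j < r.length) out[k++] = r[j++];
        return out;
    }
}
```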