Asymptotic Notations
1. Presentation on
Time and Space
Complexity, Average and worst
case analysis,
and Asymptotic Notations
Presented By – Mr. Rishabh Soni
Guided By – Mr. M.A. Rizvi
2. Roadmap
• Algorithmic Complexity
• Time and space complexity
• Need for Complexity Analysis
• Average and worst case analysis
• Why worst case analysis?
• The importance of Asymptotics
• Asymptotic Notations - Θ, O, Ω, etc.
• Relations between Θ, O, Ω
• Comparison of functions
3. Algorithmic Complexity
Algorithmic complexity is a very important topic in computer science.
Knowing the complexity of algorithms allows you to answer questions
such as
• How long will a program run on an input?
• How much space will it take?
• Is the problem solvable?
These are important bases of comparison between different
algorithms. An understanding of algorithmic complexity provides
programmers with insight into the efficiency of their code. Complexity
is also important to several theoretical areas in computer
science, including algorithms, data structures, and complexity theory.
4. Time and Space Complexity
Time Complexity –
The time complexity of an algorithm is the amount of time it requires to execute.
It is measured in terms of the number of operations rather than computer time, because computer time depends on the hardware, processor, etc.
A general ordering we may consider:
O(c) < O(log n) < O(n) < O(n log n) < O(n^c) < O(c^n) < O(n!) < O(n^n), where c is some constant.
Big-O Notation   Examples of Algorithms
O(1)             Push, Pop, Enqueue (if there is a tail reference), Dequeue, accessing an array element
O(log n)         Binary search
O(n)             Linear search
O(n log n)       Heap sort, Quick sort (average), Merge sort
O(n^2)           Selection sort, Insertion sort, Bubble sort
O(n^3)           Matrix multiplication
O(2^n)           Towers of Hanoi
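The ordering above can be checked empirically by counting operations instead of wall-clock time. A minimal sketch (the function names and the choice to count comparisons are illustrative, not from the slides):

```python
# Count comparisons (not wall-clock time) for linear vs. binary search.
# Assumes a sorted list; comparison counts serve as the cost measure.

def linear_search(arr, target):
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

def binary_search(arr, target):
    comparisons = 0
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if arr[mid] == target:
            return mid, comparisons
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

data = list(range(1_000_000))
_, lin = linear_search(data, 999_999)   # O(n): 1,000,000 comparisons
_, bin_ = binary_search(data, 999_999)  # O(log n): at most ~20 comparisons
print(lin, bin_)
```

On a million sorted elements the gap between O(n) and O(log n) is already five orders of magnitude.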
5. Space Complexity –
The space complexity of an algorithm is the amount of memory it needs to run to completion.
Space complexity can be defined as the amount of computer memory required during program execution, as a function of the input size.
A key difference between space complexity and time complexity is that space can be reused.
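As an illustration of the distinction, here is a minimal sketch (the helper names are hypothetical) of two routines with the same O(n) running time but different space requirements:

```python
# Two ways to sum 1..n: both take O(n) time, but the first uses
# O(n) extra space while the second uses O(1) extra space.

def sum_with_list(n):
    values = list(range(1, n + 1))  # materializes n integers: O(n) space
    return sum(values)

def sum_streaming(n):
    total = 0
    for i in range(1, n + 1):       # one counter and one accumulator: O(1) space
        total += i
    return total

print(sum_with_list(100), sum_streaming(100))  # both print 5050
```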
6. Complexity : Why Bother?
• Estimation/Prediction:
When you write or run a program, you need to be able to predict its requirements.
• Usual Requirements :
- execution time
- memory space
• Quantities to estimate :
- execution time -> time complexity
- memory space -> space complexity
• It is pointless to run a program that requires:
- 64TB of RAM on a desktop machine.
- 10,000 years to run
• You do not want to wait for an hour:
- for the result of your query on Google.
- when you are checking your bank account online.
- when you are opening a picture file in Photoshop.
It is important to write efficient algorithms.
7. Average and Worst case Analysis
Worst-case complexity:
The worst case complexity is the
complexity of an algorithm when the
input is the worst possible with
respect to complexity.
Average Complexity:
The average complexity is the
complexity of an algorithm that is
averaged over all possible inputs
(assuming a uniform distribution
over the inputs).
[Figure: running times (1–5 ms) for seven inputs A–G, marking the worst case, the best case, and the average case.]
8. Why Worst Case Analysis?
Worst case running time : It is the longest running time for any input of
size n. We usually concentrate on finding only the worst-case running
time, that is, the longest running time for any input of size n, because
of the following reasons:
• The worst-case running time of an algorithm gives an upper bound
on the running time for any input. Knowing it provides a guarantee
that the algorithm will never take any longer.
• For some algorithms, the worst case occurs fairly often. For
example, in searching a database for a particular piece of
information, the searching algorithm’s worst case will often occur
when the information is not present in the database.
• The “average case” is often roughly as bad as the worst case.
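The point about the average case can be made concrete with linear search. A minimal sketch, assuming the target is equally likely to be at any of the n positions:

```python
# Worst-case vs. average-case comparison counts for linear search,
# assuming each of the n positions is equally likely to hold the target.

def comparisons_to_find(arr, target):
    for i, x in enumerate(arr):
        if x == target:
            return i + 1
    return len(arr)  # absent key: the worst case, n comparisons

n = 1000
data = list(range(n))

worst = comparisons_to_find(data, -1)                          # n = 1000
average = sum(comparisons_to_find(data, t) for t in data) / n  # (n+1)/2 = 500.5
print(worst, average)
```

The average (n+1)/2 is half the worst case n, so both are Θ(n): "roughly as bad as the worst case."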
9. The Importance of Asymptotics
• Asymptotic notation has many important benefits, which might not be
immediately obvious.
• An algorithm with an asymptotically slower running time (for example, one that
is O(n^2)) is beaten in the long run by an algorithm with an asymptotically
faster running time (for example, one that is O(n log n)), even if the
constant factor for the faster algorithm is worse.
Running Time    Maximum Problem Size (n)
                1 second    1 minute    1 hour
400n            2,500       150,000     9,000,000
20n ⌈log n⌉     4,096       166,666     7,826,087
2n^2            707         5,477       42,426
n^4             31          88          244
2^n             19          25          31
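The table entries can be reproduced with a short script, assuming (as the 400n row implies) a machine performing 1,000,000 operations per second; `max_n` is a hypothetical helper, not part of the slides:

```python
# Largest problem size n solvable within a budget of operations,
# assuming 1,000,000 operations per second (so 400n in 1 second gives n = 2,500).

def max_n(cost, budget_ops):
    """Largest n with cost(n) <= budget_ops, found by doubling then bisecting."""
    n = 1
    while cost(2 * n) <= budget_ops:
        n *= 2
    lo, hi = n, 2 * n
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if cost(mid) <= budget_ops:
            lo = mid
        else:
            hi = mid - 1
    return lo

second = 1_000_000
print(max_n(lambda n: 400 * n, second))    # 2500
print(max_n(lambda n: 2 * n * n, second))  # 707
print(max_n(lambda n: 2 ** n, second))     # 19
```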
10. Asymptotic Analysis
• Goal : to simplify the analysis of running time by
getting rid of “details” which may be affected by
specific implementation and hardware
like “rounding”: 1,000,001 ≈ 1,000,000,
and 3n^2 ≈ n^2.
• Capturing the essence : how the running time of an
algorithm increases with the size of the input in the
limit.
Asymptotically more efficient algorithms are best for all
but small inputs.
11. Asymptotic Notations
• Θ, O, Ω, o, ω
• Defined for functions over the natural numbers.
– Ex: f(n) = Θ(n^2).
– Describes how f(n) grows in comparison to n^2.
• Each notation defines a set of functions; in practice it is used to
compare the growth of two functions.
• Asymptotic notation is useful because it allows us
to concentrate on the main factor determining a
function's growth.
12. Θ-notation
• Θ-notation bounds a function to within constant factors.
• Definition:
For a given function g(n), we denote by Θ(g(n)) the set of functions
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }.
• Explanation:
We write f(n) = Θ(g(n)) if there exist positive constants n0, c1, and c2 such that at and to the right of n0, the value of f(n) always lies between c1 g(n) and c2 g(n) inclusive.
• We say that g(n) is an asymptotically tight bound for f(n).
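The definition can be checked numerically for a concrete function. A minimal sketch with hand-picked (assumed) constants witnessing f(n) = Θ(n^2) for f(n) = 3n^2 + 2n:

```python
# Witness f(n) = Theta(n^2) for f(n) = 3n^2 + 2n with the assumed,
# hand-picked constants c1 = 3, c2 = 4, n0 = 2:
#   3n^2 <= 3n^2 + 2n always, and 3n^2 + 2n <= 4n^2 whenever n >= 2.

def f(n):
    return 3 * n * n + 2 * n

c1, c2, n0 = 3, 4, 2
ok = all(c1 * n * n <= f(n) <= c2 * n * n for n in range(n0, 10_000))
print(ok)  # True: f(n) stays squeezed between c1*n^2 and c2*n^2 for n >= n0
```

A finite check does not prove the bound, but here the algebra (2n ≤ n^2 for n ≥ 2) confirms it holds for all n ≥ n0.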
13. O-notation
• We use O-notation to give an upper bound on a function, to within a constant factor.
• Definition:
For a given function g(n), we denote by O(g(n)) the set of functions
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n0 }.
• Explanation:
We write f(n) = O(g(n)) if there are positive constants n0 and c such that at and to the right of n0, the value of f(n) always lies on or below c g(n).
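Similarly, a minimal numeric sketch witnessing f(n) = O(n) for f(n) = 5n + 3, with assumed constants c = 6 and n0 = 3:

```python
# Witness f(n) = O(n) for f(n) = 5n + 3 with the assumed constants
# c = 6 and n0 = 3: 5n + 3 <= 6n whenever n >= 3.

def f(n):
    return 5 * n + 3

c, n0 = 6, 3
ok = all(0 <= f(n) <= c * n for n in range(n0, 10_000))
print(ok)  # True: f(n) lies on or below c*g(n) = 6n for all n >= n0
```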
14. Ω-notation
• Ω-notation provides an asymptotic lower bound on a function.
• Definition:
For a given function g(n), we denote by Ω(g(n)) the set of functions
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n0 }.
• Explanation:
We write f(n) = Ω(g(n)) if there are positive constants n0 and c such that at and to the right of n0, the value of f(n) always lies on or above c g(n).
16. o-notation
• The upper bound provided by O-notation may or may not be
asymptotically tight. We use o-notation to denote an upper
bound that is not asymptotically tight.
• We formally define o(g(n)) as the set
o(g(n)) = { f(n) : for any positive constant c > 0, there exists
a constant n0 > 0 such that 0 ≤ f(n) < c g(n) for all n ≥ n0 }.
• In o-notation, the function f(n) becomes insignificant
relative to g(n) as n approaches infinity; that is,
lim (n→∞) [f(n) / g(n)] = 0.
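The limit can be illustrated numerically, e.g. for f(n) = n and g(n) = n^2 (so n = o(n^2)):

```python
# The o-notation limit for f(n) = n, g(n) = n^2: the ratio
# f(n)/g(n) = 1/n shrinks below any fixed c > 0 as n grows.

ratios = [n / (n * n) for n in (10, 100, 1000, 10000)]
print(ratios)  # [0.1, 0.01, 0.001, 0.0001] -> tends to 0
```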
17. ω-notation
• We use ω-notation to denote a lower bound that is
not asymptotically tight.
• Formal definition:
ω(g(n)) = { f(n) : for any positive constant c > 0,
there exists a constant n0 > 0 such that 0 ≤ c g(n) <
f(n) for all n ≥ n0 }.
• The relation f(n) = ω(g(n)) implies that
lim (n→∞) [f(n) / g(n)] = ∞.
18. Relations Between Θ, O, Ω
• Theorem: For any two functions g(n) and f(n),
f(n) = Θ(g(n)) if and only if
f(n) = O(g(n)) and f(n) = Ω(g(n)).
• i.e. Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
• In practice, asymptotically tight bounds are
obtained from asymptotic upper and lower
bounds.
19. Comparison of Functions
Comparing two functions f and g is analogous to comparing two numbers a and b:
f(n) = O(g(n))  ≈  a ≤ b
f(n) = Ω(g(n))  ≈  a ≥ b
f(n) = Θ(g(n))  ≈  a = b
f(n) = o(g(n))  ≈  a < b
f(n) = ω(g(n))  ≈  a > b