Introduction to Algorithms
6.046J/18.401J

LECTURE 15: Dynamic Programming
• Longest common subsequence
• Optimal substructure
• Overlapping subproblems

Prof. Charles E. Leiserson
November 7, 2005
Copyright © 2001-5 by Erik D. Demaine and Charles E. Leiserson
Dynamic programming
Design technique, like divide-and-conquer.

Example: Longest Common Subsequence (LCS)
• Given two sequences x[1 . . m] and y[1 . . n], find
  a longest subsequence common to them both
  ("a" longest, not "the" longest: an LCS need not be unique).

x: A B C B D A B
y: B D C A B A

BCBA = LCS(x, y)
(functional notation, although LCS is not a function, since x and y may have more than one LCS)
Brute-force LCS algorithm
Check every subsequence of x[1 . . m] to see
if it is also a subsequence of y[1 . . n].

Analysis
• Checking = O(n) time per subsequence.
• 2^m subsequences of x (each bit-vector of
  length m determines a distinct subsequence of x).
Worst-case running time = O(n · 2^m)
                        = exponential time.
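For concreteness, here is a minimal Python sketch of this brute-force method (the function names are ours, not from the lecture); it enumerates subsequences of x by bit-vector, exactly as in the analysis above.

# Brute-force LCS: try every subsequence of x (one per bit-vector of length m)
# and keep the longest that is also a subsequence of y.  Runs in O(n * 2^m) time.
def is_subsequence(s, t):
    # True if s is a subsequence of t; scans t once, so O(len(t)) time.
    it = iter(t)
    return all(ch in it for ch in s)

def lcs_brute_force(x, y):
    m = len(x)
    best = ""
    for mask in range(1 << m):                       # each bit-vector of length m
        sub = "".join(x[i] for i in range(m) if mask >> i & 1)
        if len(sub) > len(best) and is_subsequence(sub, y):
            best = sub
    return best

# Example: lcs_brute_force("ABCBDAB", "BDCABA") returns one LCS of length 4.

Even for modest m (say m = 30), the 2^m enumeration is already impractical, which motivates the better algorithm that follows.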
Towards a better algorithm
Simplification:
1. Look at the length of a longest common
   subsequence.
2. Extend the algorithm to find the LCS itself.

Notation: Denote the length of a sequence s by | s |.

Strategy: Consider prefixes of x and y.
• Define c[i, j] = | LCS(x[1 . . i], y[1 . . j]) |.
• Then c[m, n] = | LCS(x, y) |.
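For the running example x = A B C B D A B and y = B D C A B A (so m = 7 and n = 6), the definition gives, for instance, c[2, 3] = | LCS(AB, BDC) | = 1 (achieved by the single symbol B), and the answer we ultimately want is c[7, 6] = | BCBA | = 4.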
Recursive formulation
Theorem. For i, j > 0 (with c[i, 0] = c[0, j] = 0),
  c[i, j] = c[i–1, j–1] + 1                  if x[i] = y[j],
  c[i, j] = max{c[i–1, j], c[i, j–1]}        otherwise.

Proof. Case x[i] = y[ j]:

[Figure: the arrays x[1 . . i] (within x[1 . . m]) and y[1 . . j] (within y[1 . . n]), with the last symbols x[i] and y[j] marked equal.]

Let z[1 . . k] = LCS(x[1 . . i], y[1 . . j]), where c[i, j] = k.
Then z[k] = x[i], or else z could be extended.
Thus, z[1 . . k–1] is a common subsequence (CS) of x[1 . . i–1] and y[1 . . j–1].
Proof (continued)
Claim: z[1 . . k–1] = LCS(x[1 . . i–1], y[1 . . j–1]).
Suppose w is a longer CS of x[1 . . i–1] and y[1 . . j–1],
that is, | w | > k–1. Then, cut and paste: w || z[k]
(w concatenated with z[k]) is a common subsequence of
x[1 . . i] and y[1 . . j] with | w || z[k] | > k.
Contradiction, proving the claim.

Thus, c[i–1, j–1] = k–1, which implies that
c[i, j] = c[i–1, j–1] + 1.

The other case is similar: when x[i] ≠ y[j], an LCS of
x[1 . . i] and y[1 . . j] cannot end with both x[i] and y[j],
so it is a common subsequence of x[1 . . i–1] and y[1 . . j]
or of x[1 . . i] and y[1 . . j–1], giving the max in the theorem.
Dynamic-programming hallmark #1

Optimal substructure
An optimal solution to a problem (instance)
contains optimal solutions to subproblems.

If z = LCS(x, y), then any prefix of z is
an LCS of a prefix of x and a prefix of y.
Recursive algorithm for LCS
LCS(x, y, i, j)
  if x[i] = y[ j]
      then c[i, j] ← LCS(x, y, i–1, j–1) + 1
      else c[i, j] ← max{ LCS(x, y, i–1, j),
                          LCS(x, y, i, j–1)}

Worst case: x[i] ≠ y[ j], in which case the
algorithm evaluates two subproblems, each
with only one parameter decremented.
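As a runnable counterpart to this pseudocode, here is a small Python sketch (our naming; the base case for empty prefixes, which the pseudocode leaves implicit, is written out).

# Plain recursive LCS length, mirroring the lecture's pseudocode.
# Without memoization this can take exponential time (see the recursion tree below).
def lcs_length(x, y, i, j):
    if i == 0 or j == 0:                  # empty prefix: LCS has length 0
        return 0
    if x[i - 1] == y[j - 1]:              # x and y are 0-indexed Python strings
        return lcs_length(x, y, i - 1, j - 1) + 1
    return max(lcs_length(x, y, i - 1, j),
               lcs_length(x, y, i, j - 1))

# lcs_length("ABCBDAB", "BDCABA", 7, 6) == 4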
Recursion tree
m = 3, n = 4:

  3,4
  ├── 2,4
  │   ├── 1,4
  │   └── 2,3        ← same subproblem
  │       ├── 1,3
  │       └── 2,2
  └── 3,3
      ├── 2,3        ← same subproblem
      │   ├── 1,3
      │   └── 2,2
      └── 3,2

Height = m + n ⇒ work potentially exponential,
but we're solving subproblems already solved!
Dynamic-programming hallmark #2

Overlapping subproblems
A recursive solution contains a
“small” number of distinct
subproblems repeated many times.

The number of distinct LCS subproblems for
two strings of lengths m and n is only mn.
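Concretely, the recursion tree above (m = 3, n = 4) involves only mn = 12 distinct subproblems, and the running example (m = 7, n = 6) only 42, even though the unmemoized recursion tree can contain on the order of 2^(m+n) nodes.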
Memoization algorithm
Memoization: After computing a solution to a
subproblem, store it in a table. Subsequent calls
check the table to avoid redoing work.

LCS(x, y, i, j)
  if c[i, j] = NIL
      then if x[i] = y[j]
           then c[i, j] ← LCS(x, y, i–1, j–1) + 1
           else c[i, j] ← max{ LCS(x, y, i–1, j),
                               LCS(x, y, i, j–1)}
(The recursive body is the same as before; only the table check is added.)

Time = Θ(mn) = constant work per table entry.
Space = Θ(mn).
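A Python sketch of this memoized recursion (our naming; the lecture's NIL becomes the sentinel value -1) might look like the following.

# Memoized LCS length: same recursion as before, but each (i, j) entry is
# computed at most once, giving Theta(mn) time and Theta(mn) space.
def lcs_length_memo(x, y):
    m, n = len(x), len(y)
    NIL = -1
    c = [[NIL] * (n + 1) for _ in range(m + 1)]

    def lcs(i, j):
        if i == 0 or j == 0:
            return 0
        if c[i][j] == NIL:                       # not yet computed
            if x[i - 1] == y[j - 1]:
                c[i][j] = lcs(i - 1, j - 1) + 1
            else:
                c[i][j] = max(lcs(i - 1, j), lcs(i, j - 1))
        return c[i][j]

    return lcs(m, n)

# lcs_length_memo("ABCBDAB", "BDCABA") == 4

Each of the Θ(mn) table entries is filled in at most once with constant extra work, which is where the Θ(mn) time bound comes from.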
Dynamic-programming algorithm
IDEA: Compute the table bottom-up.

           A  B  C  B  D  A  B
        0  0  0  0  0  0  0  0
     B  0  0  1  1  1  1  1  1
     D  0  0  1  1  1  2  2  2
     C  0  0  1  2  2  2  2  2
     A  0  1  1  2  2  2  3  3
     B  0  1  2  2  3  3  3  4
     A  0  1  2  2  3  3  4  4

Time = Θ(mn).
Reconstruct an LCS by tracing backwards through the table.
Space = Θ(mn).
Exercise: reduce the space to O(min{m, n}).
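A bottom-up Python sketch along these lines (our naming), including the backward trace that recovers one LCS, might be:

# Bottom-up LCS: fill the (m+1) x (n+1) table of prefix LCS lengths, then
# trace backwards from c[m][n] to recover one LCS.  Theta(mn) time and space.
def lcs_dp(x, y):
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])

    # Trace backwards: at each cell, follow the choice that produced its value.
    lcs = []
    i, j = m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            lcs.append(x[i - 1])
            i, j = i - 1, j - 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(lcs))

# lcs_dp("ABCBDAB", "BDCABA") returns a length-4 LCS, e.g. "BCBA".

For the space exercise, note that each row of the table depends only on the row above it, so the length alone can be computed with two rows of length min{m, n} + 1 (swapping x and y if necessary).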
