1




 Linear Programming and its Usage in
Approximation Algorithms for NP Hard
        Optimization Problems
             M. Reza Rahimi,
             November 2005.
2




               Outline

          • Introduction
          • Linear Programming Overview
          • NP Complexity Class
          • Approximation Algorithms
          • Case Study1: Minimum Weight Vertex Cover
          • Case Study2: MAXSAT Problem
          • Conclusion


Linear Programming and its Usage in Approximation Algorithms for NP Hard Optimization Problems
3




                  Introduction
   • One of the most challenging problems in complexity
     theory is the P vs. NP problem.
   • Research on this open problem has led researchers to
     think of new methods and different approaches.
   • For example PCP, IP, approximation algorithms, ...
   • For some problems it has been proved that no
     suitable approximation exists unless P = NP.




4




   • So it seems that research on approximation
     algorithms may lead us to good results about P vs.
     NP.
   • The usual general method for approximating NP-hard
     optimization problems is the greedy method.
   • But we must search for a more suitable framework for
     studying NP problems.




5




       • Linear programming was this breakthrough
         method.
       • In this talk I focus on LP and its usage in NP-hard
         optimization problems.




6




                Linear Programming Overview
            • The general formulation of linear programming is
              as follows:

                  Maximize   C1X1 + C2X2 + … + CdXd
                  Subject to A1,1X1 + … + A1,dXd ≤ b1
                             A2,1X1 + … + A2,dXd ≤ b2
                                   …
                             An,1X1 + … + An,dXd ≤ bn

                  All variables and coefficients are real numbers.
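As a sketch of what an LP solver does, the following pure-Python function solves a tiny two-variable LP of the form above by enumerating vertices of the feasible polygon (an optimal solution of a bounded feasible LP lies at a vertex). The function name and the example instance are illustrative, not from the slides; real solvers use the simplex, ellipsoid, or interior-point methods.

```python
from itertools import combinations

def solve_lp_2d(c, A, b):
    """Maximize c.x subject to A x <= b for a 2-variable LP by
    enumerating candidate vertices: intersections of pairs of
    constraint boundary lines that satisfy all constraints."""
    best_x, best_val = None, float("-inf")
    for (a1, b1), (a2, b2) in combinations(zip(A, b), 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:          # parallel boundaries: no vertex
            continue
        x = ((b1 * a2[1] - b2 * a1[1]) / det,     # Cramer's rule
             (a1[0] * b2 - a2[0] * b1) / det)
        if all(row[0] * x[0] + row[1] * x[1] <= bi + 1e-9
               for row, bi in zip(A, b)):          # feasibility check
            val = c[0] * x[0] + c[1] * x[1]
            if val > best_val:
                best_x, best_val = x, val
    return best_x, best_val

# maximize 3x1 + 2x2  s.t.  x1 + x2 <= 4, x1 <= 3, x2 <= 3, x1, x2 >= 0
x, v = solve_lp_2d([3, 2],
                   [[1, 1], [1, 0], [0, 1], [-1, 0], [0, -1]],
                   [4, 3, 3, 0, 0])   # optimum at (3, 1) with value 11
```

This brute-force vertex enumeration is exponential in the number of constraints in general; it only illustrates the geometry of LP.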




7




      • There are several polynomial-time algorithms for this
        problem, such as the ellipsoid and interior-point
        methods.
      • I will not cover them here.
      • For these algorithms please refer to optimization
        references.




8




                  NP Complexity Class

    • The NP complexity class is defined only for decision
      problems.
    • For example:

             SAT = {< Φ >: Φ is a satisfiable Boolean formula}.

    • So for any optimization problem we consider its
      related decision problem.
    • This will give us intuition about its hardness.



9




       • We know the following definition of NP:

                        L ∈ NP ⇔ ∃V(·,·) ∈ P, ∃ polynomial P(·), ∀x ∈ Σ∗ :
                                       1. x ∈ L ⇒ ∃y, |y| ≤ P(|x|) and V(x, y) accepts.
                                       2. x ∉ L ⇒ ∀y with |y| ≤ P(|x|), V(x, y) rejects.


       • We can picture this process as a pipeline: given input x,
         a certificate y with |y| ≤ P(|x|) is produced, and the
         verifier V(x, y) checks it.
10




      • Example:

                SAT = {< Φ >: Φ is a satisfiable Boolean formula}.
                SAT ∈ NP because:
                1) If Φ0 ∈ SAT, we are given a satisfying assignment
                   and can check it in polynomial time.
                2) If Φ0 ∉ SAT, then there is no satisfying assignment,
                   so every certificate is rejected.

      • There is another important concept in
        complexity: Reduction.
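The polynomial-time verifier V(x, y) for SAT can be sketched in a few lines; the clause encoding (DIMACS-style signed integers) and the function name are illustrative choices, not from the slides:

```python
def verify_sat(formula, assignment):
    """Polynomial-time verifier V(x, y) for SAT: x is a CNF formula
    given as a list of clauses, each clause a list of signed literals
    (-2 means "not x2"); y is a candidate assignment mapping variable
    index -> bool.  Runs in time linear in the formula size."""
    return all(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in formula
    )

# (x1 or not x2) and (x2 or x3)
phi = [[1, -2], [2, 3]]
ok = verify_sat(phi, {1: True, 2: False, 3: True})      # satisfying certificate
bad = verify_sat(phi, {1: False, 2: True, 3: False})    # violates first clause
```

Finding the certificate is the hard part; checking it, as above, is easy.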



11




                Definition :
                L ≤P L* ⇔ ∃f (a polynomial-time computable function) such that ∀x : x ∈ L ⇔ f(x) ∈ L*.

              Fig1: Graphical Diagram of Reduction Concept

12




          Example :
          SAT = {< Φ >| Φ is a satisfiable Boolean formula}.
          3-CNF = {< Φ >| Φ is in 3-CNF and satisfiable}.
          We show that :
                            SAT ≤P 3-CNF.
          (A 3-CNF formula has, for example, this form :
             (x1 ∨ ¬x2 ∨ x3) ∧ (¬x1 ∨ x2 ∨ ¬x3) ∧ (x1 ∨ ¬x2 ∨ ¬x3) )


13




              Φ ≡ (¬x1 → x2 ) ↔ ( x3 ∧ x4 )

              The parse tree of Φ has ↔ at the root; its two subtrees,
              (¬x1 → x2 ) and ( x3 ∧ x4 ), are labeled with new
              variables y1 and y2.

              Φ ≡ ( y1 ↔ (¬x1 → x2 )) ∧ (( x3 ∧ x4 ) ↔ y2 ) ∧ ( y1 ↔ y2 )
              Now each parenthesized part can be converted into 3-OR form
              using its truth table.

14




   Example :
   INTEGER-PROGRAMMING = {< Am×n , bm×1 >| Am×n and bm×1 are integer matrices and
   ∃Xn×1 ∈ Ζn with Am×nXn×1 ≤ bm×1}.
   Obviously INTEGER-PROGRAMMING ∈ NP.
   We will show that :
   3-CNF ≤P INTEGER-PROGRAMMING.
   Proof :
                             The clause ¬X1 ∨ X2 ∨ X3 translates to (1 − X1) + X2 + X3 :
                             ¬X1 ∨ X2 ∨ X3 ≡ 1 ⇔ (1 − X1) + X2 + X3 ≥ 1,
                                            Xi ∈ Ζ, 0 ≤ Xi ≤ 1.
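The clause-to-constraint translation above can be sketched mechanically; the function name and the dict-based constraint encoding are illustrative choices:

```python
def clause_to_constraint(clause):
    """Map a clause (signed literals, e.g. [-1, 2, 3] for ¬x1 ∨ x2 ∨ x3)
    to a linear constraint  sum(coeff_i * X_i) >= rhs  over 0/1 variables.
    A positive literal contributes X_i, a negated one contributes (1 - X_i),
    so the clause is satisfied iff the literal sum is at least 1."""
    coeffs, constant = {}, 0
    for lit in clause:
        i = abs(lit)
        if lit > 0:
            coeffs[i] = coeffs.get(i, 0) + 1
        else:
            coeffs[i] = coeffs.get(i, 0) - 1   # (1 - X_i) contributes -X_i ...
            constant += 1                      # ... plus a constant 1
    return coeffs, 1 - constant                # sum(coeffs * X) >= 1 - constant

# ¬X1 ∨ X2 ∨ X3  →  (1 - X1) + X2 + X3 >= 1  →  -X1 + X2 + X3 >= 0
coeffs, rhs = clause_to_constraint([-1, 2, 3])
```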




15




   • We conclude this section with the following theorem
     (Cook–Levin):
                         Every language in NP is polynomial-time
                             reducible to the SAT language.
   • Languages like SAT, INTEGER-PROGRAMMING,
     Hamiltonian Cycle, and 3-CNF are equivalent in this
     sense; they belong to the NP-Complete set.




16




                  Approximation Algorithms

      • The following shows the connection between an
        NP-complete decision problem and its related
        optimization problem.

      Decision Version :
      TSP = {< Gn×n , b >| Gn×n is a complete non-negative weighted graph and there
      exists a Hamiltonian cycle whose cost is less than or
      equal to b}.
      Optimization Version :
      OPT-TSP :
      Given a complete non-negative weighted graph Gn×n , find the Hamiltonian cycle
      whose cost is minimum.


17




• Bound of Approximation:
   • Suppose that problem P has an optimum solution with cost C-op.
     Algorithm X solves problem P with bound ρ(n) if it finds a
     feasible solution with cost C such that:
                      max {C/C-op, C-op/C} ≤ ρ(n).
   • Note that with this definition we always have ρ(n) ≥ 1.
   • In the following I try to show that finding a good
     approximation for an optimization problem is sometimes very hard.




18




 • Theorem:
      There is no polynomial-time approximation algorithm
      with polynomial bound ρ(n) for OPT-TSP unless
      P=NP.
 • Proof:
     We prove that if such an algorithm exists then we can solve
     Hamiltonian Cycle in polynomial time.
        For each graph G with n vertices, convert it into a weighted
        complete graph G1 by the following procedure:
        1)     Assign weight 1 to each edge of G.
        2)     Assign weight nρ(n) to the other edges.
        Then G has a Hamiltonian cycle iff OPT-TSP(G1) = n.
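The graph transformation in the proof is easy to write down; the function name and the weight-matrix representation are illustrative choices:

```python
def tsp_gadget(n, edges, rho):
    """Build the complete weighted graph G1 used in the hardness proof:
    edges of G get weight 1, non-edges get weight n*rho (rho is the
    claimed approximation bound).  G has a Hamiltonian cycle iff the
    optimal tour in G1 costs exactly n; otherwise every tour must use
    a non-edge and costs more than n*rho."""
    E = {frozenset(e) for e in edges}
    heavy = n * rho
    return [[0 if i == j else (1 if frozenset((i, j)) in E else heavy)
             for j in range(n)] for i in range(n)]

# triangle on 3 vertices: every pair is an edge, so the optimal tour costs 3
W = tsp_gadget(3, [(0, 1), (1, 2), (0, 2)], rho=2)
# drop two edges: pairs (1,2) and (0,2) become heavy non-edges of weight 6
W2 = tsp_gadget(3, [(0, 1)], rho=2)
```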


19




  • It is now time to study the Randomized Rounding
    technique for solving optimization problems.
  • I explore it with two examples.




20




                Case Study1: Minimum Weight Vertex
                Cover
• Now we study the randomized rounding method in
  approximation algorithms with an example.
• Definition:
   • Vertex Cover: In an undirected graph G=(V,E), a set A of
     vertices is a vertex cover if
     for every (u,v) ∈ E, u ∈ A or v ∈ A.
   • Minimum Weight Vertex Cover: In an undirected graph
     G=(V,E) in which each vertex has a positive weight w(v), a set A
     of vertices is a minimum weight vertex cover if for
     every (u,v) ∈ E, u ∈ A or v ∈ A, and w(A) is minimum.



21




                                     Fig2: Vertex Cover

22




           • It is proved that the decision version of this
             optimization problem is NP-complete.
           • As the first step we model it as an integer
             program.

                        x : V → {0,1}
                        if v ∈ Minimum Weight Vertex Cover Set ⇒ x(v) = 1,
                        else x(v) = 0.
                        Integer Programming Model :
                        min Σv∈V w(v)x(v)
                        ∀(u,v) ∈ E : x(v) + x(u) ≥ 1
                                     x(v) ∈ {0,1}


23




       • As noted, it is believed that there is no
         polynomial-time algorithm for integer programming.
       • We must think of another method.
       • We investigate the related linear program.

                                 x : V → [0,1]
                                 Linear Programming Relaxation :
                                 min Σv∈V w(v)x(v)
                                 ∀(u,v) ∈ E : x(v) + x(u) ≥ 1
                                              0 ≤ x(v) ≤ 1


24




       • This last step is called relaxation to linear
         programming.
       • It is now time to state the approximation
         algorithm.
                          Approximation Min Weight Vertex Cover(G, w)
                          1) C ← ∅
                          2) Compute x, an optimal solution to the linear program.
                          3) for each v ∈ V
                          4)    do if x(v) ≥ 0.5        /* Rounding Method */
                          5)       then C ← C ∪ {v}
                          6) return C
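A small sketch of this algorithm (the function name is illustrative). Instead of a general LP solver, this toy version exploits the known fact that the vertex-cover LP always has a half-integral optimum, so for tiny instances it can enumerate x(v) ∈ {0, 0.5, 1}; a real implementation would replace step 2 with a call to an LP solver.

```python
from itertools import product

def approx_vertex_cover(vertices, edges, w):
    """2-approximation via LP relaxation + threshold rounding.
    Step 2 of the slide's algorithm is simulated by enumerating
    half-integral solutions, which is valid because the vertex-cover
    LP always has a half-integral optimal solution (but is only
    practical for tiny graphs)."""
    best_x, best_val = None, float("inf")
    for xs in product((0.0, 0.5, 1.0), repeat=len(vertices)):
        x = dict(zip(vertices, xs))
        if all(x[u] + x[v] >= 1 for u, v in edges):      # LP feasibility
            val = sum(w[v] * x[v] for v in vertices)
            if val < best_val:
                best_x, best_val = x, val
    return {v for v in vertices if best_x[v] >= 0.5}     # rounding step

# triangle with unit weights: LP optimum is x = 1/2 everywhere (value 1.5),
# so rounding returns all three vertices (weight 3 <= 2 * optimum 2)
C = approx_vertex_cover([0, 1, 2], [(0, 1), (1, 2), (0, 2)],
                        {0: 1, 1: 1, 2: 1})
```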



25




             Proof :
             C : the solution of the algorithm.
             C* : the optimal set of the integer program.
             Z* : the optimal value of the linear program.
             Obviously we have Z* ≤ w(C*).
             Obviously C is a vertex cover.

             Z* = Σv∈V w(v)x(v) ≥ Σv∈V, x(v)≥0.5 w(v)x(v)
                                ≥ Σv∈V, x(v)≥0.5 0.5·w(v) = Σv∈C 0.5·w(v) = 0.5·w(C).

             0.5·w(C) ≤ Z* ≤ w(C*) ⇒ ρ = 2.

26




  • Håstad proved that there is no polynomial-time algorithm for
    vertex cover that achieves an approximation ratio better than 1.16
    unless P=NP (1997).
  • Now we consider another example.




27




                  Case Study2: MAXSAT Problem

                  MAXSAT :
                  Given a K-CNF formula in which each clause Cj has weight wj ≥ 0,
                  find an assignment to the variables that maximizes:
                    ΣCj is true wj

                  Example :
                  Φ = (x1 + x2 + ¬x3 + x4)(¬x1 + x2 + x3 + ¬x4)(x1 + ¬x2 + x3 + x4).
                  C1 = (x1 + x2 + ¬x3 + x4) → w1
                  C2 = (¬x1 + x2 + x3 + ¬x4) → w2
                  C3 = (x1 + ¬x2 + x3 + x4) → w3
                                                     Max ΣCj is true wj




28




      • We state the following randomized algorithm for this
        problem (Johnson, 1974):

            1) For variables x1, …, xn randomly assign
               0 or 1, each with probability 0.5.
            2) return ΣCj is true wj

            Claim :
            The preceding algorithm has approximation bound (1 − 1/2^k)^(-1).



29




             Proof :
             Define the following random variables:
             Ij = 1 if clause Cj is satisfied, 0 otherwise.

                                  ΣCj is true wj = Σj wj Ij

                                  P(Ij = 1) = 1 − 1/2^k
                                  P(Ij = 0) = 1/2^k

             E{ΣCj is true wj} = E{Σj wj Ij} = Σj wj E{Ij} = (1 − 1/2^k) Σj wj.

             ρ = (1 − 1/2^k)^(-1)
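Johnson's algorithm and the expectation computed above can be sketched as follows; the clause encoding (signed literals) and function names are illustrative choices:

```python
import random

def johnson_maxsat(clauses, weights, seed=0):
    """Johnson's randomized algorithm: assign every variable True or
    False with probability 1/2, then return the total weight of the
    satisfied clauses.  Clauses are lists of signed literals
    (-2 means "not x2")."""
    rng = random.Random(seed)
    variables = {abs(lit) for c in clauses for lit in c}
    a = {v: rng.random() < 0.5 for v in variables}
    return sum(w for c, w in zip(clauses, weights)
               if any((lit > 0) == a[abs(lit)] for lit in c))

def expected_weight(clauses, weights):
    """E{Σ wj Ij} = Σj wj (1 − 1/2^kj) for a clause with kj literals
    (assuming no clause repeats a variable)."""
    return sum(w * (1 - 2.0 ** -len(c)) for c, w in zip(clauses, weights))

# three clauses of length 2, each satisfied with probability 3/4
ew = expected_weight([[1, 2], [-1, 3], [2, -3]], [1, 1, 1])   # 3 * 3/4 = 2.25
```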



30




         • We can extend Johnson's algorithm as follows:

                     1) For variables x1, …, xn randomly and independently assign
                        1 or 0 with probabilities pi and 1 − pi.
                     2) return ΣCj is true wj

                     Claim :
                     E{ΣCj is true wj} = Σj wj (1 − Πxi∈Cj (1 − pi) · Π¬xi∈Cj pi)

                     The proof is just the same.
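The claimed expectation (a clause fails only when every positive literal is 0 and every negated literal is 1) can be checked against brute-force enumeration over all assignments; function names and the test instance are illustrative:

```python
from itertools import product

def biased_expectation(clauses, weights, p):
    """E{Σ wj} = Σj wj (1 − Π_{xi in Cj}(1 - p_i) · Π_{¬xi in Cj} p_i):
    clause Cj fails only if every positive literal is False and every
    negated literal's variable is True."""
    total = 0.0
    for c, w in zip(clauses, weights):
        fail = 1.0
        for lit in c:
            fail *= (1 - p[abs(lit)]) if lit > 0 else p[abs(lit)]
        total += w * (1 - fail)
    return total

def brute_expectation(clauses, weights, p):
    """Cross-check: enumerate all assignments and sum prob * weight."""
    vs = sorted({abs(lit) for c in clauses for lit in c})
    total = 0.0
    for bits in product([False, True], repeat=len(vs)):
        a = dict(zip(vs, bits))
        prob = 1.0
        for v in vs:
            prob *= p[v] if a[v] else 1 - p[v]
        total += prob * sum(w for c, w in zip(clauses, weights)
                            if any((lit > 0) == a[abs(lit)] for lit in c))
    return total

clauses, weights = [[1, -2], [2, 3]], [2.0, 1.0]
p = {1: 0.7, 2: 0.4, 3: 0.5}   # biased coin per variable
```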




31




          • Now we model it with IP and relax it.

   Integer Programming Model for MAXSAT :
   Max Σj wj zj
   ∀Cj : Σxi∈Cj yi + Σ¬xi∈Cj (1 − yi) ≥ zj
         0 ≤ yi ≤ 1,  0 ≤ zj ≤ 1,  yi , zj ∈ Ζ

   Relaxation for MAXSAT :
   Max Σj wj zj
   ∀Cj : Σxi∈Cj yi + Σ¬xi∈Cj (1 − yi) ≥ zj
         0 ≤ yi ≤ 1,  0 ≤ zj ≤ 1,  yi , zj ∈ ℜ

32




         Algorithm
        1) Solve the LP and find vector (y*, z*).
        2) Use the extended Johnson algorithm with pi = yi*.
        3) return ΣCj is true wj
         Lemma :
         For any feasible solution (y*, z*) to the LP and for any clause Cj with k literals,
         we have
                                  1 − Πxi∈Cj (1 − yi*) · Π¬xi∈Cj yi* ≥ (1 − (1 − 1/k)^k) zj*
         Claim :
         The above algorithm has approximation ratio (1 − 1/e)^(-1).



33




        Proof :
        Σj wj {1 − Πxi∈Cj (1 − pi) · Π¬xi∈Cj pi} =
        Σj wj {1 − Πxi∈Cj (1 − yi*) · Π¬xi∈Cj yi*} ≥ (1 − (1 − 1/k)^k) Σj wj zj* =
        (1 − (1 − 1/k)^k) Z*LP ≥ (1 − (1 − 1/k)^k) Z*IP

        Since (1 − (1 − 1/k)^k) ≥ 1 − 1/e for every k, we get
        [ρ = (1 − 1/e)^(-1)]

34




35




                    •     Algorithm
                    1. Compute the value from Johnson's
                       algorithm (w1).
                    2. Compute the value from the LP-based
                       algorithm (w2).
                    3. Return MAX(w1, w2).
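The best-of-two idea can be sketched by comparing the two expected weights directly (which derandomizes the choice between the schemes). The fractional solution y_star is assumed to come from an LP solver and is not computed here; the function name is illustrative. Combining the two bounds yields a 3/4-approximation (Goemans and Williamson).

```python
def best_of_two(clauses, weights, y_star):
    """Return the better of the two expected satisfied weights:
    Johnson's uniform assignment (p_i = 1/2) vs. LP-based rounding
    (p_i = y_i*).  Clauses are lists of signed literals; y_star maps
    variable index -> fractional LP value in [0, 1]."""
    def expectation(p):
        total = 0.0
        for c, w in zip(clauses, weights):
            fail = 1.0                      # probability the clause fails
            for lit in c:
                fail *= (1 - p[abs(lit)]) if lit > 0 else p[abs(lit)]
            total += w * (1 - fail)
        return total

    uniform = {v: 0.5 for v in {abs(lit) for c in clauses for lit in c}}
    return max(expectation(uniform), expectation(y_star))

# single clause (x1 or x2): uniform gives 3/4, LP rounding with y1* = 1 gives 1
best = best_of_two([[1, 2]], [1.0], {1: 1.0, 2: 0.0})
```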




36




                  Conclusion
     • In this talk the randomized rounding method was explored
       with two examples.
     • This method opens new insight into approximation
       algorithms and complexity theory.


                                            The End




Weitere ähnliche Inhalte

Was ist angesagt?

Algorithm Design and Complexity - Course 6
Algorithm Design and Complexity - Course 6Algorithm Design and Complexity - Course 6
Algorithm Design and Complexity - Course 6Traian Rebedea
 
20110319 parameterized algorithms_fomin_lecture01-02
20110319 parameterized algorithms_fomin_lecture01-0220110319 parameterized algorithms_fomin_lecture01-02
20110319 parameterized algorithms_fomin_lecture01-02Computer Science Club
 
Elementary Landscape Decomposition of Combinatorial Optimization Problems
Elementary Landscape Decomposition of Combinatorial Optimization ProblemsElementary Landscape Decomposition of Combinatorial Optimization Problems
Elementary Landscape Decomposition of Combinatorial Optimization Problemsjfrchicanog
 
PAWL - GPU meeting @ Warwick
PAWL - GPU meeting @ WarwickPAWL - GPU meeting @ Warwick
PAWL - GPU meeting @ WarwickPierre Jacob
 
Spectral Learning Methods for Finite State Machines with Applications to Na...
  Spectral Learning Methods for Finite State Machines with Applications to Na...  Spectral Learning Methods for Finite State Machines with Applications to Na...
Spectral Learning Methods for Finite State Machines with Applications to Na...LARCA UPC
 
Kernelization algorithms for graph and other structure modification problems
Kernelization algorithms for graph and other structure modification problemsKernelization algorithms for graph and other structure modification problems
Kernelization algorithms for graph and other structure modification problemsAnthony Perez
 
Introduction to NP Completeness
Introduction to NP CompletenessIntroduction to NP Completeness
Introduction to NP CompletenessGene Moo Lee
 
Algorithms and Complexity: Cryptography Theory
Algorithms and Complexity: Cryptography TheoryAlgorithms and Complexity: Cryptography Theory
Algorithms and Complexity: Cryptography TheoryAlex Prut
 
Parameter Estimation for Semiparametric Models with CMARS and Its Applications
Parameter Estimation for Semiparametric Models with CMARS and Its ApplicationsParameter Estimation for Semiparametric Models with CMARS and Its Applications
Parameter Estimation for Semiparametric Models with CMARS and Its ApplicationsSSA KPI
 
Asymptotic Notations
Asymptotic NotationsAsymptotic Notations
Asymptotic NotationsNagendraK18
 
Computability - Tractable, Intractable and Non-computable Function
Computability - Tractable, Intractable and Non-computable FunctionComputability - Tractable, Intractable and Non-computable Function
Computability - Tractable, Intractable and Non-computable FunctionReggie Niccolo Santos
 
Tensor Decomposition and its Applications
Tensor Decomposition and its ApplicationsTensor Decomposition and its Applications
Tensor Decomposition and its ApplicationsKeisuke OTAKI
 
Algorithm_NP-Completeness Proof
Algorithm_NP-Completeness ProofAlgorithm_NP-Completeness Proof
Algorithm_NP-Completeness ProofIm Rafid
 
Mark Girolami's Read Paper 2010
Mark Girolami's Read Paper 2010Mark Girolami's Read Paper 2010
Mark Girolami's Read Paper 2010Christian Robert
 

Was ist angesagt? (20)

Algorithm Design and Complexity - Course 6
Algorithm Design and Complexity - Course 6Algorithm Design and Complexity - Course 6
Algorithm Design and Complexity - Course 6
 
20110319 parameterized algorithms_fomin_lecture01-02
20110319 parameterized algorithms_fomin_lecture01-0220110319 parameterized algorithms_fomin_lecture01-02
20110319 parameterized algorithms_fomin_lecture01-02
 
Daa notes 3
Daa notes 3Daa notes 3
Daa notes 3
 
NP completeness
NP completenessNP completeness
NP completeness
 
Elementary Landscape Decomposition of Combinatorial Optimization Problems
Elementary Landscape Decomposition of Combinatorial Optimization ProblemsElementary Landscape Decomposition of Combinatorial Optimization Problems
Elementary Landscape Decomposition of Combinatorial Optimization Problems
 
Lecture 8
Lecture 8Lecture 8
Lecture 8
 
PAWL - GPU meeting @ Warwick
PAWL - GPU meeting @ WarwickPAWL - GPU meeting @ Warwick
PAWL - GPU meeting @ Warwick
 
Spectral Learning Methods for Finite State Machines with Applications to Na...
  Spectral Learning Methods for Finite State Machines with Applications to Na...  Spectral Learning Methods for Finite State Machines with Applications to Na...
Spectral Learning Methods for Finite State Machines with Applications to Na...
 
Kernelization algorithms for graph and other structure modification problems
Kernelization algorithms for graph and other structure modification problemsKernelization algorithms for graph and other structure modification problems
Kernelization algorithms for graph and other structure modification problems
 
Introduction to NP Completeness
Introduction to NP CompletenessIntroduction to NP Completeness
Introduction to NP Completeness
 
Algorithms and Complexity: Cryptography Theory
Algorithms and Complexity: Cryptography TheoryAlgorithms and Complexity: Cryptography Theory
Algorithms and Complexity: Cryptography Theory
 
Parameter Estimation for Semiparametric Models with CMARS and Its Applications
Parameter Estimation for Semiparametric Models with CMARS and Its ApplicationsParameter Estimation for Semiparametric Models with CMARS and Its Applications
Parameter Estimation for Semiparametric Models with CMARS and Its Applications
 
Asymptotic Notations
Asymptotic NotationsAsymptotic Notations
Asymptotic Notations
 
Ca notes
Ca notesCa notes
Ca notes
 
Computability - Tractable, Intractable and Non-computable Function
Computability - Tractable, Intractable and Non-computable FunctionComputability - Tractable, Intractable and Non-computable Function
Computability - Tractable, Intractable and Non-computable Function
 
P versus NP
P versus NPP versus NP
P versus NP
 
Tensor Decomposition and its Applications
Tensor Decomposition and its ApplicationsTensor Decomposition and its Applications
Tensor Decomposition and its Applications
 
Davezies
DaveziesDavezies
Davezies
 
Algorithm_NP-Completeness Proof
Algorithm_NP-Completeness ProofAlgorithm_NP-Completeness Proof
Algorithm_NP-Completeness Proof
 
Mark Girolami's Read Paper 2010
Mark Girolami's Read Paper 2010Mark Girolami's Read Paper 2010
Mark Girolami's Read Paper 2010
 

Ähnlich wie Linear Programming and its Usage in Approximation Algorithms for NP Hard Optimization Problems

lecture01_lecture01_lecture0001_ceva.pdf
lecture01_lecture01_lecture0001_ceva.pdflecture01_lecture01_lecture0001_ceva.pdf
lecture01_lecture01_lecture0001_ceva.pdfAnaNeacsu5
 
Global optimization
Global optimizationGlobal optimization
Global optimizationbpenalver
 
Linear Programming (graphical method)
Linear Programming (graphical method)Linear Programming (graphical method)
Linear Programming (graphical method)Kamel Attar
 
NIPS2007: learning using many examples
NIPS2007: learning using many examplesNIPS2007: learning using many examples
NIPS2007: learning using many exampleszukun
 
slides_low_rank_matrix_optim_farhad
slides_low_rank_matrix_optim_farhadslides_low_rank_matrix_optim_farhad
slides_low_rank_matrix_optim_farhadFarhad Gholami
 
DeepLearn2022 1. Goals & AlgorithmDesign.pdf
DeepLearn2022 1. Goals & AlgorithmDesign.pdfDeepLearn2022 1. Goals & AlgorithmDesign.pdf
DeepLearn2022 1. Goals & AlgorithmDesign.pdfSean Meyn
 
Cheatsheet recurrent-neural-networks
Cheatsheet recurrent-neural-networksCheatsheet recurrent-neural-networks
Cheatsheet recurrent-neural-networksSteve Nouri
 
Methods of Manifold Learning for Dimension Reduction of Large Data Sets
Methods of Manifold Learning for Dimension Reduction of Large Data SetsMethods of Manifold Learning for Dimension Reduction of Large Data Sets
Methods of Manifold Learning for Dimension Reduction of Large Data SetsRyan B Harvey, CSDP, CSM
 
Convex optmization in communications
Convex optmization in communicationsConvex optmization in communications
Convex optmization in communicationsDeepshika Reddy
 
Solving Optimization Problems using the Matlab Optimization.docx
Solving Optimization Problems using the Matlab Optimization.docxSolving Optimization Problems using the Matlab Optimization.docx
Solving Optimization Problems using the Matlab Optimization.docxwhitneyleman54422
 
Approximation algorithms
Approximation algorithmsApproximation algorithms
Approximation algorithmsGanesh Solanke
 
Dynamic programming class 16
Dynamic programming class 16Dynamic programming class 16
Dynamic programming class 16Kumar
 
Exact Matrix Completion via Convex Optimization Slide (PPT)
Exact Matrix Completion via Convex Optimization Slide (PPT)Exact Matrix Completion via Convex Optimization Slide (PPT)
Exact Matrix Completion via Convex Optimization Slide (PPT)Joonyoung Yi
 
Elementary Landscape Decomposition of the Hamiltonian Path Optimization Problem
Elementary Landscape Decomposition of the Hamiltonian Path Optimization ProblemElementary Landscape Decomposition of the Hamiltonian Path Optimization Problem
Elementary Landscape Decomposition of the Hamiltonian Path Optimization Problemjfrchicanog
 

Ähnlich wie Linear Programming and its Usage in Approximation Algorithms for NP Hard Optimization Problems (20)

                  A2,1X1 + … + A2,dXd ≤ b2
                  …
                  An,1X1 + … + An,dXd ≤ bn

            • All coefficients and variables are real numbers.
7

       • There are several polynomial-time algorithms for this problem,
         such as the Ellipsoid and Interior Point methods.
       • They are beyond the scope of this talk.
       • For these algorithms, please refer to the optimization
         literature.
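While those solvers are out of scope, the geometric fact they all build on — for a feasible, bounded LP an optimum is attained at a vertex of the feasible polytope — can be made concrete. The sketch below (function name and sample program are illustrative; this is a toy for 2-variable problems, not how the Ellipsoid or Interior Point methods work) enumerates intersections of constraint pairs and keeps the best feasible one:

```python
from itertools import combinations

def solve_2d_lp(c, A, b):
    """Brute-force 2-variable LP: maximize c.x subject to A x <= b.

    An optimum of a feasible, bounded LP lies at a vertex of the
    feasible polygon, so enumerating intersections of constraint
    pairs and keeping the best feasible one suffices.
    """
    eps = 1e-9
    best = None
    for (a1, b1), (a2, b2) in combinations(zip(A, b), 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < eps:
            continue  # parallel constraint lines: no unique intersection
        x = (b1 * a2[1] - b2 * a1[1]) / det
        y = (a1[0] * b2 - a2[0] * b1) / det
        if all(ai[0] * x + ai[1] * y <= bi + eps for ai, bi in zip(A, b)):
            val = c[0] * x + c[1] * y
            if best is None or val > best[0]:
                best = (val, (x, y))
    return best

# Maximize X1 + X2 subject to X1 <= 2, X2 <= 3, X1 + X2 <= 4, X1, X2 >= 0
# (the ">=" constraints are written as -Xi <= 0 to fit the A x <= b form).
c = (1.0, 1.0)
A = [(1, 0), (0, 1), (1, 1), (-1, 0), (0, -1)]
b = [2, 3, 4, 0, 0]
val, (x1, x2) = solve_2d_lp(c, A, b)
# the optimum value is 4, attained at a vertex such as (2, 2)
```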
8

               NP Complexity Class

          • The NP complexity class contains only Decision Problems.
          • For example: SAT = {< Φ > : Φ has a satisfying assignment}.
          • So for any optimization problem we consider its related
            decision problem.
          • This gives us intuition about its hardness.
9

   • We use the following definition of NP:

     L ∈ NP ⇔ ∃ V(·,·) ∈ P, ∃ polynomial P(·), ∀x ∈ Σ*:
       1. x ∈ L ⇒ ∃y, |y| ≤ P(|x|), such that V(x, y) accepts.
       2. x ∉ L ⇒ ∀y with |y| ≤ P(|x|), V(x, y) rejects.

   • We can picture this process as: given input x, first find a
     certificate y, then run the verifier V(x, y).
10

   • Example: SAT = {< Φ > : Φ has a satisfying assignment}.

     SAT ∈ NP because:
       1. If Φ0 ∈ SAT, we are given a true assignment as the
          certificate and can check it in polynomial time.
       2. If Φ0 ∉ SAT, then there is no true assignment, so no
          certificate makes the verifier accept.

   • Another important concept in complexity theory is Reduction.
11

     Definition: L ≤P L* ⇔ ∃f (a polynomial-time computable function)
     such that ∀x: x ∈ L ⇔ f(x) ∈ L*.

     Fig 1: Graphical diagram of the reduction concept.
12

     Example:
       SAT = {< Φ > | Φ is satisfiable}.
       3-CNF = {< Φ > | Φ is in 3-CNF form and satisfiable}.

     We show that SAT ≤P 3-CNF.

     (A 3-CNF formula has, for example, the form
      (x1 ∨ ¬x2 ∨ x3) ∧ (¬x1 ∨ x2 ∨ ¬x3) ∧ (x1 ∨ x2 ∨ x3).)
13

     Φ ≡ (¬x1 → x2) ↔ (x3 ∧ x4)

     Introduce a new variable for each internal node of the parse tree
     (y1 for ¬x1 → x2, y2 for x3 ∧ x4):

     Φ' ≡ (y1 ↔ (¬x1 → x2)) ∧ ((x3 ∧ x4) ↔ y2) ∧ (y1 ↔ y2)

     Now each conjunct can be converted into 3-CNF clauses using its
     truth table.
14

     Example:
       INTEGER-PROGRAMMING = {< A(m×n), b(m×1) > | A and b are integer
       matrices and ∃X ∈ Z^n such that A X ≤ b}.

     Obviously INTEGER-PROGRAMMING ∈ NP.
     We show that 3-CNF ≤P INTEGER-PROGRAMMING.

     Proof sketch: translate each clause into a linear constraint, e.g.

       ¬X1 ∨ X2 ∨ X3 is true ⇔ (1 − X1) + X2 + X3 ≥ 1,
       Xi ∈ Z, 0 ≤ Xi ≤ 1.
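The clause-to-inequality translation can be checked mechanically on small formulas. The sketch below (the encoding convention, helper names, and example formula are my own) builds one inequality per clause and verifies by brute force that satisfiability of the formula coincides with feasibility of the resulting 0/1 integer program:

```python
from itertools import product

# A CNF formula as a list of clauses; a literal is (var_index, is_positive).
# Hypothetical example: (x1 or ~x2 or x3) and (~x1 or x2 or ~x3).
clauses = [[(0, True), (1, False), (2, True)],
           [(0, False), (1, True), (2, False)]]

def clause_to_inequality(clause):
    """Encode a clause as (coeffs, rhs) of  sum(a_i * X_i) >= rhs, X in {0,1}^n.

    A positive literal x_i contributes X_i, a negated one contributes
    (1 - X_i); the clause is true iff the contributions sum to >= 1.
    """
    coeffs, rhs = {}, 1
    for var, positive in clause:
        if positive:
            coeffs[var] = coeffs.get(var, 0) + 1
        else:
            coeffs[var] = coeffs.get(var, 0) - 1
            rhs -= 1  # the constant from (1 - X_i) moves to the rhs
    return coeffs, rhs

def sat_by_truth_tables(clauses, n):
    """Satisfiability by enumerating all 2^n truth assignments."""
    return any(all(any(x[v] if pos else not x[v] for v, pos in c) for c in clauses)
               for x in product([False, True], repeat=n))

def ip_feasible(clauses, n):
    """Feasibility of the derived 0/1 integer program, also by brute force."""
    ineqs = [clause_to_inequality(c) for c in clauses]
    return any(all(sum(a * x[v] for v, a in coeffs.items()) >= rhs
                   for coeffs, rhs in ineqs)
               for x in product([0, 1], repeat=n))
```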
15

   • We conclude this section with the following theorem (Cook–Levin):

     Every language in NP is polynomial-time reducible to SAT.

   • Languages like SAT, INTEGER-PROGRAMMING, Hamiltonian Cycle, and
     3-CNF are equally hard in this sense; they belong to the set of
     NP-Complete problems.
16

               Approximation Algorithms

   • The following shows the connection between an NP-Complete decision
     problem and its related optimization problem.

     Decision version:
       TSP = {< G(n×n), b > | G is a complete non-negative weighted
       graph and there exists a Hamiltonian cycle whose cost is at
       most b}.

     Optimization version:
       OPT-TSP: given a complete non-negative weighted graph G(n×n),
       find a Hamiltonian cycle whose cost is minimum.
17

   • Bound of approximation:
     Suppose problem P has an optimum solution of cost C_opt.
     Algorithm X solves P with bound ρ(n) if it finds a feasible
     solution of cost C such that:

       max {C / C_opt, C_opt / C} ≤ ρ(n).

   • Note that with this definition we always have ρ(n) ≥ 1.
   • In the following I show that finding a good approximation for an
     optimization problem can itself be very hard.
18

   • Theorem: There is no polynomial-time approximation algorithm with
     polynomial bound ρ(n) for OPT-TSP unless P = NP.

   • Proof: We show that if such an algorithm existed, we could solve
     Hamiltonian Cycle in polynomial time. Convert each graph G with n
     vertices into a weighted complete graph G1 by the following
     procedure:
       1. Assign weight 1 to each edge of G.
       2. Assign weight nρ(n) to the other edges.
     Then G has a Hamiltonian cycle iff OPT-TSP(G1) = n, and the
     ρ(n)-approximation algorithm distinguishes the two cases.
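The gadget in this proof can be exercised on tiny graphs. In the sketch below (helper names are mine) a brute-force tour search stands in for the hypothetical approximation algorithm; it confirms that the optimal tour cost in G1 is exactly n precisely when G has a Hamiltonian cycle, and jumps past ρ(n)·n otherwise:

```python
from itertools import permutations

def build_g1(n, edges, rho):
    """The gadget from the proof: a complete graph on n vertices where
    edges of G get weight 1 and non-edges get the heavy weight n*rho."""
    present = {frozenset(e) for e in edges}
    w = {}
    for u in range(n):
        for v in range(u + 1, n):
            e = frozenset((u, v))
            w[e] = 1 if e in present else n * rho
    return w

def opt_tsp(n, w):
    """Exact optimal Hamiltonian-cycle cost by brute force (tiny n only)."""
    best = float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(w[frozenset((tour[i], tour[i + 1]))] for i in range(n))
        best = min(best, cost)
    return best

# The 4-cycle has a Hamiltonian cycle, so the optimal tour in G1 costs n = 4;
# the star K1,3 has none, so every tour in its G1 must use a heavy non-edge.
w_cycle = build_g1(4, [(0, 1), (1, 2), (2, 3), (3, 0)], rho=2)
w_star = build_g1(4, [(0, 1), (0, 2), (0, 3)], rho=2)
```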
19

   • It is now time to study the Randomized Rounding technique for
     solving optimization problems.
   • I explore it with two examples.
20

        Case Study 1: Minimum Weight Vertex Cover

   • We now study the rounding method in approximation algorithms with
     an example.
   • Definitions:
   • Vertex Cover: in an undirected graph G = (V, E), a set A of
     vertices is a vertex cover if for every (u, v) ∈ E, u ∈ A or
     v ∈ A.
   • Minimum Weight Vertex Cover: in an undirected graph G = (V, E)
     where each vertex has a positive weight w(v), a set A of vertices
     is a minimum weight vertex cover if it is a vertex cover and w(A)
     is minimum.
21

     Fig 2: Vertex Cover
22

   • It is proved that the decision version of this optimization
     problem is NP-Hard.
   • As a first step we model it as an integer program:

       x : V → {0, 1}
       x(v) = 1 if v is in the minimum weight vertex cover set,
       x(v) = 0 otherwise.

     Integer programming formulation:

       min  Σ_{v∈V} w(v) x(v)
       s.t. x(u) + x(v) ≥ 1   for every (u, v) ∈ E
            x(v) ∈ {0, 1}
23

   • As noted, it is believed that there is no polynomial-time
     algorithm for integer programming.
   • We must think of another method.
   • We investigate the related linear program:

       x : V → [0, 1]

       min  Σ_{v∈V} w(v) x(v)
       s.t. x(u) + x(v) ≥ 1   for every (u, v) ∈ E
            0 ≤ x(v) ≤ 1
24

   • This last step is called Relaxation to linear programming.
   • Now we can state the approximation algorithm:

     Approximation-Min-Weight-Vertex-Cover(G, w)
       1. C ← ∅
       2. Compute x, an optimal solution of the linear program.
       3. for each v ∈ V
       4.   if x(v) ≥ 0.5          /* rounding step */
       5.     C ← C ∪ {v}
       6. return C
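A minimal sketch of this algorithm follows. Since the talk does not fix an LP solver, the sketch exploits the known half-integrality of the vertex-cover LP (some optimal fractional solution uses only the values 0, 1/2, 1), so on a toy instance the LP can be solved by brute force over {0, 0.5, 1}^|V|; the rounding step at threshold 0.5 is exactly the one above. Names and the sample graph are illustrative:

```python
from itertools import product

def lp_vertex_cover_rounding(vertices, edges, w):
    """LP-rounding 2-approximation for minimum weight vertex cover.

    Toy LP step: the vertex-cover LP is half-integral, so we search
    {0, 0.5, 1}^n for an optimal fractional solution instead of calling
    a real LP solver. Rounding step: keep v exactly when x(v) >= 0.5.
    """
    best_x, best_val = None, float("inf")
    for xs in product([0.0, 0.5, 1.0], repeat=len(vertices)):
        x = dict(zip(vertices, xs))
        if all(x[u] + x[v] >= 1 for u, v in edges):  # LP feasibility
            val = sum(w[v] * x[v] for v in vertices)
            if val < best_val:
                best_x, best_val = x, val
    cover = {v for v in vertices if best_x[v] >= 0.5}
    return cover, best_val

# A weighted path a-b-c-d: the cheap middle vertices cover everything.
vertices = ["a", "b", "c", "d"]
edges = [("a", "b"), ("b", "c"), ("c", "d")]
w = {"a": 3, "b": 1, "c": 1, "d": 3}
cover, lp_opt = lp_vertex_cover_rounding(vertices, edges, w)
# cover is a vertex cover of weight at most 2 * lp_opt, as in the analysis
```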
25

     Proof:
       C  : the solution of the algorithm.
       C* : the optimal set of the integer program.
       Z* : the optimal value of the linear program.

     Obviously Z* ≤ w(C*), and C is a vertex cover (each edge has an
     endpoint with x ≥ 0.5).

       Z* = Σ_{v∈V} w(v) x(v)
          ≥ Σ_{v: x(v)≥0.5} w(v) x(v)
          ≥ Σ_{v: x(v)≥0.5} 0.5 w(v)
          = Σ_{v∈C} 0.5 w(v) = 0.5 w(C).

     Hence 0.5 w(C) ≤ Z* ≤ w(C*), so ρ = 2.
26

   • Håstad proved that there is no polynomial-time algorithm for
     vertex cover that achieves an approximation ratio better than
     1.16 unless P = NP (1997).
   • Now we consider another example.
27

            Case Study 2: MAXSAT Problem

     MAXSAT: given a k-CNF formula in which each clause Cj has weight
     wj ≥ 0, find an assignment to the variables that maximizes

       Σ_{Cj is true} wj.

     Example:
       Φ = C1 ∧ C2 ∧ C3, where C1, C2, C3 are clauses over x1, …, x4
       with weights w1, w2, w3, and the goal is to maximize the total
       weight of the satisfied clauses.
28

   • We state the following randomized algorithm for this problem
     (Johnson, 1974):

       1. Assign each variable x1, …, xn to 0 or 1 independently at
          random with probability 0.5.
       2. Return Σ_{Cj is true} wj.

     Claim: the preceding algorithm has approximation bound
       ρ = (1 − 1/2^k)^(−1).
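The claim can be checked exactly on a small instance: compute E{Σ wj} by averaging over all 2^n assignments and compare it with Σj wj (1 − 2^(−kj)), the closed form that linearity of expectation gives for clause lengths kj. The example clauses, weights, and helper name below are hypothetical:

```python
from itertools import product
from fractions import Fraction

# Hypothetical weighted clauses; a literal is (variable index, is_positive).
clauses = [[(0, True), (1, True)],              # (x1 or x2),         weight 1
           [(0, False), (1, True), (2, True)],  # (~x1 or x2 or x3),  weight 2
           [(2, False)]]                        # (~x3),              weight 4
weights = [1, 2, 4]

def expected_weight_uniform(clauses, weights, n):
    """Exact expected satisfied weight under Johnson's algorithm (each
    variable set True/False independently with probability 1/2),
    computed here by averaging over all 2^n assignments."""
    total = Fraction(0)
    for x in product([False, True], repeat=n):
        sat = sum(wj for cj, wj in zip(clauses, weights)
                  if any(x[v] if pos else not x[v] for v, pos in cj))
        total += Fraction(sat, 2 ** n)
    return total

# Linearity of expectation gives the closed form sum_j wj * (1 - 2^(-kj)).
analytic = sum(Fraction(wj) * (1 - Fraction(1, 2 ** len(cj)))
               for cj, wj in zip(clauses, weights))
exact = expected_weight_uniform(clauses, weights, 3)
```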
29

     Proof: define the following random variables:

       Ij = 1 if clause Cj is satisfied, 0 otherwise.

       Σ_{Cj is true} wj = Σ_j wj Ij
       P(Ij = 1) = 1 − 1/2^k   (each clause has k literals)
       P(Ij = 0) = 1/2^k

       E{Σ_{Cj true} wj} = Σ_j wj E{Ij} = (1 − 1/2^k) Σ_j wj.

     Hence ρ = (1 − 1/2^k)^(−1).
30

   • We can extend Johnson's algorithm as follows:

       1. Assign each variable xi to 1 or 0 independently at random
          with probabilities pi and 1 − pi.
       2. Return Σ_{Cj is true} wj.

     Claim:

       E{Σ_{Cj true} wj}
         = Σ_j wj (1 − ∏_{xi ∈ Cj} (1 − pi) ∏_{x̄i ∈ Cj} pi).

     The proof is just the same.
31

   • Now we model MAXSAT as an integer program and relax it.

     Integer programming model for MAXSAT:

       max  Σ_j wj zj
       s.t. Σ_{xi ∈ Cj} yi + Σ_{x̄i ∈ Cj} (1 − yi) ≥ zj   for every Cj
            0 ≤ yi ≤ 1,  0 ≤ zj ≤ 1,  yi, zj ∈ Z

     Relaxation for MAXSAT: the same program with yi, zj ∈ R.
32

     Algorithm:
       1. Solve the LP and find the optimal vector (y*, z*).
       2. Run the extended Johnson algorithm with pi = yi*.
       3. Return the total weight of satisfied clauses.

     Lemma: for any feasible solution (y*, z*) of the LP and any
     clause Cj with k literals,

       1 − ∏_{xi ∈ Cj} (1 − yi*) ∏_{x̄i ∈ Cj} yi* ≥ (1 − (1 − 1/k)^k) zj*.

     Claim: the above algorithm has approximation ratio (1 − 1/e)^(−1).
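The lemma is the analytic heart of the claim, and its inequality can be spot-checked numerically. The sketch below (function names are mine) computes, for a single clause, the satisfaction probability under the extended Johnson rounding and the LP-side lower bound (1 − (1 − 1/k)^k)·z, taking z as the tightest value the LP constraint allows, and checks the lemma on a grid of fractional values:

```python
from itertools import product

def clause_sat_prob(y_pos, y_neg):
    """P[clause satisfied] when each variable xi is independently set True
    with probability yi: the clause fails only if every positively
    occurring variable is False and every negated one is True."""
    p_unsat = 1.0
    for y in y_pos:        # probabilities for positively occurring vars
        p_unsat *= (1.0 - y)
    for y in y_neg:        # probabilities for negated vars
        p_unsat *= y
    return 1.0 - p_unsat

def lemma_holds(y_pos, y_neg, eps=1e-9):
    """Check the lemma for one clause with k literals, with z taken as
    min(1, sum of the clause's LP contributions): the satisfaction
    probability must be at least (1 - (1 - 1/k)^k) * z."""
    k = len(y_pos) + len(y_neg)
    z = min(1.0, sum(y_pos) + sum(1.0 - y for y in y_neg))
    return clause_sat_prob(y_pos, y_neg) + eps >= (1 - (1 - 1 / k) ** k) * z

grid = [i / 10 for i in range(11)]  # fractional values 0.0, 0.1, ..., 1.0
```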
33

     Proof:

       E{weight} = Σ_j wj {1 − ∏_{xi ∈ Cj} (1 − yi*) ∏_{x̄i ∈ Cj} yi*}
                 ≥ (1 − (1 − 1/k)^k) Σ_j wj zj*
                 = (1 − (1 − 1/k)^k) Z*_LP
                 ≥ (1 − (1 − 1/k)^k) Z*_IP
                 ≥ (1 − 1/e) Z*_IP.

     Hence ρ = (1 − 1/e)^(−1).
35

   • A combined algorithm:
       1. Compute the value from Johnson's algorithm (w1).
       2. Compute the value from the LP-based algorithm (w2).
       3. Return MAX(w1, w2).
36

               Conclusion

   • In this talk the randomized rounding method was explored with two
     examples.
   • This method opens new insight into approximation algorithms and
     complexity theory.

                              The End