Datamining 6th Svm
• k-NN (review): a test example is classified by the labels (Yes / No) of its nearest neighbors in the training data.
[Figure: training data and test data]
• Training data: $(x_i, y_i)$ $(i = 1, \dots, l,\; x_i \in \mathbb{R}^n,\; y_i \in \{1, -1\})$, where $y_i$ is the label (1 or $-1$) of example $x_i$.
• Find $w, b$ that classify every training example correctly:
$$y_i (w \cdot x_i + b) > 0 \quad (i = 1, \dots, l)$$
• Classify a point $x$ by which side of the hyperplane it falls on, i.e. by the sign of $w \cdot x + b$ (sketch below):
$$d(x) = \begin{cases} 1, & \text{if } w \cdot x + b \ge 0 \\ -1, & \text{otherwise} \end{cases}$$
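A minimal sketch of this decision rule in Python (NumPy assumed); the weight vector, bias, and test points are made-up values.

```python
import numpy as np

def d(x, w, b):
    """Linear decision function: +1 if w.x + b >= 0, otherwise -1."""
    return 1 if np.dot(w, x) + b >= 0 else -1

# Hand-picked (hypothetical) parameters and test points.
w, b = np.array([1.0, -2.0]), 0.5
print(d(np.array([3.0, 1.0]), w, b))   # 1  (1.0*3 - 2.0*1 + 0.5 = 1.5 >= 0)
print(d(np.array([0.0, 2.0]), w, b))   # -1 (0 - 4 + 0.5 < 0)
```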
Fisher's linear discriminant (1)
• Find a direction that separates the two classes well.
• The decision boundary is the hyperplane $w \cdot x + b = 0$.
Fisher's linear discriminant (2)
• Class means $m_+$ and $m_-$:
$$m_+ = \frac{\sum_{d(x)=1} x}{|\{x \mid d(x) = 1\}|}, \qquad m_- = \frac{\sum_{d(x)=-1} x}{|\{x \mid d(x) = -1\}|}$$
• Separation of the projected means: $|(m_+ - m_-) \cdot w|$
  • the larger, the better.
• Scatter of each class around its mean, projected onto the normal $w$ of $w \cdot x + b = 0$:
$$\sum_{d(x)=1} ((x - m_+) \cdot w)^2 + \sum_{d(x)=-1} ((x - m_-) \cdot w)^2$$
  • the smaller, the better.
Fisher's linear discriminant (3)
• Combine the two quantities into a single criterion.
• Under $|w| = 1$, find the $w$ that maximizes $J(w)$:
  • the projection of a point is $w \cdot x + b$;
  • $b$ does not appear in $J(w)$ and is chosen afterwards.
$$J(w) = \frac{|(m_+ - m_-) \cdot w|^2}{\sum_{d(x)=1} ((x - m_+) \cdot w)^2 + \sum_{d(x)=-1} ((x - m_-) \cdot w)^2}$$
$J(w)$ is maximized at a $w$ where the derivative of $J(w)$ with respect to $w$ is 0.
Fisher's linear discriminant (4)
Rewrite $J(w)$ with the between-class and within-class scatter matrices $S_B$, $S_W$:
$$J(w) = \frac{w^T S_B w}{w^T S_W w}$$
$$S_B = (m_+ - m_-)(m_+ - m_-)^T$$
$$S_W = \sum_{d(x)=1} (x - m_+)(x - m_+)^T + \sum_{d(x)=-1} (x - m_-)(x - m_-)^T$$
Set the derivative to zero, $0 = \dfrac{\partial J(w)}{\partial w}$, using the quotient rule $\left(\dfrac{f}{g}\right)' = \dfrac{f'g - fg'}{g^2}$:
$$(w^T S_B w)\, S_W w = (w^T S_W w)\, S_B w$$
Both $w^T S_B w$ and $w^T S_W w$ are scalars, and $S_B w$ points in the direction of $m_+ - m_-$, so (assuming $S_W$ is invertible)
$$w \propto S_W^{-1} (m_+ - m_-)$$
(A NumPy sketch of this closed form follows.)
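This closed form is easy to implement directly; below is a minimal NumPy sketch (the two sample arrays are made-up data, and the returned direction is normalized so that $|w| = 1$).

```python
import numpy as np

def fisher_direction(X_pos, X_neg):
    """Fisher discriminant direction w proportional to S_W^{-1} (m+ - m-)."""
    m_pos, m_neg = X_pos.mean(axis=0), X_neg.mean(axis=0)
    # Within-class scatter S_W = sum (x - m+)(x - m+)^T + sum (x - m-)(x - m-)^T
    S_W = (X_pos - m_pos).T @ (X_pos - m_pos) + (X_neg - m_neg).T @ (X_neg - m_neg)
    w = np.linalg.solve(S_W, m_pos - m_neg)
    return w / np.linalg.norm(w)          # normalize so |w| = 1

# Made-up 2-D data for illustration.
X_pos = np.array([[2.0, 2.1], [3.0, 2.9], [2.5, 3.2]])
X_neg = np.array([[0.0, 0.2], [0.5, -0.3], [-0.4, 0.1]])
print(fisher_direction(X_pos, X_neg))
```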
SVM (Support Vector Machine)
• The margin $\rho(w, b)$ of a separating hyperplane is the gap between the projections of the two classes onto the normal direction (sketch below):
$$\rho(w, b) = \min_{\{x_i \mid y_i = 1\}} \frac{x_i \cdot w}{|w|} - \max_{\{x_i \mid y_i = -1\}} \frac{x_i \cdot w}{|w|}$$
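A direct NumPy transcription of this definition (the weight vector and the toy data below are hypothetical; $b$ cancels in the difference, so it does not appear):

```python
import numpy as np

def margin(X, y, w):
    """rho(w, b) = min_{y_i=+1} x_i.w/|w| - max_{y_i=-1} x_i.w/|w|."""
    proj = X @ w / np.linalg.norm(w)
    return proj[y == 1].min() - proj[y == -1].max()

# Hypothetical separable toy data.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, -2.0]])
y = np.array([1, 1, -1, -1])
print(margin(X, y, np.array([1.0, 1.0])))
```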
Scale $w$, $b$ so that the examples closest to the hyperplane satisfy $w_0 \cdot x + b_0 = \pm 1$; call the scaled parameters $w_0$, $b_0$. Then
$$\rho(w_0, b_0) = \min_{\{x_i \mid y_i = 1\}} \frac{x_i \cdot w_0}{|w_0|} - \max_{\{x_i \mid y_i = -1\}} \frac{x_i \cdot w_0}{|w_0|} = \frac{1 - b_0}{|w_0|} - \frac{-1 - b_0}{|w_0|} = \frac{2}{|w_0|}$$
• Maximizing the margin $2/|w_0|$ is equivalent to minimizing $w_0 \cdot w_0$ subject to
$$y_i (w_0 \cdot x_i + b) \ge 1 \quad (i = 1, \dots, l)$$
• Minimize $w_0 \cdot w_0$ over $w_0$ and $b$ under these constraints.
• The objective is quadratic (degree 2) and the constraints are linear (degree 1): a quadratic programming problem.
Lagrangian formulation (1)
$$y_i (w_0 \cdot x_i + b) \ge 1 \quad (i = 1, \dots, l) \tag{1}$$
Minimize $w_0 \cdot w_0$ over $w_0$ subject to (1). Introduce Lagrange multipliers $\Lambda = (\alpha_1, \dots, \alpha_l)$ $(\alpha_i \ge 0)$, one per constraint:
$$L(w, b, \Lambda) = \frac{|w|^2}{2} - \sum_{i=1}^{l} \alpha_i \big( y_i (x_i \cdot w + b) - 1 \big)$$
• The solution is a saddle point: minimize $L$ with respect to $w, b$ and maximize it with respect to $\Lambda$.
Lagrangian formulation (2)
• At the minimum $w = w_0$, $b = b_0$, the derivatives of $L(w, b, \Lambda)$ vanish:
$$\frac{\partial L(w, b, \Lambda)}{\partial w}\bigg|_{w = w_0} = w_0 - \sum_{i=1}^{l} \alpha_i y_i x_i = 0, \qquad
\frac{\partial L(w, b, \Lambda)}{\partial b}\bigg|_{b = b_0} = -\sum_{i=1}^{l} \alpha_i y_i = 0 \tag{2}$$
$$w_0 = \sum_{i=1}^{l} \alpha_i y_i x_i, \qquad \sum_{i=1}^{l} \alpha_i y_i = 0$$
• Substituting $w = w_0$, $b = b_0$ back into $L$:
$$L(w_0, b_0, \Lambda) = \frac{1}{2} w_0 \cdot w_0 - \sum_{i=1}^{l} \alpha_i \big[ y_i (x_i \cdot w_0 + b_0) - 1 \big]
= \sum_{i=1}^{l} \alpha_i - \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l} \alpha_i \alpha_j y_i y_j \, x_i \cdot x_j$$
• $w$ and $b$ have been eliminated; only $\Lambda$ remains.
The SVM dual problem
• Find the $\Lambda$ that maximizes
$$L(w_0, b_0, \Lambda) = \sum_{i=1}^{l} \alpha_i - \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l} \alpha_i \alpha_j y_i y_j \, x_i \cdot x_j
\quad \text{subject to} \quad \sum_{i=1}^{l} \alpha_i y_i = 0, \; \alpha_i \ge 0 \tag{3}$$
• Maximizing over $\Lambda$ solves the SVM (a QP sketch follows below).
• $w_0$ is recovered from $\Lambda$ by (2): $w_0 = \sum_{i=1}^{l} \alpha_i y_i x_i$.
• By (2), only the $x_i$ with $\alpha_i \ne 0$ contribute to $w$; these are the support vectors, characterized by the KKT conditions.
  • KKT condition: $\alpha_i \big[ y_i (x_i \cdot w_0 + b_0) - 1 \big] = 0$
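One way to solve the dual (3) numerically is as an off-the-shelf quadratic program. The sketch below assumes the cvxopt package is available; the toy data and the $10^{-6}$ support-vector threshold are my own choices, not from the slides.

```python
import numpy as np
from cvxopt import matrix, solvers

def svm_dual_hard(X, y):
    """Maximize sum(a) - 1/2 a^T Q a  s.t.  y^T a = 0, a >= 0  (hard-margin dual (3))."""
    l = len(y)
    Q = (y[:, None] * X) @ (y[:, None] * X).T            # Q_ij = y_i y_j x_i . x_j
    # cvxopt minimizes 1/2 a^T P a + q^T a  s.t.  G a <= h,  A a = b
    P, q = matrix(Q), matrix(-np.ones(l))
    G, h = matrix(-np.eye(l)), matrix(np.zeros(l))        # -a <= 0, i.e. a >= 0
    A, b = matrix(y.reshape(1, -1).astype(float)), matrix(0.0)
    sol = solvers.qp(P, q, G, h, A, b)
    alpha = np.ravel(sol['x'])
    w0 = ((alpha * y)[:, None] * X).sum(axis=0)            # w0 = sum alpha_i y_i x_i, from (2)
    sv = alpha > 1e-6                                      # support vectors: alpha_i != 0
    b0 = np.mean(y[sv] - X[sv] @ w0)                       # from KKT: y_i (x_i . w0 + b0) = 1
    return w0, b0, alpha

# Separable toy data (hypothetical).
X = np.array([[2.0, 2.0], [1.5, 2.5], [-1.0, -1.0], [-2.0, -1.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
print(svm_dual_hard(X, y))
```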
Extending to nonlinear boundaries (feature maps)
• The dual objective depends on the training examples only through their inner products:
$$L(w_0, b_0, \Lambda) = \sum_{i=1}^{l} \alpha_i - \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l} \alpha_i \alpha_j y_i y_j \, x_i \cdot x_j$$
• Map each $x$ to a feature vector $\Phi(x)$ and run the same optimization in the feature space:
$$L(w_0, b_0, \Lambda) = \sum_{i=1}^{l} \alpha_i - \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l} \alpha_i \alpha_j y_i y_j \, \Phi(x_i) \cdot \Phi(x_j)$$
• The separating boundary becomes
$$\Phi(x) \cdot w_0 + b_0 = \sum_{i=1}^{l} \alpha_i y_i \, \Phi(x) \cdot \Phi(x_i) + b_0 = 0$$
• Only inner products of $\Phi$ are ever needed.
Kernel functions
• A kernel computes the feature-space inner product directly: $K(x, y) = \Phi(x) \cdot \Phi(y)$.
• Example: for $\Phi((x_1, x_2)) = (x_1^2,\; \sqrt{2}\, x_1 x_2,\; x_2^2,\; \sqrt{2}\, x_1,\; \sqrt{2}\, x_2,\; 1)$,
$$\Phi((x_1, x_2)) \cdot \Phi((y_1, y_2)) = (x_1 y_1)^2 + 2 x_1 y_1 x_2 y_2 + (x_2 y_2)^2 + 2 x_1 y_1 + 2 x_2 y_2 + 1 = (x_1 y_1 + x_2 y_2 + 1)^2 = ((x_1, x_2) \cdot (y_1, y_2) + 1)^2$$
  • The inner product in the (6-dimensional) feature space is computed without ever forming $\Phi$.
• Commonly used kernels (see the sketch below):
  • polynomial: $(x \cdot y + 1)^d$
  • RBF (Gaussian): $\exp(-\|x - y\|^2 / 2\sigma^2)$
  • sigmoid: $\tanh(\kappa\, x \cdot y - \delta)$
  • $\sigma$, $\kappa$, $\delta$ are parameters.
  • Whether a function is a valid kernel (i.e., corresponds to some $\Phi$) is characterized by Mercer's condition.
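A small NumPy sketch of the three kernels above, plus a numerical check that the explicit 6-dimensional $\Phi$ reproduces the degree-2 polynomial kernel; the parameter values are arbitrary.

```python
import numpy as np

def poly_kernel(x, y, d=2):
    return (np.dot(x, y) + 1) ** d

def rbf_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def sigmoid_kernel(x, y, kappa=1.0, delta=0.0):
    return np.tanh(kappa * np.dot(x, y) - delta)

def phi(v):
    """Explicit feature map whose inner product equals the degree-2 polynomial kernel in 2-D."""
    x1, x2 = v
    return np.array([x1**2, np.sqrt(2)*x1*x2, x2**2, np.sqrt(2)*x1, np.sqrt(2)*x2, 1.0])

x, y = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(np.dot(phi(x), phi(y)), poly_kernel(x, y))   # both equal (x.y + 1)^2 = 4.0
```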
Soft margin
• Real data are often not linearly separable.
• Allow each example to violate the margin by a slack $\xi_i$:
$$y_i (w \cdot x_i + b) \ge 1 - \xi_i, \quad \text{where } \xi_i \ge 0 \; (i = 1, \dots, l)$$
• Minimize the margin term plus a penalty on the total slack (sketch below):
$$\frac{1}{2}\, w \cdot w + C \sum_{i=1}^{l} \xi_i$$
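A minimal NumPy sketch that computes the slacks implied by the constraints, $\xi_i = \max(0,\ 1 - y_i(w \cdot x_i + b))$, and the resulting objective; $w$, $b$, $C$ and the data are placeholder values.

```python
import numpy as np

def soft_margin_objective(w, b, X, y, C):
    """(1/2) w.w + C * sum(xi_i), with xi_i = max(0, 1 - y_i (w.x_i + b))."""
    xi = np.maximum(0.0, 1.0 - y * (X @ w + b))
    return 0.5 * np.dot(w, w) + C * xi.sum(), xi

# Toy, not-quite-separable data (hypothetical values).
X = np.array([[2.0, 2.0], [1.0, 0.5], [-1.0, -1.0], [0.2, 0.1]])
y = np.array([1, 1, -1, -1])
obj, xi = soft_margin_objective(np.array([1.0, 1.0]), 0.0, X, y, C=1.0)
print(obj, xi)   # examples inside the margin or misclassified get xi_i > 0
```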
Soft margin (1)
• Introduce multipliers $\Lambda = (\alpha_1, \dots, \alpha_l)$ and $R = (r_1, \dots, r_l)$ for the two families of constraints; the Lagrangian is
$$L(w, \xi, b, \Lambda, R) = \frac{1}{2}\, w \cdot w + C \sum_{i=1}^{l} \xi_i - \sum_{i=1}^{l} \alpha_i \big[ y_i (x_i \cdot w + b) - 1 + \xi_i \big] - \sum_{i=1}^{l} r_i \xi_i$$
• At the optimum $w_0$, $b_0$, $\xi_i^0$, the derivatives of $L$ with respect to $w$, $b$, $\xi_i$ are 0 (KKT):
$$\frac{\partial L(w, \xi, b, \Lambda, R)}{\partial w}\bigg|_{w = w_0} = w_0 - \sum_{i=1}^{l} \alpha_i y_i x_i = 0$$
$$\frac{\partial L(w, \xi, b, \Lambda, R)}{\partial b}\bigg|_{b = b_0} = -\sum_{i=1}^{l} \alpha_i y_i = 0$$
$$\frac{\partial L(w, \xi, b, \Lambda, R)}{\partial \xi_i}\bigg|_{\xi_i = \xi_i^0} = C - \alpha_i - r_i = 0$$
Soft margin (2)
• Substituting these back eliminates $w$, $b$, and $\xi$; $C$ and $\xi$ drop out of the objective, which has the same form as in the hard-margin SVM:
$$L(w, \xi, b, \Lambda, R) = \sum_{i=1}^{l} \alpha_i - \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l} \alpha_i \alpha_j y_i y_j \, x_i \cdot x_j$$
• The only difference is that each $\alpha_i$ is now bounded above by $C$.
  • $C$ controls the trade-off between a large margin and small slack.
• From $C - \alpha_i - r_i = 0$ and $r_i \ge 0$: $0 \le \alpha_i \le C$.
• The dual problem: maximize, with respect to $\Lambda$ (independently of $w$, $b$),
$$L(w, \xi, b, \Lambda, R) = \sum_{i=1}^{l} \alpha_i - \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l} \alpha_i \alpha_j y_i y_j \, x_i \cdot x_j
\quad \text{subject to} \quad \sum_{i=1}^{l} \alpha_i y_i = 0, \; 0 \le \alpha_i \le C$$
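In the QP sketch given earlier for the hard-margin dual, the only change this box constraint requires is an extra upper bound on each $\alpha_i$ (still assuming cvxopt; the sizes and $C$ below are placeholders):

```python
import numpy as np
from cvxopt import matrix

# Encode 0 <= alpha_i <= C as G a <= h: rows of -I give alpha_i >= 0, rows of +I give alpha_i <= C.
l, C = 4, 1.0
G = matrix(np.vstack([-np.eye(l), np.eye(l)]))
h = matrix(np.hstack([np.zeros(l), C * np.ones(l)]))
```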
Reference: the Karush-Kuhn-Tucker (KKT) conditions
• Constrained optimization: minimize $f(x)$ subject to $g_i(x) \le 0$, where $x = (x_1, x_2, \dots, x_n)$.
• KKT conditions (a small worked example follows):
$$\frac{\partial f(x)}{\partial x_j} + \sum_{i=1}^{m} \lambda_i \frac{\partial g_i(x)}{\partial x_j} = 0, \quad j = 1, 2, \dots, n$$
$$\lambda_i g_i(x) = 0, \quad \lambda_i \ge 0, \quad g_i(x) \le 0, \quad i = 1, 2, \dots, m$$
• When $f(x)$ and the $g_i(x)$ are convex, any $(x, \lambda)$ satisfying the KKT conditions minimizes $f(x)$.
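As a worked illustration of these conditions (my own example, not from the slides): minimize $f(x) = x^2$ subject to $g(x) = 1 - x \le 0$.
$$\frac{\partial f}{\partial x} + \lambda \frac{\partial g}{\partial x} = 2x - \lambda = 0, \qquad \lambda (1 - x) = 0, \quad \lambda \ge 0, \quad 1 - x \le 0$$
If $\lambda = 0$ the first equation gives $x = 0$, which violates $1 - x \le 0$; so $1 - x = 0$, i.e. $x = 1$ and $\lambda = 2$. Since $f$ and $g$ are convex, $x = 1$ is the constrained minimizer.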
SMO (Sequential Minimal Optimization)
• An algorithm for solving the SVM dual problem.
• The variables to optimize are $\Lambda = (\alpha_1, \alpha_2, \dots, \alpha_l)$.
• Optimizing all $\alpha_i$ at once is expensive:
  • with 6000 training examples there are 6000 variables.
• Instead, repeatedly pick a pair $(\alpha_i, \alpha_j)$ and optimize just those two, keeping the rest fixed.
  • Two variables (not one) are needed: the equality constraint ties the $\alpha_i$ together, so a single $\alpha_i$ cannot move alone.
• This strategy is SMO.
• Write the objective (the dual Lagrangian) as $L_D$ (a NumPy transcription follows):
$$L_D = L(w, \xi, b, \Lambda, R) = \sum_{i=1}^{l} \alpha_i - \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l} \alpha_i \alpha_j y_i y_j \, x_i \cdot x_j$$
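A direct NumPy transcription of $L_D$, handy for checking that each SMO step actually increases the objective; $K$ is the Gram matrix $K_{ij} = x_i \cdot x_j$ and the toy values are arbitrary (chosen so that $\sum_i \alpha_i y_i = 0$).

```python
import numpy as np

def dual_objective(alpha, y, K):
    """L_D = sum(alpha) - 1/2 * sum_ij alpha_i alpha_j y_i y_j K_ij."""
    ay = alpha * y
    return alpha.sum() - 0.5 * ay @ K @ ay

X = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
y = np.array([1.0, 1.0, -1.0])
K = X @ X.T                                   # linear-kernel Gram matrix
print(dual_objective(np.array([0.1, 0.1, 0.2]), y, K))
```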
Optimizing two variables (1)
• Optimize $L_D$ with respect to $\alpha_1, \alpha_2$ only, moving from $\alpha_1^{old}, \alpha_2^{old}$ to $\alpha_1^{new}, \alpha_2^{new}$ (all other $\alpha_i$ fixed). Define
$$E_i^{old} \equiv w^{old} \cdot x_i + b^{old} - y_i$$
$$\eta \equiv 2 K_{12} - K_{11} - K_{22}, \quad \text{where } K_{ij} = x_i \cdot x_j$$
• The unconstrained optimum for $\alpha_2$ is
$$\alpha_2^{new} = \alpha_2^{old} - \frac{y_2 (E_1^{old} - E_2^{old})}{\eta}$$
• Because of $\sum_{i=1}^{l} \alpha_i y_i = 0$, the pair moves along the line $\gamma \equiv \alpha_1 + s\, \alpha_2 = \text{const.}$ (with $s = y_1 y_2$); the update above comes from setting the derivative of $L_D$ along this line to 0 ($L_D' = 0$).
• Note that
$$\eta = 2 K_{12} - K_{11} - K_{22} = -|x_2 - x_1|^2 \le 0$$
Optimizing two variables (2)
• $\alpha_1, \alpha_2$ stay on the line $\gamma \equiv \alpha_1 + s\, \alpha_2 = \text{const.}$
• The new values $\alpha_1^{new}, \alpha_2^{new}$ must also lie in the box $[0, C]$, so $\alpha_2^{new}$ is clipped to $\alpha_2^{clipped}$.
[Figures (A), (B)]
Optimizing two variables (3)
If $y_1 = y_2$ ($s = 1$):
$$L = \max(0,\ \alpha_1^{old} + \alpha_2^{old} - C), \qquad H = \min(C,\ \alpha_1^{old} + \alpha_2^{old})$$
If $y_1 \ne y_2$ ($s = -1$):
$$L = \max(0,\ \alpha_2^{old} - \alpha_1^{old}), \qquad H = \min(C,\ C + \alpha_2^{old} - \alpha_1^{old})$$
$\alpha_2$ must satisfy $L \le \alpha_2 \le H$ (the bounds follow from $s$ and $\gamma$), so the unconstrained optimum is clipped (see the sketch below):
$$\alpha_2^{clipped} = \begin{cases} H, & \text{if } \alpha_2^{new} \ge H \\ \alpha_2^{new}, & \text{if } L < \alpha_2^{new} < H \\ L, & \text{if } \alpha_2^{new} \le L \end{cases}$$
This clipped value maximizes $L_D$ within the feasible interval.
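A small Python sketch of the bounds $L$, $H$ and the clipping rule above (the example inputs are hypothetical):

```python
def bounds(alpha1_old, alpha2_old, y1, y2, C):
    """Feasible interval [L, H] for alpha2 on the line alpha1 + s*alpha2 = const inside the box [0, C]."""
    if y1 == y2:                                   # s = +1
        return max(0.0, alpha1_old + alpha2_old - C), min(C, alpha1_old + alpha2_old)
    return max(0.0, alpha2_old - alpha1_old), min(C, C + alpha2_old - alpha1_old)   # s = -1

def clip(alpha2_new, L, H):
    """alpha2_clipped."""
    return H if alpha2_new >= H else (L if alpha2_new <= L else alpha2_new)

print(bounds(0.2, 0.9, 1, 1, C=1.0))   # (0.1, 1.0)
print(clip(1.3, 0.1, 1.0))             # 1.0
```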
[Figures (A)–(D): the four cases of the constraint $L \le \alpha_2 \le H$, showing where $(\alpha_1^{new}, \alpha_2^{new})$ is moved to $(\alpha_1^{new}, \alpha_2^{clipped})$ in each case.]
Two-variable update procedure
1. Compute $\eta = 2 K_{12} - K_{11} - K_{22}$.
2. If $\eta < 0$, update the pair:
   (a) $\alpha_2^{new} = \alpha_2^{old} + \dfrac{y_2 (E_2^{old} - E_1^{old})}{\eta}$
   (b) clip to $\alpha_2^{clipped}$
   (c) $\alpha_1^{new} = \alpha_1^{old} - s\,(\alpha_2^{clipped} - \alpha_2^{old})$
3. If $\eta = 0$, evaluate $L_D$ at the endpoints $\alpha_2 = L$ and $\alpha_2 = H$, keep the better one, and obtain $\alpha_1$ from 2(c).
4. Update $w$, $E$, and $b$ from the new $\alpha_1, \alpha_2$ (a code sketch follows):
   • choose $b^{new}$ so that $E^{new} = 0$.
$$w^{new} = w^{old} + (\alpha_1^{new} - \alpha_1^{old})\, y_1 x_1 + (\alpha_2^{clipped} - \alpha_2^{old})\, y_2 x_2$$
$$E^{new}(x, y) = E^{old}(x, y) + y_1 (\alpha_1^{new} - \alpha_1^{old})\, x_1 \cdot x + y_2 (\alpha_2^{clipped} - \alpha_2^{old})\, x_2 \cdot x - b^{old} + b^{new}$$
$$b^{new} = b^{old} - E^{old}(x, y) - y_1 (\alpha_1^{new} - \alpha_1^{old})\, x_1 \cdot x - y_2 (\alpha_2^{clipped} - \alpha_2^{old})\, x_2 \cdot x$$
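A sketch of this step for a linear kernel, following the formulas above. It recomputes $w$ and $E_i$ from scratch instead of caching them, skips the degenerate $\eta = 0$ case, and fixes $b^{new}$ by zeroing the error at $(x_1, y_1)$; these simplifications are mine, not the slides'.

```python
import numpy as np

def smo_step(i, j, alpha, X, y, b, C):
    """One two-variable SMO update (linear kernel). Returns (alpha, b) or None if eta >= 0."""
    x1, x2, y1, y2 = X[i], X[j], y[i], y[j]
    w = ((alpha * y)[:, None] * X).sum(axis=0)                    # w_old = sum_k alpha_k y_k x_k
    E1 = np.dot(w, x1) + b - y1                                   # E_i = w_old . x_i + b_old - y_i
    E2 = np.dot(w, x2) + b - y2
    eta = 2 * np.dot(x1, x2) - np.dot(x1, x1) - np.dot(x2, x2)    # step 1
    if eta >= 0:                                                  # degenerate case (step 3) not handled
        return None
    if y1 == y2:                                                  # bounds L, H from the previous slide
        L, H = max(0.0, alpha[i] + alpha[j] - C), min(C, alpha[i] + alpha[j])
    else:
        L, H = max(0.0, alpha[j] - alpha[i]), min(C, C + alpha[j] - alpha[i])
    a2_new = alpha[j] + y2 * (E2 - E1) / eta                      # step 2(a)
    a2_clip = float(np.clip(a2_new, L, H))                        # step 2(b)
    a1_new = alpha[i] - y1 * y2 * (a2_clip - alpha[j])            # step 2(c), s = y1 * y2
    # Step 4: choose b_new so that the error at (x1, y1) becomes 0.
    b_new = b - E1 - y1 * (a1_new - alpha[i]) * np.dot(x1, x1) \
                  - y2 * (a2_clip - alpha[j]) * np.dot(x2, x1)
    alpha = alpha.copy()
    alpha[i], alpha[j] = a1_new, a2_clip
    return alpha, b_new
```

In a full SMO loop this step would be applied repeatedly to pairs chosen by the heuristics described next, until no example violates the KKT conditions.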
Choosing the pair $\alpha_1$, $\alpha_2$
• $\alpha_1$: pick an example that violates the KKT conditions (if every example satisfies them, the optimum has been reached).
  • The outer loop alternates between passes over all examples and passes over only the examples with $0 < \alpha_i < C$.
• $\alpha_2$: pick the example expected to move $L_D$ the most.
  • Heuristically, the one that maximizes $|E_1 - E_2|$.
  • E.g., if $E_1$ is positive, choose the $\alpha_2$ with the smallest $E_2$.
SVM training with SMO: summary
• Repeat until no example violates the KKT conditions:
  • choose $\alpha_1$ (scanning all examples, or only the candidates with $\alpha \ne 0$);
  • choose $\alpha_2$, e.g. the one maximizing $|E_2 - E_1|$;
  • optimize the pair $(\alpha_1, \alpha_2)$ with the two-variable update.
• Each step increases $L_D$; termination is checked via the KKT conditions.
Beyond binary classification
• Classification into 3 or more classes (multi-class):
  • e.g., train a two-class classifier for each pair of classes A, B and combine their decisions.
• Predicting a numeric value:
  • regression (regression problem);
  • e.g., a target between 0 and 100 can also be discretized into bins 0–10, 10–20, ... and treated as classification.
• Learning from examples of a single class:
  • e.g., given only Web pages a user liked (say 100 of them), find similar Web pages;
  • One Class SVM.
(A scikit-learn sketch of these variants follows.)
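For completeness, a hedged sketch of how these three variants look in scikit-learn (assuming it is installed); the class names and parameters below are scikit-learn's, not the slides', and the data are random placeholders.

```python
import numpy as np
from sklearn.svm import SVC, SVR, OneClassSVM

X = np.random.randn(60, 2)

# Multi-class: SVC combines binary SVMs internally (one-vs-one voting).
y3 = np.random.randint(0, 3, size=60)                 # 3 classes
clf = SVC(kernel='rbf', C=1.0).fit(X, y3)

# Regression: SVR fits a continuous target instead of discretizing it into bins.
t = X[:, 0] * 10 + np.random.randn(60)
reg = SVR(kernel='rbf', C=1.0).fit(X, t)

# One-class: OneClassSVM learns a boundary around the data when only one class is given.
oc = OneClassSVM(kernel='rbf', nu=0.1).fit(X)
print(clf.predict(X[:3]), reg.predict(X[:3]), oc.predict(X[:3]))
```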
