Datamining 4th Adaboost
Boosting (AdaBoost)
•  With Naive Bayes we predicted, for each example, whether its class is Yes or No (○ or ×).
   •  From here on the two classes ○ and × are written as 1 and 0.
•  AdaBoost combines many simple classifiers into a single, more accurate one. It rests on three ideas:
   •  every training example carries a weight, revised after each round so that hard examples count more;
   •  in each round a weak classifier (WeakLearner) is trained on the weighted examples;
   •  the weak classifiers are finally combined by weighted voting (Weighted Voting).
Notation
•  Class labels take the two values 1 and 0.
•  There are N training examples.
•  The training data are pairs (X_i, c_i) (i = 1, ..., N), where X_i is the attribute vector of example i and c_i its class label. For instance, example C is

       X_C = (No, Yes, Yes, Yes),  c_C = 1
Setup
•  R: the number of boosting rounds (how many times the WeakLearner is called).
•  Every training example i is given a weight; w_i^t denotes the weight of example i at round t.
•  At the start all N examples get the same weight, and the weights sum to 1:

       w_i^1 = 1/N   (i = 1, ..., N)

•  Example: with 10 training examples, w_i^1 = 1/10 for every i.
The AdaBoost loop: for each round t = 1, ..., R

1. Normalize the weights. Turn the current weights w_i^t into a probability distribution over the examples:

       p_i^t = w_i^t / Σ_{j=1..N} w_j^t

   •  At t = 1 the weights already sum to 1 (Σ_{i=1..N} w_i^1 = 1), so p_i^1 = w_i^1 = 1/10 in the 10-example case.

2. Call the WeakLearner on the weighted examples. The WeakLearner must return a hypothesis h_t whose weighted error ε_t is below 1/2:

       ε_t = Σ_{i=1..N} p_i^t |h_t(X_i) − c_i| < 1/2

   where

       |h_t(X_i) − c_i| = 0 if h_t(X_i) = c_i (classified correctly), and 1 otherwise.

   Then proceed to Step 3 below. (A code sketch of Steps 1 and 2 follows.)
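The slides give no code, but Steps 1 and 2 are easy to state concretely. The sketch below is a minimal plain-Python illustration, not taken from the lecture; the label vector is a hypothetical stand-in chosen only to reproduce the slide's count of 2 mistakes out of 10.

```python
def normalize(w):
    """Step 1: turn the weights w_i^t into a distribution p_i^t = w_i^t / sum_j w_j^t."""
    total = sum(w)
    return [wi / total for wi in w]

def weighted_error(p, predictions, labels):
    """Step 2: eps_t = sum_i p_i^t * |h_t(X_i) - c_i|, the weighted misclassification rate."""
    return sum(pi * abs(h - c) for pi, h, c in zip(p, predictions, labels))

# Round t=1 of the worked example: 10 examples with uniform weights 1/10 and a
# weak hypothesis that misclassifies exactly two of them (the fifth and sixth entries, E and F).
w1 = [1 / 10] * 10
labels = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # hypothetical labels, chosen to match the slide's error counts
h1 = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]       # h_1 predicts 1 for A..F and 0 for G..J
p1 = normalize(w1)
print(weighted_error(p1, h1, labels))     # ~0.2, i.e. eps_1 = 1/5 as on the slide
```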
Example: the WeakLearner's output at t = 1
•  Suppose the hypothesis h_1 returned at t = 1 predicts class 1 for the examples with ID A–F (h_1(X_i) = 1) and class 0 for the examples with ID G–J (h_1(X_i) = 0).
•  Only E and F are misclassified (h_1(X_i) ≠ c_i); the other eight examples are classified correctly.
•  With 10 examples, 2 mistakes, and the uniform distribution p_i^1 = 1/10:

       ε_1 = 1/10 × 2 = 1/5 < 1/2

   so h_1 qualifies as a weak hypothesis.
•  This h_1 is in fact the single test T2 ("predict 1 when T2 = Yes"); how the WeakLearner settles on T2 is shown a little further below.
3. Update the weights for the next round, t + 1. First compute

       β_t = ε_t / (1 − ε_t)          (0 ≤ ε_t < 1/2 implies 0 ≤ β_t < 1)

   and then, using β_t, set

       w_i^{t+1} = w_i^t β_t^{1 − |h_t(X_i) − c_i|}

   •  The smaller ε_t is, the smaller β_t becomes.
   •  Examples that h_t classified correctly have their weight multiplied by β_t < 1, so their weight shrinks; misclassified examples keep their weight unchanged.
   •  In the next round the WeakLearner therefore concentrates on the examples that h_t got wrong. (The sketch below spells out this update.)
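Again as a minimal sketch (plain Python, not from the slides; the labels are the same hypothetical stand-ins as before), Step 3 reduces to one multiplication per example:

```python
def beta(eps):
    """beta_t = eps_t / (1 - eps_t); 0 <= eps_t < 1/2 gives 0 <= beta_t < 1."""
    return eps / (1 - eps)

def update_weights(w, predictions, labels, b):
    """w_i^{t+1} = w_i^t * b**(1 - |h_t(X_i) - c_i|): shrink correct examples, keep wrong ones."""
    return [wi * b ** (1 - abs(h - c)) for wi, h, c in zip(w, predictions, labels)]

# Round t=1 of the worked example: eps_1 = 0.2 gives beta_1 = 0.25.
b1 = beta(0.2)                             # ~0.25
w1 = [1 / 10] * 10
h1 = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]        # as before, the fifth and sixth entries (E and F) are wrong
labels = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]    # hypothetical labels consistent with the slide
print(update_weights(w1, h1, labels, b1))  # ~0.025 for the eight correct examples, 0.1 for E and F
```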
Example (t = 1):
•  ε_1 = 0.2, so

       β_1 = ε_1 / (1 − ε_1) = 0.2/0.8 = 0.25

•  h_1 classified A–D and G–J correctly and misclassified E and F, so

       w_A^2 = w_B^2 = w_C^2 = w_D^2 = w_G^2 = w_H^2 = w_I^2 = w_J^2 = 1/10 × β_1^1 = 0.025
       w_E^2 = w_F^2 = 1/10 × β_1^0 = 0.1
What does the WeakLearner actually do?
•  At t = 1 the WeakLearner returned the test T2. Why T2?
•  In this example the WeakLearner is deliberately simple: every candidate hypothesis is a single test of the form "predict 1 if T_k = Yes" (or its complement "predict 1 if T_k = No"), and the WeakLearner returns the candidate with the smallest weighted error under the current distribution p^t.
•  At t = 1 we therefore compute ε_1 for every candidate; ε_1(T1 = Yes) denotes the error of the hypothesis "predict 1 when T1 = Yes".
•  Under the uniform distribution p_i^1 = 1/10 (i ∈ {A, B, ..., J}):

       ε_1(T1 = Yes) = 1/10 × 4 = 0.4
       ε_1(T2 = Yes) = 1/10 × 2 = 0.2
       ε_1(T3 = Yes) = 1/10 × 3 = 0.3
       ε_1(T4 = Yes) = 1/10 × 2 = 0.2

•  T2 = Yes and T4 = Yes tie for the smallest error; the WeakLearner picks T2, so h_1 is the test T2. (The sketch below shows this selection in code.)
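The selection itself is a simple argmin over the candidates. In the sketch below the 0/1 prediction vectors are hypothetical (the A–J data table is not reproduced on these slides); they are chosen only so that the mistake counts 4, 2, 3, 2 match the errors above.

```python
def best_stump(p, candidates, labels):
    """Return (name, predictions, weighted_error) of the candidate with the
    smallest weighted error sum_i p_i * |h(X_i) - c_i| (ties: first one wins)."""
    best = None
    for name, preds in candidates.items():
        err = sum(pi * abs(h - c) for pi, h, c in zip(p, preds, labels))
        if best is None or err < best[2]:
            best = (name, preds, err)
    return best

# Hypothetical prediction vectors for the candidates "predict 1 iff Tk = Yes";
# only their mistake counts (4, 2, 3, 2 under uniform weights) are taken from the slide.
labels = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
candidates = {
    "T1=Yes": [0, 0, 1, 1, 1, 1, 0, 0, 0, 0],   # 4 mistakes -> eps = 0.4
    "T2=Yes": [1, 1, 1, 1, 1, 1, 0, 0, 0, 0],   # 2 mistakes -> eps = 0.2
    "T3=Yes": [1, 1, 1, 0, 1, 0, 1, 0, 0, 0],   # 3 mistakes -> eps = 0.3
    "T4=Yes": [1, 1, 0, 1, 0, 0, 0, 1, 0, 0],   # 2 mistakes -> eps = 0.2
}
p1 = [1 / 10] * 10
print(best_stump(p1, candidates, labels))       # T2=Yes wins the tie with T4=Yes, error ~0.2
```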
Round t = 2
•  The weights carried over from round 1 are

       w_A^2 = w_B^2 = w_C^2 = w_D^2 = w_G^2 = w_H^2 = w_I^2 = w_J^2 = 1/10 × β_1^1 = 0.025
       w_E^2 = w_F^2 = 1/10 × β_1^0 = 0.1

   and their sum is

       Σ_{i∈{A,...,J}} w_i^2 = 8 × 0.025 + 2 × 0.1 = 0.400

•  Normalizing (Step 1) gives the new distribution:

       p_A^2 = p_B^2 = p_C^2 = p_D^2 = p_G^2 = p_H^2 = p_I^2 = p_J^2 = 0.025/0.400 = 0.0625
       p_E^2 = p_F^2 = 0.1/0.400 = 0.25

•  The WeakLearner (Step 2) re-evaluates every candidate test under p^2:

       ε_2(T1 = Yes) = 0.0625 × 2 + 0.25 × 2 = 0.625      (hence ε_2(T1 = No) = 1 − 0.625 = 0.375)
       ε_2(T2 = Yes) = 0.25 + 0.25 = 0.5
       ε_2(T3 = Yes) = 0.0625 × 2 + 0.25 × 1 = 0.375
       ε_2(T4 = Yes) = 0.0625 × 2 = 0.125

•  T4 = Yes now has the smallest error, so h_2 is the test T4.
•  β_2 = ε_2 / (1 − ε_2) = 0.125/0.875 ≈ 0.143. (The short check below reproduces this round.)
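A quick numeric check of round 2 (plain Python, not from the slides; the per-test mistake patterns are read off the error expressions above):

```python
# Reproducing round t=2 from the weights stated above.
w2 = {x: 0.025 for x in "ABCDGHIJ"}
w2.update({"E": 0.1, "F": 0.1})

total = sum(w2.values())                      # ~0.400
p2 = {x: w / total for x, w in w2.items()}
print(total, p2["A"], p2["E"])                # ~0.400, 0.0625, 0.25

# Candidate errors under p^2, using the mistake patterns read off the slide:
eps2 = {
    "T1=Yes": 0.0625 * 2 + 0.25 * 2,          # wrong on two light examples and on E, F -> 0.625
    "T2=Yes": 0.25 + 0.25,                    # wrong on E and F only                   -> 0.5
    "T3=Yes": 0.0625 * 2 + 0.25 * 1,          # wrong on two light examples and one of E, F -> 0.375
    "T4=Yes": 0.0625 * 2,                     # wrong on two light examples only        -> 0.125
}
best = min(eps2, key=eps2.get)
print(best, eps2[best], eps2[best] / (1 - eps2[best]))   # T4=Yes, 0.125, ~0.1429 (= beta_2)
```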
The final hypothesis h_f (weighted voting)
After R rounds, the R weak hypotheses are combined into the final hypothesis:

       h_f(X) = 1   if Σ_{t=1..R} (−log10 β_t) h_t(X) ≥ (1/2) Σ_{t=1..R} (−log10 β_t)
                0   otherwise

   with β_t = ε_t / (1 − ε_t) as before.

•  Each weak hypothesis votes with weight −log β_t; h_f outputs 1 when the weight of the votes for class 1 reaches at least half of the total voting weight, and 0 otherwise.
•  The base of the logarithm does not matter; any base yields the same decision.
•  When ε_t is close to 0, β_t is close to 0, so −log β_t is large: an accurate WeakLearner hypothesis h_t(X) receives a large voting weight.
•  In other words, −log β_t expresses how strongly the vote of h_t(X) is trusted. (A code sketch of h_f follows.)
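A minimal sketch of the weighted vote (plain Python; the function name final_hypothesis is mine, not the lecture's):

```python
import math

def final_hypothesis(betas, h_outputs):
    """AdaBoost's weighted vote: return 1 iff
    sum_t (-log10 beta_t) * h_t(X) >= (1/2) * sum_t (-log10 beta_t)."""
    weights = [-math.log10(b) for b in betas]
    score = sum(wt * h for wt, h in zip(weights, h_outputs))
    return 1 if score >= 0.5 * sum(weights) else 0

# With the worked example's beta_1 = 0.25 and beta_2 = 0.143, an instance for which
# only h_2 votes for class 1 is still classified as 1 (its vote ~0.845 clears the ~0.723 threshold).
print(final_hypothesis([0.25, 0.143], [0, 1]))   # 1
```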
Example: with β_1 = 0.25 and β_2 = 0.143 the voting threshold is

       (1/2) Σ_{t=1..2} (−log10 β_t) = (−log10 0.25) × 1/2 + (−log10 0.143) × 1/2 = 0.723

•  A new example is therefore classified as 1 exactly when its weighted vote reaches at least 0.723.
•  The two weak hypotheses are h_1: "T2 = Yes → C = 1" and h_2: "T4 = Yes → C = 1".
•  For a new example K with h_1(X_K) = 0 and h_2(X_K) = 1:

       Σ_{t=1..2} (−log β_t) h_t(X_K) = (−log10 0.25) × 0 + (−log10 0.143) × 1 = 0.845

   Since 0.845 ≥ 0.723, K is classified as ○ (class 1).
•  For a new example L with h_1(X_L) = 1 and h_2(X_L) = 0:

       Σ_{t=1..2} (−log β_t) h_t(X_L) = (−log10 0.25) × 1 + (−log10 0.143) × 0 = 0.602

   Since 0.602 < 0.723, L is classified as × (class 0). (The check below reproduces these numbers.)
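The same numbers, recomputed (plain Python; K and L are represented only by their two hypothesis outputs, as on the slide):

```python
import math

betas = [0.25, 0.143]                          # beta_1, beta_2 from the worked example
weights = [-math.log10(b) for b in betas]      # voting weights -log10(beta_t)
threshold = 0.5 * sum(weights)

def vote(h_outputs):
    """Weighted vote sum_t (-log10 beta_t) * h_t(X) for one instance."""
    return sum(wt * h for wt, h in zip(weights, h_outputs))

print(round(threshold, 3))        # 0.723
print(round(vote([0, 1]), 3))     # instance K (h_1 = 0, h_2 = 1): 0.845 >= 0.723 -> class 1 (o)
print(round(vote([1, 0]), 3))     # instance L (h_1 = 1, h_2 = 0): 0.602 <  0.723 -> class 0 (x)
```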
How small does AdaBoost's training error get?
•  The training error of h_f is the total initial weight of the examples it misclassifies:

       ε = Σ_{i : h_f(X_i) ≠ y_i} D(i)

   •  D(i) is the initial distribution over the examples, D(i) = w_i^1 = 1/N; in the proofs below the class label c_i is written y_i.
•  If the WeakLearner's error on round t is ε_t, then after R rounds

       ε ≤ ∏_{t=1..R} 2 √(ε_t (1 − ε_t))

•  Because ε_t < 1/2 on every round, every factor satisfies

       2 √(ε_t (1 − ε_t)) < 1

•  So as long as the WeakLearner keeps doing better than random guessing, the bound on ε shrinks with every additional round. (A small computation of the bound follows.)
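To make the bound concrete, the sketch below evaluates it for the WeakLearner errors of the worked example (plain Python; the resulting values are illustrative, not numbers computed on the slides):

```python
import math

def training_error_bound(errors):
    """Upper bound on the training error of h_f: prod_t 2 * sqrt(eps_t * (1 - eps_t))."""
    bound = 1.0
    for e in errors:
        bound *= 2 * math.sqrt(e * (1 - e))
    return bound

# WeakLearner errors from the worked example: eps_1 = 0.2, eps_2 = 0.125.
print(training_error_bound([0.2]))           # ~0.8
print(training_error_bound([0.2, 0.125]))    # ~0.529: each round with eps_t < 1/2 shrinks the bound
```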
Lemma 1: the final weights bound ε from below.
•  After R rounds,

       Σ_{i=1..N} w_i^{R+1} ≥ ε ( ∏_{t=1..R} β_t )^{1/2}

•  Proof. The sum over all examples is at least the sum over the misclassified ones:

       Σ_{i=1..N} w_i^{R+1} ≥ Σ_{i : h_f(X_i) ≠ y_i} w_i^{R+1}

   Applying the update rule w_i^{t+1} = w_i^t β_t^{1 − |h_t(X_i) − y_i|} repeatedly, starting from w_i^1 = D(i),

       Σ_{i : h_f(X_i) ≠ y_i} w_i^{R+1} = Σ_{i : h_f(X_i) ≠ y_i} D(i) ∏_{t=1..R} β_t^{1 − |h_t(X_i) − y_i|}
•  For a misclassified example i (h_f(X_i) ≠ y_i) the product ∏_{t=1..R} β_t^{1 − |h_t(X_i) − y_i|} can be bounded from below. There are two cases: h_f(X_i) = 1 with y_i = 0, and h_f(X_i) = 0 with y_i = 1.
•  Case h_f(X_i) = 1, y_i = 0. By the definition of h_f (the weighted-voting rule above), h_f(X_i) = 1 means

       Σ_{t=1..R} (−log β_t) h_t(X_i) ≥ (1/2) Σ_{t=1..R} (−log β_t)

   i.e. Σ_{t=1..R} (log β_t) h_t(X_i) ≤ (1/2) Σ_{t=1..R} log β_t. Subtracting both sides from Σ_{t=1..R} log β_t gives

       Σ_{t=1..R} (log β_t)(1 − h_t(X_i)) ≥ (1/2) Σ_{t=1..R} (log β_t)

   Because y_i = 0, we have 1 − h_t(X_i) = 1 − |h_t(X_i) − y_i|, so

       Σ_{t=1..R} (log β_t)(1 − |h_t(X_i) − y_i|) ≥ (1/2) Σ_{t=1..R} log β_t

   and exponentiating both sides (base 10),

       ∏_{t=1..R} β_t^{1 − |h_t(X_i) − y_i|} ≥ ( ∏_{t=1..R} β_t )^{1/2}
•  Case h_f(X_i) = 0, y_i = 1. By the definition of h_f, h_f(X_i) = 0 means

       Σ_{t=1..R} (−log β_t) h_t(X_i) < (1/2) Σ_{t=1..R} (−log β_t)

   Multiplying by −1 flips the inequality, and since y_i = 1 we have h_t(X_i) = 1 − |h_t(X_i) − y_i|, so

       ∏_{t=1..R} β_t^{1 − |h_t(X_i) − y_i|} > ( ∏_{t=1..R} β_t )^{1/2}

•  In either case, every misclassified example i satisfies

       ∏_{t=1..R} β_t^{1 − |h_t(X_i) − y_i|} ≥ ( ∏_{t=1..R} β_t )^{1/2}
•  Substituting this bound into the sum over the misclassified examples:

       Σ_{i : h_f(X_i) ≠ y_i} D(i) ∏_{t=1..R} β_t^{1 − |h_t(X_i) − y_i|}
           ≥ Σ_{i : h_f(X_i) ≠ y_i} D(i) ( ∏_{t=1..R} β_t )^{1/2}
           = ( Σ_{i : h_f(X_i) ≠ y_i} D(i) ) ( ∏_{t=1..R} β_t )^{1/2}

•  Combining this with the two relations displayed at the start of the proof:

       Σ_{i=1..N} w_i^{R+1} ≥ Σ_{i : h_f(X_i) ≠ y_i} w_i^{R+1} ≥ ( Σ_{i : h_f(X_i) ≠ y_i} D(i) ) ( ∏_{t=1..R} β_t )^{1/2} = ε ( ∏_{t=1..R} β_t )^{1/2}

   which proves Lemma 1.

Lemma 2: each round multiplies the total weight by at most 2ε_t.
•  For every round t,

       Σ_{i=1..N} w_i^{t+1} ≤ ( Σ_{i=1..N} w_i^t ) × 2 ε_t

•  The derivation uses the following fact: for α ≥ 0 and r ∈ {0, 1},

       α^r ≤ 1 − (1 − α) r
•  Derivation:

       Σ_{i=1..N} w_i^{t+1}
           = Σ_{i=1..N} w_i^t β_t^{1 − |h_t(X_i) − y_i|}
           ≤ Σ_{i=1..N} w_i^t ( 1 − (1 − β_t)(1 − |h_t(X_i) − y_i|) )
           = Σ_{i=1..N} w_i^t − (1 − β_t) ( Σ_{i=1..N} w_i^t − Σ_{i=1..N} w_i^t |h_t(X_i) − y_i| )
           = Σ_{i=1..N} w_i^t − (1 − β_t) ( Σ_{i=1..N} w_i^t − ε_t Σ_{i=1..N} w_i^t )
           = Σ_{i=1..N} w_i^t − (1 − β_t) ( Σ_{i=1..N} w_i^t ) (1 − ε_t)
           = ( Σ_{i=1..N} w_i^t ) ( 1 − (1 − β_t)(1 − ε_t) )
           = ( Σ_{i=1..N} w_i^t ) × 2 ε_t            (substituting β_t = ε_t / (1 − ε_t))

   Here Σ_i w_i^t |h_t(X_i) − y_i| = ε_t Σ_i w_i^t because ε_t = Σ_i p_i^t |h_t(X_i) − y_i| and p_i^t = w_i^t / Σ_j w_j^t. (A numeric check on the worked example follows.)
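Not part of the slides, but a quick check that Lemma 2 holds (with equality here, since every exponent |h − y| is 0 or 1) on round 1 of the worked example:

```python
# Check: sum_i w_i^2 should not exceed (sum_i w_i^1) * 2 * eps_1 = 1 * 2 * 0.2 = 0.4.
eps1 = 0.2
beta1 = eps1 / (1 - eps1)                       # ~0.25
w1 = [1 / 10] * 10
mistakes = [0, 0, 0, 0, 1, 1, 0, 0, 0, 0]       # |h_1(X_i) - y_i|: only E and F are wrong
w2 = [w * beta1 ** (1 - m) for w, m in zip(w1, mistakes)]
print(sum(w2), 2 * eps1 * sum(w1))              # ~0.400 vs ~0.400: equality, as expected
```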
Putting the two lemmas together
•  Theorem: if the WeakLearner's error on round t is ε_t (with ε_t < 1/2), then the training error ε of the final hypothesis h_f satisfies

       ε ≤ ∏_{t=1..R} 2 √(ε_t (1 − ε_t))

•  Proof:

       ε ( ∏_{t=1..R} β_t )^{1/2}
           ≤ Σ_{i=1..N} w_i^{R+1}                         (Lemma 1)
           ≤ ( Σ_{i=1..N} w_i^R ) × 2 ε_R                 (Lemma 2)
           ≤ ( Σ_{i=1..N} w_i^1 ) ∏_{t=1..R} 2 ε_t        (Lemma 2 again, for t = R−1, R−2, ..., 1)
           = ∏_{t=1..R} 2 ε_t                             (because Σ_{i=1..N} w_i^1 = 1)

   Dividing both sides by ( ∏_{t=1..R} β_t )^{1/2} and substituting β_t = ε_t / (1 − ε_t):

       ε ≤ ∏_{t=1..R} 2 ε_t β_t^{−1/2} = ∏_{t=1..R} 2 ε_t √((1 − ε_t)/ε_t) = ∏_{t=1..R} 2 √(ε_t (1 − ε_t))
Summary
•  AdaBoost is built from three ingredients:
   •  every training example carries a weight, updated after each round so that hard examples count more;
   •  in every round a WeakLearner is trained on the weighted examples;
   •  the resulting weak hypotheses are combined by weighted voting (Weighted Voting).
•  Any classifier can serve as the WeakLearner: NaiveBayes, K-NN, SVM, and so on.
   •  Combining several classifiers into a single one in this way is called Ensemble Learning.