4/30/2012




Linear Discriminant Analysis
 Proposed by Fisher (1936) for
 classifying an observation into one of
 two possible groups based on many
 measurements x1, x2, …, xp.

 Seek a linear transformation of the
  variables Y = w1x1 + w2x2 + … + wpxp + a constant








  Linear Discriminant Analysis
 Discriminant analysis constructs an
  equation that minimizes the probability
  of misclassifying cases into their
  respective groups or categories.




The purposes of discriminant analysis (DA)
   Discriminant Function Analysis (DA)
    undertakes the same task as multiple
    linear regression: predicting an
    outcome.
   However, multiple linear regression is
    limited to cases where the dependent
    variable is numerical.
   But many variables of interest are
    categorical.







  The objective of DA is to perform
   dimensionality reduction while preserving as
   much of the class discriminatory information
   as possible
  Assume we have a set of D-dimensional
   samples {x1, x2, …, xN}, N1 of which belong to
   class ω1, and N2 to class ω2.
  We seek to obtain a scalar y by projecting
   the samples x onto a line:
                    y = wᵀx
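As a concrete sketch of this projection (synthetic data and a deliberately simple, non-optimal choice of w; Fisher's criterion would give the optimal direction), note how y = wᵀx turns each D-dimensional sample into a single scalar:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D samples for two classes (illustration only)
X1 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50, 2))  # class ω1, N1 = 50
X2 = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(50, 2))  # class ω2, N2 = 50

# One simple (not optimal) direction: the line joining the two class means
w = X2.mean(axis=0) - X1.mean(axis=0)
w = w / np.linalg.norm(w)

# y = wᵀx reduces each 2-D sample to a scalar
y1 = X1 @ w
y2 = X2 @ w
print(y1.shape, y2.shape)  # (50,) (50,)
```

With classes as well separated as these, the two sets of projected scalars barely overlap, which is the easy-to-discriminate situation.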




•The top two distributions overlap too much and do not
discriminate well compared to the bottom set.
•Misclassification will be minimal in the lower pair,
whereas many samples will be misclassified in the top pair.




                                                                3
4/30/2012




 Linear Discriminant Analysis
      Assume the covariance matrices are equal
      Classify the item x at hand to one of J groups
      based on measurements on p predictors.

      Rule: Assign x to the group j that has the
      closest mean, j = 1, 2, …, J
      Distance Measure: Mahalanobis Distance.




  Linear Discriminant Analysis
  Distance Measure:
         For j = 1, 2, …, J, compute

             d_j(x) = (x − x̄_j)ᵀ S_pl⁻¹ (x − x̄_j)

  Assign x to the group for which d_j(x) is minimum.
  S_pl is the pooled estimate of the covariance
  matrix.
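A minimal sketch of this rule in NumPy (synthetic groups; the function names are my own, not from the slides):

```python
import numpy as np

def pooled_covariance(groups):
    """Pooled covariance estimate S_pl from a list of (n_j, p) arrays."""
    n_total = sum(len(g) for g in groups)
    S = sum((len(g) - 1) * np.cov(g, rowvar=False) for g in groups)
    return S / (n_total - len(groups))

def classify_nearest_mean(x, groups):
    """Assign x to the group j minimizing the Mahalanobis distance
    d_j(x) = (x - xbar_j)' S_pl^{-1} (x - xbar_j)."""
    S_inv = np.linalg.inv(pooled_covariance(groups))
    dists = [(x - g.mean(axis=0)) @ S_inv @ (x - g.mean(axis=0))
             for g in groups]
    return int(np.argmin(dists))

# Two tiny synthetic groups
groups = [np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]]),
          np.array([[4., 4.], [5., 4.], [4., 5.], [5., 5.]])]
print(classify_nearest_mean(np.array([0.5, 0.5]), groups))  # 0
print(classify_nearest_mean(np.array([4.5, 4.5]), groups))  # 1
```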








…or equivalently, assign x to the
group for which

     L_j(x) = x̄_jᵀ S_pl⁻¹ x − ½ x̄_jᵀ S_pl⁻¹ x̄_j

 is a maximum.
 (Notice the linear form of the equation!)
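The equivalence of the two rules is easy to check numerically. A sketch with hypothetical group means and pooled covariance (the xᵀS_pl⁻¹x term is common to all groups, so it cancels when comparing scores):

```python
import numpy as np

# Hypothetical group means and pooled covariance (illustration only)
means = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
S_inv = np.linalg.inv(np.array([[1.0, 0.2], [0.2, 1.0]]))

def linear_score(x, xbar):
    # L_j(x) = xbar_j' S_pl^{-1} x - 1/2 * xbar_j' S_pl^{-1} xbar_j
    return xbar @ S_inv @ x - 0.5 * xbar @ S_inv @ xbar

def mahalanobis(x, xbar):
    d = x - xbar
    return d @ S_inv @ d

x = np.array([2.5, 2.0])
by_score = int(np.argmax([linear_score(x, m) for m in means]))
by_dist  = int(np.argmin([mahalanobis(x, m) for m in means]))
print(by_score, by_dist)  # the two rules pick the same group
```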




Linear Discriminant Analysis
…optimal if…
• Multivariate normal distribution for the
  observations in each of the groups
• Equal covariance matrices for all groups
• Equal prior probabilities for each group
• Equal costs of misclassification








Relaxing the assumption of equal prior
probabilities…

     L_j(x) = ln p_j + x̄_jᵀ S_pl⁻¹ x − ½ x̄_jᵀ S_pl⁻¹ x̄_j

 p_j  being the prior probability for the jth
      group.




Relaxing the assumption of equal
covariance matrices…

     Q_j(x) = ln p_j − ½ ln|S_j| − ½ (x − x̄_j)ᵀ S_j⁻¹ (x − x̄_j)

The result?…Quadratic Discriminant
Analysis
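A sketch of the quadratic score in NumPy (hypothetical parameters; `slogdet` is used for a numerically stable log-determinant):

```python
import numpy as np

def qda_score(x, prior, mean, cov):
    """Q_j(x) = ln p_j - 1/2 ln|S_j| - 1/2 (x - xbar_j)' S_j^{-1} (x - xbar_j)"""
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return np.log(prior) - 0.5 * logdet - 0.5 * diff @ np.linalg.inv(cov) @ diff

# Two hypothetical groups with *different* covariance matrices
params = [(0.5, np.array([0.0, 0.0]), np.eye(2)),
          (0.5, np.array([3.0, 3.0]), 4.0 * np.eye(2))]

x = np.array([2.0, 2.0])
scores = [qda_score(x, p, m, S) for p, m, S in params]
print(int(np.argmax(scores)))  # 1
```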








Quadratic Discriminant Analysis

    Rule: assign x to the group j for which Q_j(x)
    is the largest.

Optimal if
the J groups of measurements are
multivariate normal




 Other Extensions & Related Methods
  Relaxing the assumption of normality…
       Kernel density based LDA and QDA



  Other extensions…..
       Regularized discriminant analysis
       Penalized discriminant analysis
       Flexible discriminant analysis








Evaluations of the Methods

     Classification Table (confusion matrix)

                                Predicted group
Actual group   Number of        A        B
               observations
A              nA               n11      n12
B              nB               n21      n22




Evaluations of the Methods
Apparent Error Rate (APER):

               APER = (# misclassified) / (total # of cases)

….underestimates the actual error rate.

Improved estimate of the actual error rate:
the holdout method or cross-validation
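A sketch of the APER computation (the function name is my own; the worked number comes from the iris LDA results shown later, where 2 + 1 = 3 of 150 cases are misclassified):

```python
import numpy as np

def aper(actual, predicted):
    """Apparent error rate: fraction of cases the fitted rule misclassifies
    on the *same* data it was fit to -- hence the optimistic bias."""
    actual = np.asarray(actual)
    predicted = np.asarray(predicted)
    return float(np.mean(actual != predicted))

# LDA on the iris data: 3 misclassified out of 150
print(3 / 150)  # 0.02, i.e. APER = 0.0200

# A less biased estimate: fit the rule on a training subset and compute
# the error rate only on the held-out cases (holdout / cross-validation).
```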








Fisher's iris dataset
  •The data were collected by Anderson and used
  by Fisher to formulate linear discriminant
  analysis (LDA or DA).
  •The dataset gives the measurements in
  centimeters of the following variables:
  1- sepal length, 2- sepal width, 3- petal length,
  and 4- petal width,
  for 50 flowers from each of the 3 species of
  iris considered.
  •The species considered are Iris setosa,
  versicolor, and virginica.




 [Photos of the three species: setosa, versicolor, virginica]








An Example: Fisher’s Iris Data

Actual Group   Number of      Predicted Group
               Observations   Setosa   Versicolor   Virginica
Setosa         50             50       0            0
Versicolor     50             0        48           2
Virginica      50             0        1            49

Table 1: Linear Discriminant Analysis
                  (APER = 0.0200)




  An Example: Fisher’s Iris Data

Actual Group   Number of      Predicted Group
               Observations   Setosa   Versicolor   Virginica
Setosa         50             50       0            0
Versicolor     50             0        47           3
Virginica      50             0        1            49

Table 2: Quadratic Discriminant Analysis
                 (APER = 0.0267)








                   An Example: Fisher’s Iris Data

[Scatter plot of Petal Width vs. Sepal Width: s = setosa, c = versicolor,
v = virginica. The setosa points form a separate cluster; versicolor and
virginica overlap.]




              An Example: Fisher’s Iris Data

[The same scatter plot of Petal Width vs. Sepal Width, with a second
marker overlaid on each point: s/+, c/x, v/o.]








Summary
LDA is a powerful tool for
classification.
     Widely implemented in statistical
     software
     Theoretical properties are well
     researched





Discriminant Analysis-lecture 8
