Kernel Entropy Component Analysis
      in Remote Sensing Data Clustering

Luis Gómez-Chova (1)      Robert Jenssen (2)      Gustavo Camps-Valls (1)

 (1) Image Processing Laboratory (IPL), Universitat de València, Spain.
     luis.gomez-chova@uv.es , http://www.valencia.edu/chovago
 (2) Department of Physics and Technology, University of Tromsø, Norway.
     robert.jenssen@uit.no , http://www.phys.uit.no/~robertj


                   IGARSS 2011 – Vancouver, Canada

Intro            ECA           KECA              Clustering       Results            Conclusions

Outline




        1   Introduction


        2   Entropy Component Analysis


        3   Kernel Entropy Component Analysis (KECA)


        4   KECA Spectral Clustering


        5   Experimental Results


        6   Conclusions and Open questions




L. Gómez-Chova et al.       Kernel Entropy Component Analysis   IGARSS 2011 – Vancouver    1/26

Motivation

        Feature Extraction
             Feature selection/extraction essential before classification or regression
                  to discard redundant or noisy components
                  to reduce the dimensionality of the data
             Create a subset of new features by combinations of the existing ones

        Linear Feature Extraction
             Offer interpretability ~ knowledge discovery
                  PCA: projections maximizing the data set variance
                  PLS: projections maximally aligned with the labels
                  ICA: non-orthogonal projections with maximally independent axes
             Fail when data distributions are curved




            [Figure: nonlinear feature relations]





Objectives




        Objectives
             Kernel-based nonlinear data transformation
                     Captures higher-order statistics of the data
                     Extracts features suited for clustering


        Method
             Kernel Entropy Component Analysis (KECA)               [Jenssen, 2010]

             Based on Information Theory:
                     Maximally preserves entropy of the input data
                     Angular clustering maximizes cluster divergence
             Out-of-sample extension to deal with test data

        Experiments
             Cloud screening from ENVISAT/MERIS multispectral images




Information-Theoretic Learning




        Entropy Concept
            Entropy of a probability density function (pdf) is a measure of information




              Entropy ⇔ Shape of the pdf





Information-Theoretic Learning



        Divergence Concept
            The entropy concept can be extended to obtain a measure of dissimilarity
            between distributions

            Divergence ⇔ Distance between pdfs





Entropy Component Analysis


        Shannon entropy

                 H(p) = − ∫ p(x) log p(x) dx

             How to handle densities?             How to compute integrals?

        Rényi’s entropies

                 H(p) = (1/(1−α)) log ∫ p^α(x) dx

             Rényi’s entropies contain Shannon as a special case, α → 1
             We focus on Rényi’s quadratic entropy, α = 2

        Rényi’s quadratic entropy

                 H(p) = − log ∫ p²(x) dx = − log V(p)

             It can be estimated directly from samples!

L. Gómez-Chova et al.        Kernel Entropy Component Analysis            IGARSS 2011 – Vancouver    7/26
Intro            ECA               KECA              Clustering          Results              Conclusions

Entropy Component Analysis


        Rényi’s quadratic entropy estimator

             Estimated from data D = {x₁, ..., x_N} ⊂ ℝ^d generated by the pdf p(x)
             Parzen window estimator with a Gaussian or Radial Basis Function (RBF) kernel:

                 p̂(x) = (1/N) Σ_{x_t ∈ D} K(x, x_t | σ),   with   K(x, x_t) = exp(−‖x − x_t‖² / 2σ²)

             Idea: place a kernel over the samples and sum with proper normalization
             The estimator for the information potential V(p) = ∫ p²(x) dx is

                 V̂(p) = ∫ p̂²(x) dx = ∫ [(1/N) Σ_{x_t ∈ D} K(x, x_t | σ)] [(1/N) Σ_{x_t′ ∈ D} K(x, x_t′ | σ)] dx
                       = (1/N²) Σ_{x_t ∈ D} Σ_{x_t′ ∈ D} ∫ K(x, x_t | σ) K(x, x_t′ | σ) dx
                       = (1/N²) Σ_{x_t ∈ D} Σ_{x_t′ ∈ D} K(x_t, x_t′ | √2 σ) = (1/N²) 1ᵀK1

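The derivation above can be sketched numerically (an illustrative numpy snippet, not the authors' code; function names are made up here): the information potential is just the grand mean of the kernel matrix evaluated at width √2·σ.

```python
import numpy as np

def rbf_kernel(X, Y, sigma):
    # K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def renyi_quadratic_entropy(X, sigma):
    """Parzen-window estimate of Renyi's quadratic entropy H = -log V,
    where V = (1/N^2) 1' K 1 and the kernel width is sqrt(2)*sigma
    (convolving two Gaussians of width sigma doubles the variance)."""
    N = X.shape[0]
    K = rbf_kernel(X, X, np.sqrt(2.0) * sigma)
    V = K.sum() / N ** 2          # information potential V(p)
    return -np.log(V), V
```

With a fixed width, tightly clustered samples give a larger potential (hence lower entropy) than spread-out ones, matching the "entropy ⇔ shape of the pdf" intuition.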

Entropy Component Analysis

        Rényi’s quadratic entropy estimator
             The empirical Rényi entropy estimate resides in the corresponding kernel matrix:

                 V̂(p) = (1/N²) 1ᵀK1

             It can be expressed in terms of the eigenvalues and eigenvectors of K:

                 K = EDEᵀ,   with D the diagonal matrix of eigenvalues λ₁, ..., λ_N
                             and E the matrix with the eigenvectors e₁, ..., e_N

             Therefore we then have

                 V̂(p) = (1/N²) Σ_{i=1}^{N} (√λ_i eᵢᵀ1)²

             where each term √λ_i eᵢᵀ1 contributes to the entropy estimate

        ECA dimensionality reduction
             Idea: find the smallest set of features that maximally preserves the
             entropy of the input data (contributions √λ_i eᵢᵀ1)
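The selection rule can be sketched as follows (an illustrative numpy snippet under the definitions above, not the authors' implementation): eigendecompose K and rank the eigenpairs by their entropy contribution (√λ_i eᵢᵀ1)² rather than by λ_i alone.

```python
import numpy as np

def eca_select(K, m):
    """Rank the eigenpairs of the kernel matrix K by their contribution
    (sqrt(lambda_i) e_i' 1)^2 to the entropy estimate, and keep the top m."""
    lam, E = np.linalg.eigh(K)            # K = E diag(lam) E'
    lam = np.clip(lam, 0.0, None)         # guard small negative round-off
    contrib = lam * (E.T @ np.ones(K.shape[0])) ** 2
    order = np.argsort(contrib)[::-1][:m] # largest entropy contributions first
    return lam[order], E[:, order], contrib[order]
```

Sanity check on the identity above: summing all N contributions and dividing by N² recovers V̂(p) = (1/N²) 1ᵀK1.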

Entropy Component Analysis



            [Figure: example distributions with entropies H(p) = 4.36, H(p) = 4.74,
            H(p) = 5.05 (top row); H(p) = 4.71, and H(p) = 4.81 with estimate
            Ĥ(p) = 4.44 (bottom row)]





Kernel Principal Component Analysis (KPCA)

        Principal Component Analysis (PCA)

             Find projections of X = [x₁, ..., x_N]   maximizing the variance of the projected data XU

                   PCA:         maximize:      Trace{(XU)ᵀ(XU)} = Trace{UᵀC_xx U}
                                subject to:    UᵀU = I

             Including Lagrange multipliers λ, this is equivalent to the eigenproblem

                   C_xx u_i = λ_i u_i  →  C_xx U = UD

             u_i are the eigenvectors of C_xx and they are orthonormal, u_iᵀu_j = δ_ij




                                        [Figure: PCA]




Kernel Principal Component Analysis (KPCA)


        Kernel Principal Component Analysis (KPCA)

             Find projections maximizing the variance of the mapped data Φ = [φ(x₁), ..., φ(x_N)]ᵀ

                    KPCA:         maximize:       Tr{(ΦU)ᵀ(ΦU)} = Tr{UᵀΦᵀΦU}
                                  subject to:     UᵀU = I

             The covariance matrix ΦᵀΦ and the projection matrix U are d_H × d_H !!!

        KPCA through the kernel trick

             Apply the representer theorem: U = ΦᵀA, where A = [α₁, ..., α_N]

                    KPCA:        maximize:       Tr{AᵀΦΦᵀΦΦᵀA} = Tr{AᵀKKA}
                                 subject to:     UᵀU = AᵀΦΦᵀA = AᵀKA = I

             Including Lagrange multipliers λ, this is equivalent to the eigenproblem

                    KKα_i = λ_i Kα_i  →  Kα_i = λ_i α_i

             Now the matrix A is N × N !!! (eigendecomposition of K = EDEᵀ)

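The kernel-trick route can be sketched in a few lines of numpy (illustrative only; the slides do not discuss kernel centering, so this snippet simply assumes K is already in the form to be decomposed): eigendecompose the N × N kernel matrix and take the projected data as E_m D_m^{1/2}.

```python
import numpy as np

def kpca_projections(K, m):
    """KPCA through the kernel trick: eigendecompose the N x N kernel
    matrix and return the projected data E_m D_m^{1/2}, i.e. the top-m
    principal components of the data in feature space."""
    lam, E = np.linalg.eigh(K)
    order = np.argsort(lam)[::-1][:m]     # top eigenvalues first
    lam, E = np.clip(lam[order], 0.0, None), E[:, order]
    return E * np.sqrt(lam)               # N x m matrix of projections
```

With m = N and a positive semi-definite K, the projections P satisfy P Pᵀ = K, i.e. they reproduce all inner products in feature space.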

Kernel ECA Transformation



        Kernel Entropy Component Analysis (KECA)
            KECA: projection of Φ onto those m feature-space principal axes
            contributing most to the Rényi entropy estimate of the input data

                 Φ_eca = ΦU_m = E_m D_m^{1/2}

                  The projection onto a single principal axis u_i in H is given by Φu_i = √λ_i e_i
                  The entropy associated with Φ_eca is V̂_m = (1/N²) 1ᵀK_eca 1 = (1/N²) Σ_{i=1}^{m} (√λ_i e_iᵀ1)²

            Note that Φ_eca is not necessarily based on the top eigenvalues λ_i, since
            e_iᵀ1 also contributes to the entropy estimate

        Out-of-sample extension
            Projections for a collection of test data points:

                 Φ_eca,test = Φ_test U_m = Φ_test Φᵀ E_m D_m^{−1/2} = K_test E_m D_m^{−1/2}

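Both the transformation and its out-of-sample extension follow directly from the formulas above; a minimal sketch (illustrative numpy code, not the released KECA toolbox):

```python
import numpy as np

def keca_fit(K, m):
    """KECA: keep the m eigenpairs of K with the largest entropy
    contributions lambda_i (e_i' 1)^2 and project Phi_eca = E_m D_m^{1/2}."""
    lam, E = np.linalg.eigh(K)
    lam = np.clip(lam, 0.0, None)
    contrib = lam * (E.T @ np.ones(K.shape[0])) ** 2
    order = np.argsort(contrib)[::-1][:m]
    lam_m, E_m = lam[order], E[:, order]
    return E_m * np.sqrt(lam_m), lam_m, E_m    # Phi_eca, D_m (diag), E_m

def keca_out_of_sample(K_test, lam_m, E_m):
    """Out-of-sample extension: Phi_eca_test = K_test E_m D_m^{-1/2},
    where K_test holds kernel values between test and training samples."""
    return K_test @ E_m / np.sqrt(lam_m)
```

A useful consistency check: feeding the training kernel itself through the out-of-sample formula recovers the training projections, since K E_m = E_m D_m.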

Kernel ECA Transformation




        KECA example

            [Figure: toy data set — Original, PCA, KPCA, KECA]




            KECA reveals cluster structure → underlying labels of the data
            Nonlinearly related clusters in X → different angular directions in H
            An angular clustering based on the kernel features Φeca seems reasonable





KECA Spectral Clustering

        Cauchy-Schwarz divergence
            The Cauchy-Schwarz divergence between the pdfs of two clusters is

                D_CS(p_i, p_j) = − log(V_CS(p_i, p_j)) = − log [ ∫ p_i(x) p_j(x) dx / √( ∫ p_i²(x) dx ∫ p_j²(x) dx ) ]

            Measuring dissimilarity in a probability space is a complex issue
            Entropy interpretation in the kernel space → mean vector μ = (1/N) Σ_t φ(x_t):

                V̂(p) = ∫ p̂²(x) dx = (1/N²) 1ᵀK1 = (1/N²) 1ᵀΦΦᵀ1 = μᵀμ = ‖μ‖²

            Divergence via Parzen windowing ⇒ V̂_CS(p_i, p_j) = μ_iᵀμ_j / (‖μ_i‖ ‖μ_j‖) = cos ∠(μ_i, μ_j)

        KECA Spectral Clustering
            Angular clustering of Φ_eca maximizes the CS divergence between clusters:

                J(C₁, ..., C_k) = Σ_{i=1}^{k} Σ_{x_t ∈ C_i} cos ∠(φ_eca(x_t), μ_i)
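Because μ_iᵀμ_j is just the mean of the cross-kernel block between the two clusters, the estimated CS divergence can be computed from kernel sub-matrices alone. A minimal sketch (illustrative code, with hypothetical function names):

```python
import numpy as np

def cs_divergence(K_ii, K_jj, K_ij):
    """Estimated Cauchy-Schwarz divergence between two clusters from
    kernel blocks: V_cs = mu_i' mu_j / (|mu_i| |mu_j|) = cos(angle),
    since the mean of K_ij equals mu_i' mu_j and the mean of K_ii is |mu_i|^2."""
    v = K_ij.mean() / np.sqrt(K_ii.mean() * K_jj.mean())
    return -np.log(v), v
```

Comparing a cluster with itself gives cos ∠ = 1 and zero divergence; well-separated clusters give a cosine close to 0 and a large divergence.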

KECA Spectral Clustering


        KECA Spectral Clustering Algorithm

         1   Obtain Φ_eca by Kernel ECA
         2   Initialize the means μ_i, i = 1, . . . , k
         3   Assign each training sample to a cluster:
             x_t → C_i maximizing cos ∠(φ_eca(x_t), μ_i)
         4   Update the mean vectors μ_i
         5   Repeat steps 3 and 4 until convergence

        Intuition
        A kernel feature space data point φ_eca(x_t) is assigned to the cluster represented
        by the closest mean vector μ_i in terms of angular distance

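The five steps above amount to an angular k-means on the KECA features. A minimal sketch (illustrative numpy code, not the authors' implementation; initializing the means from randomly chosen samples is an assumption of this snippet):

```python
import numpy as np

def keca_angular_clustering(Phi, k, n_iter=100, seed=0):
    """Angular k-means on KECA features: assign each sample to the mean
    vector with the largest cosine (step 3), then recompute the mean
    directions (step 4), until the assignments stop changing (step 5)."""
    rng = np.random.RandomState(seed)
    Phi_n = Phi / np.linalg.norm(Phi, axis=1, keepdims=True)   # unit vectors
    mu = Phi_n[rng.choice(len(Phi_n), size=k, replace=False)]  # init from samples
    labels = np.full(len(Phi_n), -1)
    for _ in range(n_iter):
        new = (Phi_n @ mu.T).argmax(axis=1)   # max cosine = min angular distance
        if np.array_equal(new, labels):
            break                             # converged
        labels = new
        for j in range(k):                    # update mean directions
            if np.any(labels == j):
                m = Phi_n[labels == j].mean(axis=0)
                mu[j] = m / np.linalg.norm(m)
    return labels
```

Note that only the angle of each feature vector matters: scaling a sample's feature vector does not change its assignment.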

Experimental results: Data material



        Cloud masking from ENVISAT/MERIS multispectral images
            Pixel-wise binary decisions about the presence/absence of clouds
            MERIS images taken over Spain and France
            Input samples with 13 spectral bands and 6 physically inspired features




            [Figure: Barrax (BR-2003-07-14), Barrax (BR-2004-07-14), France (FR-2005-03-19)]




Experimental results: Numerical comparison

        Experimental setup
             KECA compared with k-means, KPCA + k-means, and Kernel k-means
             Number of clusters fixed to k = 2 (cloud-free and cloudy areas)
             Number of KPCA and KECA features fixed to m = 2 (to stress differences)
             RBF-kernel width parameter selected by grid search for all methods

        Numerical results
             Validation on 10,000 manually labeled pixels per image
             Kappa statistic results over 10 realizations for all images
             [Figure: Estimated κ statistic vs. #Samples (200–1000) for BR-2003-07-14,
             BR-2004-07-14, and FR-2005-03-19; curves for KECA, KPCA, Kernel k-means,
             and k-means]

Experimental results: Numerical comparison



        Average numerical results

             [Figure: Average estimated κ statistic vs. #Samples (200–1000) for KECA,
             KPCA, Kernel k-means, and k-means]


             KECA outperforms k-means (+25%) and both Kernel k-means and KPCA (+15%)
             In general, the number of training samples positively affects the results



Experimental results: Classification maps

        Test Site              |       k-means        |    Kernel k-means    |         KPCA         |         KECA
        Spain (BR-2003-07-14)  | OA=96.25% ; κ=0.6112 | OA=96.22% ; κ=0.7540 | OA=47.52% ; κ=0.0966 | OA=99.41% ; κ=0.9541
        Spain (BR-2004-07-14)  | OA=96.91% ; κ=0.6018 | OA=62.03% ; κ=0.0767 | OA=96.66% ; κ=0.6493 | OA=97.54% ; κ=0.7319
        France (FR-2005-03-19) | OA=92.87% ; κ=0.6142 | OA=92.64% ; κ=0.6231 | OA=80.93% ; κ=0.4051 | OA=92.91% ; κ=0.6302

Conclusions and open questions




        Conclusions
            Kernel entropy component analysis for clustering remote sensing data
                 Nonlinear features preserving the entropy of the input data
                 Angular clustering reveals structure in terms of cluster divergence
            Out-of-sample extension for test data → mandatory in remote sensing
            Good results on cloud screening from MERIS images
            KECA code is available at http://www.phys.uit.no/~robertj/
            Simple feature extraction toolbox (SIMFEAT) soon at http://isp.uv.es

        Open questions and Future work
            Pre-images of transformed data in the input space
            Learn kernel parameters automatically
            Test KECA in more remote sensing applications



Nonlinear component analysis as a kernel eigenvalue problemMichele Filannino
 
Principal component analysis and matrix factorizations for learning (part 2) ...
Principal component analysis and matrix factorizations for learning (part 2) ...Principal component analysis and matrix factorizations for learning (part 2) ...
Principal component analysis and matrix factorizations for learning (part 2) ...zukun
 
Different kind of distance and Statistical Distance
Different kind of distance and Statistical DistanceDifferent kind of distance and Statistical Distance
Different kind of distance and Statistical DistanceKhulna University
 
KPCA_Survey_Report
KPCA_Survey_ReportKPCA_Survey_Report
KPCA_Survey_ReportRandy Salm
 
Principal Component Analysis For Novelty Detection
Principal Component Analysis For Novelty DetectionPrincipal Component Analysis For Novelty Detection
Principal Component Analysis For Novelty DetectionJordan McBain
 
Analyzing Kernel Security and Approaches for Improving it
Analyzing Kernel Security and Approaches for Improving itAnalyzing Kernel Security and Approaches for Improving it
Analyzing Kernel Security and Approaches for Improving itMilan Rajpara
 
Adaptive anomaly detection with kernel eigenspace splitting and merging
Adaptive anomaly detection with kernel eigenspace splitting and mergingAdaptive anomaly detection with kernel eigenspace splitting and merging
Adaptive anomaly detection with kernel eigenspace splitting and mergingieeepondy
 
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...hanshang
 
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdfExplicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdfgrssieee
 
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...Sahidul Islam
 
Regularized Principal Component Analysis for Spatial Data
Regularized Principal Component Analysis for Spatial DataRegularized Principal Component Analysis for Spatial Data
Regularized Principal Component Analysis for Spatial DataWen-Ting Wang
 
Pca and kpca of ecg signal
Pca and kpca of ecg signalPca and kpca of ecg signal
Pca and kpca of ecg signales712
 
DataEngConf: Feature Extraction: Modern Questions and Challenges at Google
DataEngConf: Feature Extraction: Modern Questions and Challenges at GoogleDataEngConf: Feature Extraction: Modern Questions and Challenges at Google
DataEngConf: Feature Extraction: Modern Questions and Challenges at GoogleHakka Labs
 
Probabilistic PCA, EM, and more
Probabilistic PCA, EM, and moreProbabilistic PCA, EM, and more
Probabilistic PCA, EM, and morehsharmasshare
 
Principal component analysis and matrix factorizations for learning (part 1) ...
Principal component analysis and matrix factorizations for learning (part 1) ...Principal component analysis and matrix factorizations for learning (part 1) ...
Principal component analysis and matrix factorizations for learning (part 1) ...zukun
 
Principal Component Analysis and Clustering
Principal Component Analysis and ClusteringPrincipal Component Analysis and Clustering
Principal Component Analysis and ClusteringUsha Vijay
 
ECG: Indication and Interpretation
ECG: Indication and InterpretationECG: Indication and Interpretation
ECG: Indication and InterpretationRakesh Verma
 
Introduction to Statistical Machine Learning
Introduction to Statistical Machine LearningIntroduction to Statistical Machine Learning
Introduction to Statistical Machine Learningmahutte
 

Andere mochten auch (20)

fauvel_igarss.pdf
fauvel_igarss.pdffauvel_igarss.pdf
fauvel_igarss.pdf
 
Nonlinear component analysis as a kernel eigenvalue problem
Nonlinear component analysis as a kernel eigenvalue problemNonlinear component analysis as a kernel eigenvalue problem
Nonlinear component analysis as a kernel eigenvalue problem
 
Principal component analysis and matrix factorizations for learning (part 2) ...
Principal component analysis and matrix factorizations for learning (part 2) ...Principal component analysis and matrix factorizations for learning (part 2) ...
Principal component analysis and matrix factorizations for learning (part 2) ...
 
Different kind of distance and Statistical Distance
Different kind of distance and Statistical DistanceDifferent kind of distance and Statistical Distance
Different kind of distance and Statistical Distance
 
KPCA_Survey_Report
KPCA_Survey_ReportKPCA_Survey_Report
KPCA_Survey_Report
 
Principal Component Analysis For Novelty Detection
Principal Component Analysis For Novelty DetectionPrincipal Component Analysis For Novelty Detection
Principal Component Analysis For Novelty Detection
 
Analyzing Kernel Security and Approaches for Improving it
Analyzing Kernel Security and Approaches for Improving itAnalyzing Kernel Security and Approaches for Improving it
Analyzing Kernel Security and Approaches for Improving it
 
Adaptive anomaly detection with kernel eigenspace splitting and merging
Adaptive anomaly detection with kernel eigenspace splitting and mergingAdaptive anomaly detection with kernel eigenspace splitting and merging
Adaptive anomaly detection with kernel eigenspace splitting and merging
 
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
 
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdfExplicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
 
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
 
Regularized Principal Component Analysis for Spatial Data
Regularized Principal Component Analysis for Spatial DataRegularized Principal Component Analysis for Spatial Data
Regularized Principal Component Analysis for Spatial Data
 
Pca and kpca of ecg signal
Pca and kpca of ecg signalPca and kpca of ecg signal
Pca and kpca of ecg signal
 
DataEngConf: Feature Extraction: Modern Questions and Challenges at Google
DataEngConf: Feature Extraction: Modern Questions and Challenges at GoogleDataEngConf: Feature Extraction: Modern Questions and Challenges at Google
DataEngConf: Feature Extraction: Modern Questions and Challenges at Google
 
Probabilistic PCA, EM, and more
Probabilistic PCA, EM, and moreProbabilistic PCA, EM, and more
Probabilistic PCA, EM, and more
 
Principal component analysis and matrix factorizations for learning (part 1) ...
Principal component analysis and matrix factorizations for learning (part 1) ...Principal component analysis and matrix factorizations for learning (part 1) ...
Principal component analysis and matrix factorizations for learning (part 1) ...
 
Principal Component Analysis and Clustering
Principal Component Analysis and ClusteringPrincipal Component Analysis and Clustering
Principal Component Analysis and Clustering
 
ECG: Indication and Interpretation
ECG: Indication and InterpretationECG: Indication and Interpretation
ECG: Indication and Interpretation
 
Introduction to Statistical Machine Learning
Introduction to Statistical Machine LearningIntroduction to Statistical Machine Learning
Introduction to Statistical Machine Learning
 
Principal component analysis
Principal component analysisPrincipal component analysis
Principal component analysis
 

Ähnlich wie Kernel Entropy Component Analysis in Remote Sensing Data Clustering.pdf

Csr2011 june14 14_00_agrawal
Csr2011 june14 14_00_agrawalCsr2011 june14 14_00_agrawal
Csr2011 june14 14_00_agrawalCSR2011
 
Convolutional networks and graph networks through kernels
Convolutional networks and graph networks through kernelsConvolutional networks and graph networks through kernels
Convolutional networks and graph networks through kernelstuxette
 
Integration with kernel methods, Transported meshfree methods
Integration with kernel methods, Transported meshfree methodsIntegration with kernel methods, Transported meshfree methods
Integration with kernel methods, Transported meshfree methodsMercier Jean-Marc
 
Statistical Analysis of Neural Coding
Statistical Analysis of Neural CodingStatistical Analysis of Neural Coding
Statistical Analysis of Neural CodingYifei Shea, Ph.D.
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdfgrssieee
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdfgrssieee
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdfgrssieee
 
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...grssieee
 
Topological Inference via Meshing
Topological Inference via MeshingTopological Inference via Meshing
Topological Inference via MeshingDon Sheehy
 
Neural Processes Family
Neural Processes FamilyNeural Processes Family
Neural Processes FamilyKota Matsui
 
Non-parametric regressions & Neural Networks
Non-parametric regressions & Neural NetworksNon-parametric regressions & Neural Networks
Non-parametric regressions & Neural NetworksGiuseppe Broccolo
 
A formal ontology of sequences
A formal ontology of sequencesA formal ontology of sequences
A formal ontology of sequencesRobert Hoehndorf
 
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdfKernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdfgrssieee
 
IROS 2011 talk 2 (Filippo's file)
IROS 2011 talk 2 (Filippo's file)IROS 2011 talk 2 (Filippo's file)
IROS 2011 talk 2 (Filippo's file)Gianluca Antonelli
 
JAISTサマースクール2016「脳を知るための理論」講義04 Neural Networks and Neuroscience
JAISTサマースクール2016「脳を知るための理論」講義04 Neural Networks and Neuroscience JAISTサマースクール2016「脳を知るための理論」講義04 Neural Networks and Neuroscience
JAISTサマースクール2016「脳を知るための理論」講義04 Neural Networks and Neuroscience hirokazutanaka
 
NIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningNIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningzukun
 
Basics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programmingBasics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programmingSSA KPI
 

Ähnlich wie Kernel Entropy Component Analysis in Remote Sensing Data Clustering.pdf (20)

Approximate Tree Kernels
Approximate Tree KernelsApproximate Tree Kernels
Approximate Tree Kernels
 
Csr2011 june14 14_00_agrawal
Csr2011 june14 14_00_agrawalCsr2011 june14 14_00_agrawal
Csr2011 june14 14_00_agrawal
 
Pres metabief2020jmm
Pres metabief2020jmmPres metabief2020jmm
Pres metabief2020jmm
 
Convolutional networks and graph networks through kernels
Convolutional networks and graph networks through kernelsConvolutional networks and graph networks through kernels
Convolutional networks and graph networks through kernels
 
Integration with kernel methods, Transported meshfree methods
Integration with kernel methods, Transported meshfree methodsIntegration with kernel methods, Transported meshfree methods
Integration with kernel methods, Transported meshfree methods
 
Statistical Analysis of Neural Coding
Statistical Analysis of Neural CodingStatistical Analysis of Neural Coding
Statistical Analysis of Neural Coding
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdf
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdf
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdf
 
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
 
Astaño 4
Astaño 4Astaño 4
Astaño 4
 
Topological Inference via Meshing
Topological Inference via MeshingTopological Inference via Meshing
Topological Inference via Meshing
 
Neural Processes Family
Neural Processes FamilyNeural Processes Family
Neural Processes Family
 
Non-parametric regressions & Neural Networks
Non-parametric regressions & Neural NetworksNon-parametric regressions & Neural Networks
Non-parametric regressions & Neural Networks
 
A formal ontology of sequences
A formal ontology of sequencesA formal ontology of sequences
A formal ontology of sequences
 
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdfKernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
 
IROS 2011 talk 2 (Filippo's file)
IROS 2011 talk 2 (Filippo's file)IROS 2011 talk 2 (Filippo's file)
IROS 2011 talk 2 (Filippo's file)
 
JAISTサマースクール2016「脳を知るための理論」講義04 Neural Networks and Neuroscience
JAISTサマースクール2016「脳を知るための理論」講義04 Neural Networks and Neuroscience JAISTサマースクール2016「脳を知るための理論」講義04 Neural Networks and Neuroscience
JAISTサマースクール2016「脳を知るための理論」講義04 Neural Networks and Neuroscience
 
NIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningNIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learning
 
Basics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programmingBasics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programming
 

Mehr von grssieee

Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...grssieee
 
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODELSEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODELgrssieee
 
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...grssieee
 
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIESTHE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIESgrssieee
 
GMES SPACE COMPONENT:PROGRAMMATIC STATUS
GMES SPACE COMPONENT:PROGRAMMATIC STATUSGMES SPACE COMPONENT:PROGRAMMATIC STATUS
GMES SPACE COMPONENT:PROGRAMMATIC STATUSgrssieee
 
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETERPROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETERgrssieee
 
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...grssieee
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...grssieee
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...grssieee
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...grssieee
 
test 34mb wo animations
test  34mb wo animationstest  34mb wo animations
test 34mb wo animationsgrssieee
 
2011_Fox_Tax_Worksheets.pdf
2011_Fox_Tax_Worksheets.pdf2011_Fox_Tax_Worksheets.pdf
2011_Fox_Tax_Worksheets.pdfgrssieee
 
DLR open house
DLR open houseDLR open house
DLR open housegrssieee
 
DLR open house
DLR open houseDLR open house
DLR open housegrssieee
 
DLR open house
DLR open houseDLR open house
DLR open housegrssieee
 
Tana_IGARSS2011.ppt
Tana_IGARSS2011.pptTana_IGARSS2011.ppt
Tana_IGARSS2011.pptgrssieee
 
Solaro_IGARSS_2011.ppt
Solaro_IGARSS_2011.pptSolaro_IGARSS_2011.ppt
Solaro_IGARSS_2011.pptgrssieee
 

Mehr von grssieee (20)

Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
 
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODELSEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
 
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
 
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIESTHE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
 
GMES SPACE COMPONENT:PROGRAMMATIC STATUS
GMES SPACE COMPONENT:PROGRAMMATIC STATUSGMES SPACE COMPONENT:PROGRAMMATIC STATUS
GMES SPACE COMPONENT:PROGRAMMATIC STATUS
 
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETERPROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
 
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
 
Test
TestTest
Test
 
test 34mb wo animations
test  34mb wo animationstest  34mb wo animations
test 34mb wo animations
 
Test 70MB
Test 70MBTest 70MB
Test 70MB
 
Test 70MB
Test 70MBTest 70MB
Test 70MB
 
2011_Fox_Tax_Worksheets.pdf
2011_Fox_Tax_Worksheets.pdf2011_Fox_Tax_Worksheets.pdf
2011_Fox_Tax_Worksheets.pdf
 
DLR open house
DLR open houseDLR open house
DLR open house
 
DLR open house
DLR open houseDLR open house
DLR open house
 
DLR open house
DLR open houseDLR open house
DLR open house
 
Tana_IGARSS2011.ppt
Tana_IGARSS2011.pptTana_IGARSS2011.ppt
Tana_IGARSS2011.ppt
 
Solaro_IGARSS_2011.ppt
Solaro_IGARSS_2011.pptSolaro_IGARSS_2011.ppt
Solaro_IGARSS_2011.ppt
 

Kürzlich hochgeladen

Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native ApplicationsWSO2
 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyKhushali Kathiriya
 
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...Jeffrey Haguewood
 
Apidays Singapore 2024 - Modernizing Securities Finance by Madhu Subbu
Apidays Singapore 2024 - Modernizing Securities Finance by Madhu SubbuApidays Singapore 2024 - Modernizing Securities Finance by Madhu Subbu
Apidays Singapore 2024 - Modernizing Securities Finance by Madhu Subbuapidays
 
AXA XL - Insurer Innovation Award Americas 2024
AXA XL - Insurer Innovation Award Americas 2024AXA XL - Insurer Innovation Award Americas 2024
AXA XL - Insurer Innovation Award Americas 2024The Digital Insurer
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc
 
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProduct Anonymous
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherRemote DBA Services
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?Igalia
 
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...apidays
 
Emergent Methods: Multi-lingual narrative tracking in the news - real-time ex...
Emergent Methods: Multi-lingual narrative tracking in the news - real-time ex...Emergent Methods: Multi-lingual narrative tracking in the news - real-time ex...
Emergent Methods: Multi-lingual narrative tracking in the news - real-time ex...Zilliz
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...Martijn de Jong
 
A Beginners Guide to Building a RAG App Using Open Source Milvus
A Beginners Guide to Building a RAG App Using Open Source MilvusA Beginners Guide to Building a RAG App Using Open Source Milvus
A Beginners Guide to Building a RAG App Using Open Source MilvusZilliz
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MIND CTI
 
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingRepurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingEdi Saputra
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoffsammart93
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAndrey Devyatkin
 

Kürzlich hochgeladen (20)

Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native Applications
 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : Uncertainty
 
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
 
Apidays Singapore 2024 - Modernizing Securities Finance by Madhu Subbu
Apidays Singapore 2024 - Modernizing Securities Finance by Madhu SubbuApidays Singapore 2024 - Modernizing Securities Finance by Madhu Subbu
Apidays Singapore 2024 - Modernizing Securities Finance by Madhu Subbu
 
  • 4. Objectives

    Objectives
      A kernel-based nonlinear data transformation that:
        captures the higher-order statistics of the data
        extracts features suited for clustering

    Method
      Kernel Entropy Component Analysis (KECA) [Jenssen, 2010]
      Based on Information Theory: maximally preserves the entropy of the input data
      Angular clustering maximizes the divergence between clusters
      Out-of-sample extension to deal with test data

    Experiments
      Cloud screening from ENVISAT/MERIS multispectral images
  • 5. Outline: section 2, Entropy Component Analysis
  • 6. Information-Theoretic Learning

    Entropy Concept
      The entropy of a probability density function (pdf) is a measure of information.
      Entropy ⇔ shape of the pdf.
  • 7. Information-Theoretic Learning

    Divergence Concept
      The entropy concept can be extended to obtain a measure of dissimilarity
      between distributions: divergence ⇔ distance between pdfs.
  • 8. Entropy Component Analysis

    Shannon entropy
      H(p) = −∫ p(x) log p(x) dx
      How to handle densities? How to compute integrals?

    Rényi entropies
      H_α(p) = 1/(1−α) log ∫ p^α(x) dx
      The Rényi entropies contain Shannon entropy as the special case α → 1.
      We focus on Rényi's quadratic entropy, α = 2.

    Rényi's quadratic entropy
      H(p) = −log ∫ p²(x) dx = −log V(p)
      It can be estimated directly from samples!
  • 9. Entropy Component Analysis

    Rényi's quadratic entropy estimator
      Estimated from data D = {x₁, …, x_N} ∈ R^d generated by the pdf p(x).
      Parzen window estimator with a Gaussian radial basis function (RBF) kernel:
        p̂(x) = (1/N) Σ_{x_t∈D} K(x, x_t | σ),  with  K(x, x_t) = exp(−‖x − x_t‖² / 2σ²)
      Idea: place a kernel over the samples and sum with proper normalization.
      The estimator of the information potential V(p) = ∫ p²(x) dx:
        V̂(p) = ∫ p̂²(x) dx = (1/N²) Σ_{x_t∈D} Σ_{x_t'∈D} ∫ K(x, x_t | σ) K(x, x_t' | σ) dx
              = (1/N²) Σ_{x_t∈D} Σ_{x_t'∈D} K(x_t, x_t' | √2σ) = (1/N²) 1ᵀ K 1
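    The kernel-matrix form of the estimator is easy to check numerically. A minimal
    NumPy sketch (the function name and interface are ours, not from the KECA
    toolbox), assuming the width convention K(x, x_t) = exp(−‖x − x_t‖²/2σ²):

```python
import numpy as np

def renyi_quadratic_entropy(X, sigma):
    """Parzen-window estimate of Renyi's quadratic entropy H_2(p) = -log V(p).

    V(p) is estimated as (1/N^2) 1' K 1, where K is an RBF kernel matrix of
    width sqrt(2)*sigma (the convolution of two Gaussians of width sigma).
    """
    N = X.shape[0]
    # squared Euclidean distances between all sample pairs
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-d2 / (4.0 * sigma**2))   # width sqrt(2)*sigma => 2*(sqrt(2)*sigma)^2
    V = K.sum() / N**2                   # information potential V(p)
    return -np.log(V)
```

    Note the matrix uses width √2·σ because ∫ K_σ(x, x_t) K_σ(x, x_t') dx is itself
    a Gaussian of width √2·σ, exactly as in the derivation above.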
  • 10. Entropy Component Analysis

    Rényi's quadratic entropy estimator
      The empirical Rényi entropy estimate resides in the corresponding kernel matrix:
        V̂(p) = (1/N²) 1ᵀ K 1
      It can be expressed in terms of the eigenvalues and eigenvectors of K:
        K = E D Eᵀ,  with D the diagonal matrix of eigenvalues λ₁, …, λ_N
        and E the matrix with the eigenvectors e₁, …, e_N as columns.
      Therefore we have
        V̂(p) = (1/N²) Σ_{i=1}^N (√λ_i e_iᵀ 1)²
      where each term √λ_i e_iᵀ 1 contributes to the entropy estimate.

    ECA dimensionality reduction
      Idea: find the smallest set of features that maximally preserves the entropy
      of the input data (the largest contributions √λ_i e_iᵀ 1).
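    The per-axis decomposition can be sketched as follows (illustrative NumPy, our
    own naming; the point is that axes are ranked by entropy contribution, not by
    eigenvalue):

```python
import numpy as np

def eca_entropy_terms(K):
    """Decompose V(p) = (1/N^2) 1'K1 into per-axis entropy contributions.

    Using K = E D E', V(p) = (1/N^2) sum_i (sqrt(lambda_i) e_i' 1)^2.
    Returns the contributions sorted in decreasing order plus the axis order.
    """
    N = K.shape[0]
    lam, E = np.linalg.eigh(K)          # eigendecomposition of symmetric K
    lam = np.clip(lam, 0.0, None)       # guard tiny negative numerical values
    terms = (np.sqrt(lam) * (E.T @ np.ones(N)))**2 / N**2
    order = np.argsort(terms)[::-1]     # entropy-preserving ranking, not variance
    return terms[order], order
```

    Summing all terms recovers (1/N²) 1ᵀK1 exactly, which is a convenient sanity check.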
  • 11. Entropy Component Analysis

    [Figure: example data distributions with their entropies,
     H(p) = 4.36, 4.74, 5.05, 4.71, and Parzen estimates Ĥ(p) = 4.81, 4.44]
  • 12. Outline: section 3, Kernel Entropy Component Analysis (KECA)
  • 13. Kernel Principal Component Analysis (KPCA)

    Principal Component Analysis (PCA)
      Find projections of X = [x₁, …, x_N] maximizing the variance of the projected data XU:
        maximize: Tr{(XU)ᵀ(XU)} = Tr{Uᵀ C_xx U}
        subject to: UᵀU = I
      Including Lagrange multipliers λ, this is equivalent to the eigenproblem
        C_xx u_i = λ_i u_i  →  C_xx U = U D
      The u_i are the eigenvectors of C_xx and they are orthonormal: u_iᵀ u_j = δ_ij.
  • 14. Kernel Principal Component Analysis (KPCA)

    Kernel Principal Component Analysis (KPCA)
      Find projections maximizing the variance of the mapped data Φ = [φ(x₁), …, φ(x_N)]:
        maximize: Tr{(ΦU)ᵀ(ΦU)} = Tr{Uᵀ ΦᵀΦ U}
        subject to: UᵀU = I
      The covariance matrix ΦᵀΦ and the projection matrix U are d_H × d_H !!!

    KPCA through the kernel trick
      Apply the representer theorem: U = ΦᵀA, where A = [α₁, …, α_N]:
        maximize: Tr{Aᵀ ΦΦᵀΦΦᵀ A} = Tr{Aᵀ K K A}
        subject to: UᵀU = Aᵀ ΦΦᵀ A = Aᵀ K A = I
      Including Lagrange multipliers λ, this is equivalent to the eigenproblem
        K K α_i = λ_i K α_i  →  K α_i = λ_i α_i
      Now the matrix A is N × N !!! (eigendecomposition of K = E D Eᵀ)
  • 15. Kernel ECA Transformation

    Kernel Entropy Component Analysis (KECA)
      KECA: projection of Φ onto the m feature-space principal axes contributing
      most to the Rényi entropy estimate of the input data:
        Φ_eca = Φ U_m = E_m D_m^{1/2}
      The projection onto a single principal axis u_i in H is given by u_iᵀΦᵀ = √λ_i e_iᵀ.
      The entropy associated with Φ_eca is
        V̂_m = (1/N²) 1ᵀ K_eca 1 = (1/N²) Σ_{i=1}^m (√λ_i e_iᵀ 1)²
      Note that Φ_eca is not necessarily based on the top eigenvalues λ_i,
      since e_iᵀ 1 also contributes to the entropy estimate.

    Out-of-sample extension
      Projections for a collection of test data points:
        Φ_eca,test = Φ_test U_m = Φ_test Φᵀ E_m D_m^{−1/2} = K_test E_m D_m^{−1/2}
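    A hedged sketch of the KECA transform and its out-of-sample extension (NumPy;
    the function names are ours, and the kernel is left uncentered, as in the entropy
    derivation above):

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def keca_fit_transform(X, m, sigma):
    """KECA: project onto the m kernel principal axes that contribute most
    to the Renyi entropy estimate (not necessarily the top-variance axes)."""
    N = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    lam, E = np.linalg.eigh(K)
    lam = np.clip(lam, 0.0, None)
    contrib = (np.sqrt(lam) * (E.T @ np.ones(N)))**2
    idx = np.argsort(contrib)[::-1][:m]            # entropy-ranked axes
    Em = E[:, idx]
    Phi_eca = Em * np.sqrt(lam[idx])               # E_m D_m^{1/2}: training projections

    def transform(Xt):
        # out-of-sample map: Phi_test = K_test E_m D_m^{-1/2}
        return rbf_kernel(Xt, X, sigma) @ Em / np.sqrt(lam[idx])

    return Phi_eca, transform
```

    Applying `transform` to the training data reproduces Φ_eca, since
    K E_m D_m^{−1/2} = E_m D_m D_m^{−1/2} = E_m D_m^{1/2}.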
  • 16. Kernel ECA Transformation

    KECA example
      [Figure: toy data in the original space and its PCA, KPCA, and KECA projections]
      KECA reveals the cluster structure → underlying labels of the data.
      Nonlinearly related clusters in X → different angular directions in H.
      An angular clustering based on the kernel features Φ_eca therefore seems reasonable.
  • 17. Outline: section 4, KECA Spectral Clustering
  • 18. KECA Spectral Clustering

    Cauchy-Schwarz divergence
      The Cauchy-Schwarz divergence between the pdfs of two clusters is
        D_CS(p_i, p_j) = −log(V_CS(p_i, p_j)) = −log [ ∫ p_i(x) p_j(x) dx / √(∫ p_i²(x) dx ∫ p_j²(x) dx) ]
      Measuring dissimilarity in a probability space is a complex issue.
      Entropy interpretation in the kernel space → mean vector µ = (1/N) Σ_t φ(x_t):
        V̂(p) = ∫ p̂²(x) dx = (1/N²) 1ᵀ K 1 = (1/N²) 1ᵀ ΦΦᵀ 1 = µᵀµ = ‖µ‖²
      Divergence via Parzen windowing ⇒ V̂_CS(p_i, p_j) = µ_iᵀµ_j / (‖µ_i‖ ‖µ_j‖) = cos ∠(µ_i, µ_j)

    KECA Spectral Clustering
      Angular clustering of Φ_eca maximizes the CS divergence between clusters:
        J(C₁, …, C_k) = Σ_{i=1}^k Σ_{x_t∈C_i} cos ∠(φ_eca(x_t), µ_i)
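    The identity V(p) = ‖µ‖² means the cosine between cluster mean vectors can be
    read off blocks of the kernel matrix alone, since µ_iᵀµ_j = (1/(N_i N_j)) 1ᵀ K_ij 1.
    A small illustrative sketch (our own naming, not part of the original method's code):

```python
import numpy as np

def cs_divergence_from_kernel(K, labels, i, j):
    """Cauchy-Schwarz divergence between clusters i and j from the kernel matrix.

    Uses mu_i' mu_j = mean of the K block between the two clusters, so no
    explicit feature-space vectors are ever formed.
    """
    a, b = labels == i, labels == j
    mij = K[np.ix_(a, b)].mean()        # mu_i' mu_j
    mii = K[np.ix_(a, a)].mean()        # ||mu_i||^2
    mjj = K[np.ix_(b, b)].mean()        # ||mu_j||^2
    v_cs = mij / np.sqrt(mii * mjj)     # cos angle between mean vectors
    return -np.log(v_cs)
```

    Well-separated clusters give a near-zero cross block, hence a small cosine and
    a large divergence.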
  • 19. KECA Spectral Clustering

    KECA Spectral Clustering Algorithm
      1 Obtain Φ_eca by Kernel ECA
      2 Initialize the means µ_i, i = 1, …, k
      3 Assign each training sample to a cluster, x_t → C_i, maximizing cos ∠(φ_eca(x_t), µ_i)
      4 Update the mean vectors µ_i
      5 Repeat steps 3 and 4 until convergence

    Intuition
      A kernel feature space data point φ_eca(x_t) is assigned to the cluster
      represented by the closest mean vector µ_i in terms of angular distance.
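    Steps 2-5 amount to an angular k-means in the KECA feature space. A minimal
    sketch (NumPy; the deterministic initialization and the naming are ours, the
    slides do not prescribe a particular initialization):

```python
import numpy as np

def keca_spectral_clustering(Phi, k, n_iter=100):
    """Angular k-means on KECA features: assign each sample to the mean
    vector with the largest cosine similarity, then recompute the means.
    Greedily increases J = sum_i sum_{x_t in C_i} cos(phi_eca(x_t), mu_i)."""
    # simple deterministic initialization: k samples spread over the data set
    mu = Phi[np.linspace(0, len(Phi) - 1, k).astype(int)].copy()
    labels = np.full(len(Phi), -1)
    Pn = Phi / np.linalg.norm(Phi, axis=1, keepdims=True)   # unit directions
    for _ in range(n_iter):
        Mn = mu / np.linalg.norm(mu, axis=1, keepdims=True)
        new = (Pn @ Mn.T).argmax(axis=1)    # step 3: nearest mean in angle
        if np.array_equal(new, labels):
            break                           # step 5: converged
        labels = new
        for i in range(k):                  # step 4: update mean vectors
            if np.any(labels == i):
                mu[i] = Phi[labels == i].mean(axis=0)
    return labels
```

    On data whose clusters occupy distinct angular directions, as in the KECA
    example above, this separates the groups even when they overlap radially.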
  • 20. Outline: section 5, Experimental Results
  • 21. Experimental results: Data material

    Cloud masking from ENVISAT/MERIS multispectral images
      Pixel-wise binary decisions about the presence/absence of clouds.
      MERIS images taken over Spain and France.
      Input samples with 13 spectral bands and 6 physically inspired features.
      [Figure: the three test images, Barrax (BR-2003-07-14), Barrax (BR-2004-07-14),
       and France (FR-2005-03-19)]
  • 22. Experimental results: Numerical comparison

    Experimental setup
      KECA compared with k-means, KPCA + k-means, and kernel k-means.
      Number of clusters fixed to k = 2 (cloud-free and cloudy areas).
      Number of KPCA and KECA features fixed to m = 2 (to stress differences).
      The RBF kernel width parameter is selected by grid search for all methods.

    Numerical results
      Validation on 10000 manually labeled pixels per image.
      Kappa statistic results over 10 realizations for all images.
      [Figure: estimated κ statistic vs. number of training samples (200-1000) for
       KECA, KPCA, kernel k-means, and k-means on BR-2003-07-14, BR-2004-07-14,
       and FR-2005-03-19]
  • 23. Experimental results: Numerical comparison

    Average numerical results
      [Figure: average estimated κ statistic vs. number of training samples (200-1000)
       for KECA, KPCA, kernel k-means, and k-means]
      KECA outperforms k-means (+25%), and kernel k-means and KPCA (+15%).
      In general, the number of training samples positively affects the results.
  • 24. Experimental results: Classification maps

    Test Site              | k-means             | Kernel k-means      | KPCA                | KECA
    Spain (BR-2003-07-14)  | OA=96.25%, κ=0.6112 | OA=96.22%, κ=0.7540 | OA=47.52%, κ=0.0966 | OA=99.41%, κ=0.9541
    Spain (BR-2004-07-14)  | OA=96.91%, κ=0.6018 | OA=62.03%, κ=0.0767 | OA=96.66%, κ=0.6493 | OA=97.54%, κ=0.7319
    France (FR-2005-03-19) | OA=92.87%, κ=0.6142 | OA=92.64%, κ=0.6231 | OA=80.93%, κ=0.4051 | OA=92.91%, κ=0.6302
  • 25. Outline: section 6, Conclusions and Open questions
  • 26. Conclusions and open questions

    Conclusions
      Kernel entropy component analysis for clustering remote sensing data.
      Nonlinear features that preserve the entropy of the input data.
      Angular clustering reveals structure in terms of cluster divergence.
      Out-of-sample extension for test data → mandatory in remote sensing.
      Good results on cloud screening from MERIS images.
      KECA code is available at http://www.phys.uit.no/∼robertj/
      A simple feature extraction toolbox (SIMFEAT) will soon be available at http://isp.uv.es

    Open questions and future work
      Pre-images of transformed data in the input space.
      Learn the kernel parameters in an automatic way.
      Test KECA in more remote sensing applications.