Bayesian Networks
Unit 4 Uncertainty Inference
        - Continuous
              Wang, Yuan-Kai, 王元凱
                     ykwang@mails.fju.edu.tw
                      http://www.ykwang.tw

    Department of Electrical Engineering, Fu Jen Univ.
                  輔仁大學電機工程系

                                2006~2011
                         Reference this document as:
           Wang, Yuan-Kai, "Uncertainty Inference - Continuous,"
      Lecture Notes of Wang, Yuan-Kai, Fu Jen University, Taiwan, 2011.



                    Goal of this Unit
         • Review basic concepts of statistics in terms of
            • Image processing
            • Pattern recognition







                      Related Units
         • Previous unit(s)
            • Probability Review
         • Next units
            • Uncertainty Inference (Discrete)
            • Uncertainty Inference (Continuous)







                          Self-Study
         • Artificial Intelligence: A Modern Approach
            • Russell & Norvig, 2nd ed., Prentice Hall, 2003,
              pp. 462~474, Chapter 13, Sec. 13.1~13.3
         • Statistics: Concepts and Controversies (統計學的世界)
            • D. S. Moore (墨爾), translated by 鄭惟厚, 天下文化, 2002
         • Head First Statistics (深入淺出統計學)
            • D. Griffiths, translated by 楊仁和, O'Reilly, 2009






                             Contents
      1. Gaussian ................................ 6
      2. Gaussian Mixtures ....................... 36
      3. Linear Gaussian ......................... 80
      4. Sampling ................................ 92
      5. Markov Chain ............................ 102
      6. Stochastic Process ...................... 106
      7. Reference ............................... 114







             1. Gaussian Distribution
         1.1 Univariate Gaussian
         1.2 Bivariate Gaussian
         1.3 Multivariate Gaussian







                    Why Should We Care
      • Gaussians are as natural as Orange Juice and Sunshine
      • We need them to understand mixture models
      • We need them to understand Bayes Optimal Classifiers
      • We need them to understand Bayes Networks



            1.1 Univariate Gaussian
      • A univariate Gaussian is a Gaussian with only one variable
      • The unit-variance (standard) Gaussian:

        p(x) = \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{x^2}{2}\right),
        \qquad E[X] = 0,\ Var[X] = 1
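As a quick check of the formula above, here is a minimal NumPy sketch (the function name standard_normal_pdf is only illustrative) that evaluates the unit-variance Gaussian and verifies numerically that it integrates to 1:

```python
import numpy as np

def standard_normal_pdf(x):
    """Unit-variance Gaussian: p(x) = exp(-x^2 / 2) / sqrt(2 * pi)."""
    return np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)

x = np.linspace(-6, 6, 2001)
p = standard_normal_pdf(x)
print(p[1000])         # value at x = 0, about 0.3989
print(np.trapz(p, x))  # numerical integral, about 1.0
```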



       General Univariate Gaussian

        p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),
        \qquad E[X] = \mu,\ Var[X] = \sigma^2

   [Figure: bell-shape curve with μ = 100, σ = 15]

      • It is also called the Normal distribution
         • Bell-shape curve



                    Normal Distribution

   [Figure: Normal density with μ = 100, σ = 15]

      • X ~ N(μ, σ²)
         • "X is distributed as a Gaussian with parameters μ and σ²"
      • In this figure, X ~ N(100, 15²)
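A small sketch of what X ~ N(100, 15²) means operationally: draw samples with NumPy's default random generator and check the sample mean and standard deviation.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=100, scale=15, size=100_000)  # X ~ N(100, 15^2)
print(x.mean())  # close to mu = 100
print(x.std())   # close to sigma = 15
```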



                       A Live Demo
      • μ and σ are the two parameters of the Gaussian
         • μ: Position parameter
         • σ: Shape parameter

                                 Demo





                          Cumulative Distribution
                                Function
        F(x) = \int_{-\infty}^{x} p(x)\,dx
             = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{x} e^{-(x-\mu)^2 / 2\sigma^2}\,dx

   [Figures: left, the density function for the standardized normal variate; right, its
    cumulative distribution function; both plotted against standard deviations.]







                         The Error Function
      • Assume X ~ N(0,1)
      • Define ERF(x) = P(X < x)
        = Cumulative Distribution of X

        ERF(x) = \int_{z=-\infty}^{x} p(z)\,dz
               = \frac{1}{\sqrt{2\pi}} \int_{z=-\infty}^{x} \exp\left(-\frac{z^2}{2}\right) dz





           Using The Error Function
      Assume X ~ N(μ, σ²)

        P(X < x \mid \mu, \sigma^2) = ERF\!\left(\frac{x - \mu}{\sigma}\right)







         The Central Limit Theorem
      • If X_1, X_2, \ldots, X_n are i.i.d. continuous random variables
      • Define z = f(x_1, x_2, \ldots, x_n) = \frac{1}{n}\sum_{i=1}^{n} x_i
      • As n \to \infty, p(z) approaches a Gaussian with
        mean E[X_i] and variance Var[X_i]/n
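A quick simulation sketch of the theorem: average n i.i.d. Uniform(0, 1) variables (mean 0.5, variance 1/12) and check that the averages have the predicted mean and variance.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 50, 100_000
# each row holds n i.i.d. Uniform(0, 1) draws; z is their average
z = rng.uniform(0.0, 1.0, size=(trials, n)).mean(axis=1)

print(z.mean())  # close to E[X_i] = 0.5
print(z.var())   # close to Var[X_i] / n = (1 / 12) / 50
```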







              Example –
      Zero Mean Gaussian & Noise
      • Zero-mean Gaussian: N(0, σ)
         • Usually used as a noise model in images
      • What does an image f(x,y) with noise N(0, σ) mean?
         • f(x,y) = g(x,y) + N(0, σ)
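A sketch of this noise model in NumPy: g is a hypothetical clean image and f is the observed image with additive zero-mean Gaussian noise of standard deviation sigma.

```python
import numpy as np

rng = np.random.default_rng(2)
g = np.full((64, 64), 128.0)                   # hypothetical clean image g(x, y)
sigma = 10.0
f = g + rng.normal(0.0, sigma, size=g.shape)   # f(x, y) = g(x, y) + N(0, sigma)

print(f.mean())  # close to 128
print(f.std())   # close to sigma
```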






              1.2 Bivariate Gaussian







                          The Formula
      • For the random vector X = (X_1, X_2)^T
      • If X \sim N(\mu, \Sigma):

        p(X) = \frac{1}{2\pi\,|\Sigma|^{1/2}}
               \exp\left(-\frac{1}{2}(X-\mu)^T \Sigma^{-1} (X-\mu)\right)

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \qquad
        \Sigma = \begin{pmatrix} \sigma_1^2 & \sigma_{12} \\ \sigma_{21} & \sigma_2^2 \end{pmatrix}



                Gaussian Parameters

        p(X) = \frac{1}{2\pi\,|\Sigma|^{1/2}}
               \exp\left(-\frac{1}{2}(X-\mu)^T \Sigma^{-1} (X-\mu)\right)

      • μ & Σ are the Gaussian's parameters

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \qquad
        \Sigma = \begin{pmatrix} \sigma_1^2 & \sigma_{12} \\ \sigma_{21} & \sigma_2^2 \end{pmatrix}

         • μ: Position parameter
         • Σ: Shape parameter



                    Graphical Illustration

   [Figure: left, the surface p(X); right, the elliptical contour in the (X1, X2) plane,
    centered at (μ1, μ2) with spreads σ1, σ2 along the principal axes.]

      • μ: Position parameter
      • Σ: Shape parameter (σ1, σ2)



                    General Gaussian

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \qquad
        \Sigma = \begin{pmatrix} \sigma_1^2 & \sigma_{12} \\ \sigma_{21} & \sigma_2^2 \end{pmatrix}

   [Figures: contours of general bivariate Gaussians in the (X1, X2) plane.]




              Axis-Aligned Gaussian
      • X1 and X2 are independent or uncorrelated

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \qquad
        \Sigma = \begin{pmatrix} \sigma_1^2 & 0 \\ 0 & \sigma_2^2 \end{pmatrix}

   [Figures: axis-aligned contours, left with σ1 > σ2, right with σ1 < σ2.]



                    Spherical Gaussian

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \qquad
        \Sigma = \begin{pmatrix} \sigma^2 & 0 \\ 0 & \sigma^2 \end{pmatrix}

   [Figure: circular contours in the (X1, X2) plane.]





              Degenerate Gaussians

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \qquad |\Sigma| = 0

        p(X) = \frac{1}{2\pi\,|\Sigma|^{1/2}}
               \exp\left(-\frac{1}{2}(X-\mu)^T \Sigma^{-1} (X-\mu)\right)

   [Figure: with |Σ| = 0 the density collapses onto a line in the (X1, X2) plane.]



         Example – Clustering (1/4)
         Given a set of data points in a 2D
          space
         Find the Gaussian distribution of
          those points







         Example – Clustering (2/4)
        A 2D space example:
          • Face verification of a person
          • We use 2 features to verify the person
             • Size
             • Length
          • We get 1000 face images of the person
          • Each image has 2 features: a data point in the 2D space
          • Find the mean and range of the 2 features



         Example – Clustering (3/4)

   [Figure: scatter of the feature points with the fitted Gaussian; x and y are dependent.]



         Example – Clustering (4/4)

   [Figures: left, x and y are almost independent; right, x and y are dependent.]
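A sketch of the clustering step under the single-Gaussian assumption: estimate μ as the sample mean and Σ as the sample covariance of the 2-D feature points. The data below is synthetic, standing in for the 1000 face-feature points.

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic stand-in for 1000 (size, length) feature points
points = rng.multivariate_normal(mean=[5.0, 2.0],
                                 cov=[[1.0, 0.6],
                                      [0.6, 0.5]], size=1000)

mu_hat = points.mean(axis=0)               # estimated position parameter (2-vector)
sigma_hat = np.cov(points, rowvar=False)   # estimated shape parameter (2x2 covariance)
print(mu_hat)
print(sigma_hat)
```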



          1.3 Multivariate Gaussian
      • For the random vector X = (X_1, X_2, \ldots, X_m)^T
      • If X \sim N(\mu, \Sigma):

        p(x) = \frac{1}{(2\pi)^{m/2}\,|\Sigma|^{1/2}}
               \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \\ \vdots \\ \mu_m \end{pmatrix}, \qquad
        \Sigma = \begin{pmatrix}
          \sigma_1^2 & \sigma_{12} & \cdots & \sigma_{1m} \\
          \sigma_{21} & \sigma_2^2 & \cdots & \sigma_{2m} \\
          \vdots & & \ddots & \vdots \\
          \sigma_{m1} & \sigma_{m2} & \cdots & \sigma_m^2
        \end{pmatrix}



                Gaussian Parameters

        p(x) = \frac{1}{(2\pi)^{m/2}\,|\Sigma|^{1/2}}
               \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)

      • μ & Σ are the Gaussian's parameters
         • μ: Position parameter (an m-vector)
         • Σ: Shape parameter (an m × m covariance matrix)



              Axis-Aligned Gaussians

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \\ \vdots \\ \mu_m \end{pmatrix}, \qquad
        \Sigma = \mathrm{diag}(\sigma_1^2, \sigma_2^2, \sigma_3^2, \ldots, \sigma_{m-1}^2, \sigma_m^2)

   [Figures: axis-aligned elliptical contours in the (X1, X2) plane.]



                    Spherical Gaussians

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \\ \vdots \\ \mu_m \end{pmatrix}, \qquad
        \Sigma = \sigma^2 I

   [Figure: circular contours in the (x1, x2) plane.]



               Degenerate Gaussians

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \\ \vdots \\ \mu_m \end{pmatrix}, \qquad |\Sigma| = 0

   [Figure: the density collapses onto a lower-dimensional subspace in the (x1, x2) plane.]



                    Example –
             3-Variate Gaussian (1/2)

        p(x) = \frac{1}{(2\pi)^{3/2}\,|\Sigma|^{1/2}}
               \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)

        \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \\ \mu_3 \end{pmatrix}, \qquad
        \Sigma = \begin{pmatrix}
          \sigma_1^2 & \sigma_{12} & \sigma_{13} \\
          \sigma_{21} & \sigma_2^2 & \sigma_{23} \\
          \sigma_{31} & \sigma_{32} & \sigma_3^2
        \end{pmatrix}





                    Example –
             3-Variate Gaussian (2/2)
      • Assume a simple case: \sigma_{ij} = 0 if i \neq j

        \Sigma = \begin{pmatrix} \sigma_1^2 & 0 & 0 \\ 0 & \sigma_2^2 & 0 \\ 0 & 0 & \sigma_3^2 \end{pmatrix}

        p(x) = \frac{1}{(2\pi)^{3/2}\,|\Sigma|^{1/2}}
               \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right) = ?





         2. Gaussian Mixture Model
    • What is a Gaussian Mixture?
          • ≥ 2 Gaussians are mixed to form one pdf
    • Why a Gaussian Mixture?
          • A single Gaussian is not enough
           • Usually the distribution of your data is assumed to be one Gaussian
             • Also called a unimodal Gaussian
           • However, sometimes the distribution of the data is not a unimodal Gaussian



        Why Is Unimodal Gaussian
            not Enough (1/3)
         A univariate example
           Histogram of an image






        Why Is Unimodal Gaussian
            not Enough (2/3)
         Bivariate example




    One Gaussian PDF







        Why Is Unimodal Gaussian
            not Enough (3/3)
       • To solve it:

   [Figure: Mixture of Three Gaussians]







             Gaussian Mixture Model
                     (GMM)
       2.1 Combine Multiple
            Gaussians
       2.2 Formula of GMM
       2.3 Parameter Estimation
              of GMM




                2.1 Combine Multiple
                     Gaussians
      • Unimodal Gaussian (Single Gaussian)

        p(x) = \frac{1}{(2\pi)^{n/2}\,|\Sigma|^{1/2}}
               \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)

      • Multi-modal Gaussians (Multiple Gaussians)

        p(x) = \frac{1}{(2\pi)^{n/2}\,|\Sigma_1|^{1/2}}
               \exp\left(-\frac{1}{2}(x-\mu_1)^T \Sigma_1^{-1} (x-\mu_1)\right)
             + \frac{1}{(2\pi)^{n/2}\,|\Sigma_2|^{1/2}}
               \exp\left(-\frac{1}{2}(x-\mu_2)^T \Sigma_2^{-1} (x-\mu_2)\right)
             + \cdots



         Combine 2 Gaussians (1/4)
      • Suppose two Gaussians in 1 dimension:

        p(x) = p(x \mid \mu_1, \sigma_1) + p(x \mid \mu_2, \sigma_2)

        p(x \mid \mu_i, \sigma_i) = \frac{1}{\sqrt{2\pi}\,\sigma_i}
            \exp\left(-\frac{(x-\mu_i)^2}{2\sigma_i^2}\right), \quad i = 1, 2

        p(x) = p(x \mid C_1) + p(x \mid C_2), \qquad \int p(x \mid C_i)\,dx = 1



                    1-D Example (2/4)
      • Parameters: μ1 = 4, σ1 = 0.3, α1 = 0.6;  μ2 = 6.4, σ2 = 0.5, α2 = 0.4

        p(x) = \frac{1}{\sqrt{2\pi}\cdot 0.3} \exp\left(-\frac{(x-4)^2}{2\cdot 0.3^2}\right)
             + \frac{1}{\sqrt{2\pi}\cdot 0.5} \exp\left(-\frac{(x-6.4)^2}{2\cdot 0.5^2}\right)

      • Given x = 5:

        p(x=5) = \frac{1}{\sqrt{2\pi}\cdot 0.3} \exp\left(-\frac{(5-4)^2}{2\cdot 0.3^2}\right)
               + \frac{1}{\sqrt{2\pi}\cdot 0.5} \exp\left(-\frac{(5-6.4)^2}{2\cdot 0.5^2}\right)



                    Combine 2 Gaussians (3/4)

   [Figures: left, "2 Gaussians": the components N(0,1) = p(x|0,1) and N(3,1) = p(x|3,1);
    right, "Gaussian Mixture": p(x) = p(x|0,1) + p(x|3,1).]



                    Combine 2 Gaussians (4/4)

   [Figures: left, "2 Gaussians": the components N(0,1) = p(x|0,1) and N(3,4) = p(x|3,4);
    right, "Gaussian Mixture": p(x) = p(x|0,1) + p(x|3,4).]


                Combine 2 Gaussians
                 with Weights (1/3)
       • p(x) = p(x \mid C_1) + p(x \mid C_2), with \int p(x \mid C_i)\,dx = 1
         \Rightarrow \int p(x)\,dx = \int p(x \mid C_1)\,dx + \int p(x \mid C_2)\,dx = 1 + 1 = 2

       • If p(x) = \tfrac{1}{2}\,p(x \mid C_1) + \tfrac{1}{2}\,p(x \mid C_2)
         \Rightarrow \int p(x)\,dx = \tfrac{1}{2}\int p(x \mid C_1)\,dx + \tfrac{1}{2}\int p(x \mid C_2)\,dx = 1

       • If p(x) = \alpha_1\,p(x \mid C_1) + \alpha_2\,p(x \mid C_2), with \alpha_1 + \alpha_2 = 1
         \Rightarrow \int p(x)\,dx = \alpha_1\int p(x \mid C_1)\,dx + \alpha_2\int p(x \mid C_2)\,dx = 1


                              Combine 2 Gaussians
                               with Weights (2/3)

   [Figures: left, "2 Gaussians": the components N(0,1) = p(x|0,1) and N(3,1) = p(x|3,1);
    right, "Gaussian Mixture": p(x) = ½ p(x|0,1) + ½ p(x|3,1).]



                              Combine 2 Gaussians
                               with Weights (3/3)

   [Figures: left, "2 Gaussians": the components N(0,1) = p(x|0,1) and N(3,4) = p(x|3,4);
    right, "Gaussian Mixture": p(x) = ½ p(x|0,1) + ½ p(x|3,4).]

     Combine 2 Gaussians with
    Different Mean Distances (1/2)

      • Suppose two Gaussians in 1D:

        p(x) = \tfrac{1}{2}\,p(x \mid \mu_1, \sigma_1) + \tfrac{1}{2}\,p(x \mid \mu_2, \sigma_2)

      • Let σ1 = σ2 = 1
      • Let Δ = μ2 − μ1



         Combine 2 Gaussians with
        Different Mean Distances (2/2)
                                =1                                      =2




                                 =3                                     =4






         Combine 2 Gaussians with
           Different Weights (1/2)
      • Suppose two Gaussians in 1D:

        p(x) = 0.75\,p(x \mid \mu_1, \sigma_1) + 0.25\,p(x \mid \mu_2, \sigma_2)

      • Let σ1 = σ2 = 1
      • Let Δ = μ2 − μ1



         Combine 2 Gaussians with
           Different Weights (2/2)
                                 =1                                      =2




                                  =3                                     =4







          2D Gaussian Combination
                    (1/2)
                                   4 0                                  4 0
    p( x | C1 ), 1  (0, 0), 1      , p( x | C 2 ), 2  (0,3), 1   0 4 
                                   0 4                                       







          2D Gaussian Combination
                    (2/2)
        p(x) = p(x \mid C_1) + p(x \mid C_2)

   [Figure: the combined 2-D density.]



                    More Gaussians
      • As the number of Gaussians M increases, a mixture can
        approximate essentially any density
         • By adjusting M and the αi, μi, Σi of each Gaussian

        p(x) = \sum_{i=1}^{M} \alpha_i\, p(x \mid C_i)
             = \sum_{i=1}^{M} \alpha_i\, p(x \mid \mu_i, \Sigma_i)

   [Figures: top, the 5 Gaussians component models, p(x) vs. x;
    bottom, the resulting mixture model, p(x) vs. x.]











                    2.2 Formula of GMM
      • A Gaussian mixture model (GMM) is a linear combination of M Gaussians

        p(x) = \sum_{i=1}^{M} \alpha_i\, p(x \mid C_i)

        • p(x) is the probability of a point x
           • x = (Cb, Cr) or (R, G, B) or ...
        • αi is the mixing parameter (weight)
        • p(x | Ci) is a Gaussian function
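A minimal sketch of the GMM density as a weighted sum of Gaussian components (1-D case; the component parameters are made up for illustration).

```python
import math

def gauss(x, mu, sigma):
    """Univariate Gaussian component p(x | C_i)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def gmm_pdf(x, alphas, mus, sigmas):
    """p(x) = sum_i alpha_i * p(x | C_i); the alphas should sum to 1."""
    return sum(a * gauss(x, m, s) for a, m, s in zip(alphas, mus, sigmas))

# an illustrative 3-component mixture
print(gmm_pdf(5.0, alphas=[0.5, 0.3, 0.2], mus=[4.0, 6.4, 9.0], sigmas=[0.3, 0.5, 1.0]))
```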




             Comparison of Formula
      • Gaussian:

        p(x) = \frac{1}{(2\pi)^{n/2}\,|\Sigma|^{1/2}}
               \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)

      • GMM:

        p(x) = \sum_{i=1}^{M} \alpha_i\, p(x \mid C_i)
             = \sum_{i=1}^{M} \frac{\alpha_i}{(2\pi)^{n/2}\,|\Sigma_i|^{1/2}}
               \exp\left(-\frac{1}{2}(x-\mu_i)^T \Sigma_i^{-1} (x-\mu_i)\right)

      • In a GMM, p(x | Ci) is the probability of x under the i-th Gaussian component



            Two Constraints of GMM
         • αi:  \sum_{i=1}^{M} \alpha_i = 1, \ \text{and} \ 0 \leq \alpha_i \leq 1
         • p(x | Ci)
              • It is normalized,
              • i.e., \int p(x \mid C_i)\,dx = 1







                    The Problem (1/5)
      • Now we know that essentially any density can be obtained by a Gaussian mixture
         • Given the mixture function, we can plot its density
      • But in reality, what we need to do in the computer is the reverse:
         • We get a lot of data points X = {xj ∈ Rn, j = 1,...,N} with unknown density
         • Can we find the mixture function of these data points?
                    The Problem (2/5)

   [Figures: top, the histogram of X = {xj ∈ Rn, j = 1,...,N} with the fitted
    mixture model p(x); bottom, its 5 Gaussians component models.]



                    The Problem (3/5)
      • To find the mixture function means to estimate the parameters of the mixture function
         • Mixing parameters αi
         • Gaussian component densities
              • Mean vectors μi
              • Covariance matrices Σi
         • Number of components M



                      The Problem (4/5)
                       No. of Parameters

   A single Gaussian:
                  1D Gaussian   2D Gaussian   3D Gaussian
       μ               1             2             3
       Σ (σ)           1             3             6
       Total           2             5             9

   A GMM with M Gaussians:
                  1D GMM        2D GMM        3D GMM
       α               M             M             M
       μ              1M            2M            3M
       Σ (σ)          1M            3M            6M
       Total          3M            6M           10M



                       The Problem (5/5)
      • That is, given {xj ∈ Rn, j = 1,...,N}:

        p(x_1) = \sum_{i=1}^{M} \alpha_i\, p(x_1 \mid \mu_i, \Sigma_i)
        p(x_2) = \sum_{i=1}^{M} \alpha_i\, p(x_2 \mid \mu_i, \Sigma_i)
        ...
        p(x_N) = \sum_{i=1}^{M} \alpha_i\, p(x_N \mid \mu_i, \Sigma_i)

      • Solve for αi, μi, Σi; this is also called parameter estimation
      • Usually we use θi to denote (μi, Σi):

        p(x) = \sum_{i=1}^{M} \alpha_i\, p(x \mid \theta_i)



           2.3 Parameter Estimation
      • Given
         • a fixed M
         • data X = {xj ∈ Rn, j = 1,...,N}
              • We may calculate the histogram of X
      • We want to find the parameters
        θ = (α1, ..., αM, μ1, ..., μM, Σ1, ..., ΣM)
        that best fit the histogram of the data
      • Examples (a code sketch follows this list)
         • A 1-D example: xj ∈ R1
         • Two 2-D examples: xj ∈ R2
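As a sketch of what this estimation looks like in practice, scikit-learn's GaussianMixture fits αi, μi, Σi for a fixed M (internally it uses the EM iterations illustrated later in this unit). The data below is synthetic, only to show the call pattern.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# synthetic 1-D data drawn from two Gaussians (parameters unknown to the estimator)
data = np.concatenate([rng.normal(4.0, 0.3, 600),
                       rng.normal(6.4, 0.5, 400)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(data)
print(gmm.weights_)              # estimated alpha_i
print(gmm.means_.ravel())        # estimated mu_i
print(gmm.covariances_.ravel())  # estimated sigma_i^2
```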
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                       p. 68



                        1-D Example
        X = {1.5, -0.2, 1.4, 1.8, ... }               Histogram
                                                        • N(x=-2.5) = 10
                                                        • ...
                                                        • N(x=1.5) = 40
                                                        • ...
                                                               Parameter
                                                               Estimation
                                                     μ = 1.5, σ = 1.3
                                      p(x) = 1/(√(2π)·1.3) · exp( −(x−1.5)² / (2·1.3²) )

Fu Jen University   Department of Electrical Engineering                Wang, Yuan-Kai Copyright
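For the single-Gaussian fit above, the maximum-likelihood estimates are simply the sample mean and sample standard deviation. A minimal C sketch (the names fit_gaussian and data are illustrative assumptions):

    #include <math.h>
    #include <stdio.h>

    /* ML estimates of mu and sigma for a single Gaussian */
    void fit_gaussian(const double x[], int N, double *mu, double *sigma)
    {
        double sum = 0.0, sq = 0.0;
        for (int j = 0; j < N; j++) sum += x[j];
        *mu = sum / N;
        for (int j = 0; j < N; j++) sq += (x[j] - *mu) * (x[j] - *mu);
        *sigma = sqrt(sq / N);          /* ML uses 1/N, not 1/(N-1) */
    }

    int main(void)
    {
        double data[] = {1.5, -0.2, 1.4, 1.8};  /* first few samples from the slide */
        double mu, sigma;
        fit_gaussian(data, 4, &mu, &sigma);
        printf("mu = %f, sigma = %f\n", mu, sigma);
        return 0;
    }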
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 69



        [Figure: fitted mixture densities for the 1-D data with different numbers of
         components: M=7 (Izenman & Sommer), M=7 (Basford et al.),
         M=3, and M=4 (equal variances)]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                                                                     Unit - Uncertainty Inference (Continuous)                       p. 70



                        2-D Example
        X = {(3.45, 4.02), ... }
        [Figure: scatter plot "ANEMIA PATIENTS AND CONTROLS";
         x-axis: Red Blood Cell Volume (3.3 to 4.0),
         y-axis: Red Blood Cell Hemoglobin Concentration (3.7 to 4.4)]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                                                               Unit - Uncertainty Inference (Continuous)                         p. 71
        [Figure: EM ITERATION 1 (initialization); anemia data scatter,
         x-axis: Red Blood Cell Volume, y-axis: Red Blood Cell Hemoglobin Concentration]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                                                              Unit - Uncertainty Inference (Continuous)                         p. 72
        [Figure: EM ITERATION 3; anemia data scatter,
         x-axis: Red Blood Cell Volume, y-axis: Red Blood Cell Hemoglobin Concentration]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 73




Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                                                              Unit - Uncertainty Inference (Continuous)                         p. 74
        [Figure: EM ITERATION 10; anemia data scatter,
         x-axis: Red Blood Cell Volume, y-axis: Red Blood Cell Hemoglobin Concentration]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                                                              Unit - Uncertainty Inference (Continuous)                         p. 75
        [Figure: EM ITERATION 15; anemia data scatter,
         x-axis: Red Blood Cell Volume, y-axis: Red Blood Cell Hemoglobin Concentration]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                                                              Unit - Uncertainty Inference (Continuous)                         p. 76
        [Figure: EM ITERATION 25; anemia data scatter,
         x-axis: Red Blood Cell Volume, y-axis: Red Blood Cell Hemoglobin Concentration]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                                      Unit - Uncertainty Inference (Continuous)                        p. 77
        [Figure: LOG-LIKELIHOOD AS A FUNCTION OF EM ITERATIONS;
         x-axis: EM Iteration (0 to 25), y-axis: Log-Likelihood (about 400 to 490)]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                                                              Unit - Uncertainty Inference (Continuous)                         p. 78
        [Figure: ANEMIA DATA WITH LABELS; the scatter separates into a
         Control Group and an Anemia Group;
         x-axis: Red Blood Cell Volume, y-axis: Red Blood Cell Hemoglobin Concentration]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 79



    Parameter Estimation of GMM
         Two methods
           Maximum Likelihood Estimation
            (MLE)
           Expectation Maximization (EM)
         Discussed in Lecture Note 24



Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
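MLE and EM details are deferred to the later lecture note. As a rough, standard-algorithm sketch only (not the lecture's own derivation), the C code below performs one EM iteration for a 1-D GMM; the names em_step, resp, alpha, mu and sigma are assumptions made here. Iterating it until the log-likelihood stops increasing gives behaviour like the anemia example above.

    #include <math.h>

    #define PI 3.14159265358979323846

    static double gauss(double x, double mu, double sigma)
    {
        double d = (x - mu) / sigma;
        return exp(-0.5 * d * d) / (sqrt(2.0 * PI) * sigma);
    }

    /* One EM iteration for a 1-D GMM with M components over N samples x[].
       alpha[], mu[], sigma[] are updated in place; resp is an N*M work array. */
    void em_step(const double x[], int N, int M,
                 double alpha[], double mu[], double sigma[], double resp[])
    {
        /* E-step: responsibilities resp[j*M+i] = P(component i | x[j]) */
        for (int j = 0; j < N; j++) {
            double total = 0.0;
            for (int i = 0; i < M; i++) {
                resp[j*M + i] = alpha[i] * gauss(x[j], mu[i], sigma[i]);
                total += resp[j*M + i];
            }
            for (int i = 0; i < M; i++) resp[j*M + i] /= total;
        }
        /* M-step: re-estimate weights, means and standard deviations */
        for (int i = 0; i < M; i++) {
            double Ni = 0.0, m = 0.0, v = 0.0;
            for (int j = 0; j < N; j++) Ni += resp[j*M + i];
            for (int j = 0; j < N; j++) m += resp[j*M + i] * x[j];
            m /= Ni;
            for (int j = 0; j < N; j++) v += resp[j*M + i] * (x[j] - m) * (x[j] - m);
            alpha[i] = Ni / N;
            mu[i]    = m;
            sigma[i] = sqrt(v / Ni);
        }
    }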
王元凱                         Unit - Uncertainty Inference (Continuous)                      p. 80



                    3. Linear Gaussian
         P(X) can belong to a distribution
           (e.g., Gaussian, uniform, exponential,
            Gaussian mixture)
         P(Y|X) : conditional probability
           Can the conditional probability
            belong to a distribution?
         Linear Gaussian describes
           The distribution of conditional
            probability as Gaussian
           The dependence between random
            variables
Fu Jen University    Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 81



            From Gaussian to Linear
                Gaussian (1/2)
           If two variables x and y have a linear
            relationship, we say
             y = ax + b,                              y becomes a
             a and b are parameters                   random variable
           If y belongs to a Gaussian distribution
             y ~ N(μ, σ) = N(y; μ, σ), written N(y)
             But y = ax+b
              ⇒ ax+b ~ N(μ, σ)
              = N(ax+b, σ)
             So we can write
               N(y; ax+b, σ)          [Figure: Gaussian curve over y,
                                       with ticks at μ-σ, μ, μ+σ]
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                            Unit - Uncertainty Inference (Continuous)                      p. 82



        Linear Gaussian = N(y; ax+b, σ)
         The meaning of the linear Gaussian
          N(y; ax+b, σ)
             [Figure: Gaussian curve N(y) centered at μ = ax+b,
              with ticks at μ-3σ, μ-σ, μ, μ+σ, μ+3σ on the y axis]
            • When y = ax+b, the density N(y) is at its maximum
            • When y ≠ ax+b, that value of y still occurs,
              but with lower probability, decaying as in a
              Gaussian distribution
Fu Jen University       Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                             Unit - Uncertainty Inference (Continuous)                      p. 83



                    P(y|x) = N(y; ax+b, σ)
           P(Xj=y | Xi=x) = P(y|x)
           If P(y|x) = N(y; ax+b, σ)
             Xj varies linearly with Xi
             With Gaussian uncertainty
             The standard deviation σ is fixed

           P(Xj=y | Xi=x) = N(y; ax+b, σ)
                          = 1/(√(2π)σ) · exp( −(y − (ax+b))² / (2σ²) )

           [Figure: surface plot of P(Xj | Xi) over the (Xi, Xj) plane]
Fu Jen University        Department of Electrical Engineering               Wang, Yuan-Kai Copyright
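A minimal C sketch of this conditional density; linear_gaussian and its parameter names are illustrative assumptions. For example, linear_gaussian(5.2, 2.0, 2.0, 1.0, 0.5) gives the density of y = 5.2 when x = 2, a = 2, b = 1, σ = 0.5.

    #include <math.h>

    #define PI 3.14159265358979323846

    /* P(Xj = y | Xi = x) = N(y; a*x + b, sigma) */
    double linear_gaussian(double y, double x, double a, double b, double sigma)
    {
        double d = (y - (a * x + b)) / sigma;
        return exp(-0.5 * d * d) / (sqrt(2.0 * PI) * sigma);
    }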
王元凱                            Unit - Uncertainty Inference (Continuous)                      p. 84



                        Example 1 (1/2)
         Illumination change of an image
               f(x,y)                                                       g(x,y)
                                        g(x,y) =
                                       af(x,y)+b

                                        g =af+b

      •Two kinds of illumination change (same a, b)
         •Real light change: g1
         •Change made by image processing software: g2
      •Is g1 = g2? No
         •Because of noise, g1 is a random variable
Fu Jen University       Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                         Unit - Uncertainty Inference (Continuous)                      p. 85



                    Example 1 (2/2)
         Illumination change of an image
             f(x,y)                g1(x,y)

                                     g1 =af+b




                •But g1 is a random variable
                  •It means g1 is subject to noise
                  •g1 ~ af + b + N(0, σ) = N(af+b, σ)
                  •Or P(g1|f) = N(af+b, σ)
Fu Jen University    Department of Electrical Engineering               Wang, Yuan-Kai Copyright
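A minimal C sketch of simulating this model on one pixel value: apply g = a*f + b, then add zero-mean Gaussian noise drawn with the Box-Muller transform. The helper names (gaussian_noise, apply_illumination) and the use of rand() are assumptions for illustration.

    #include <math.h>
    #include <stdlib.h>

    #define PI 3.14159265358979323846

    /* Draw one sample from N(0, sigma) using the Box-Muller transform */
    double gaussian_noise(double sigma)
    {
        double u1 = (rand() + 1.0) / (RAND_MAX + 2.0);   /* avoid log(0) */
        double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
        return sigma * sqrt(-2.0 * log(u1)) * cos(2.0 * PI * u2);
    }

    /* g1 = a*f + b + noise, i.e. one sample from P(g1 | f) = N(a*f + b, sigma) */
    double apply_illumination(double f, double a, double b, double sigma)
    {
        return a * f + b + gaussian_noise(sigma);
    }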
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 86



      Extension: Linear Transform
         X and Y are two vectors
           Y = (y1, y2, …, ym)T
           X = (x1, x2, …, xm)T
         X and Y are linearly dependent
           Y = AX + B : Linear transform
         If Y becomes a random vector
           P(Y|X)=N(Y; AX+B, ∑)



Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                              p. 87



                     Example 2 (1/5)
         Illumination change of color image
               F(x,y)                                    G(x,y)
                                  G = AF+B

      •F(x,y) = (rF(x,y), gF(x,y), bF(x,y))T
      •G(x,y) = (rG(x,y), gG(x,y), bG(x,y))T

                          | rG |   | a11 a12 a13 | | rF |   | b1 |
         G = AF+B         | gG | = | a21 a22 a23 | | gF | + | b2 |
                          | bG |   | a31 a32 a33 | | bF |   | b3 |
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                           Unit - Uncertainty Inference (Continuous)                      p. 88



                      Example 2 (2/5)
         A simple case of G=AF+B
           aij = 0 if i≠j         | rG |   | 2 0 0 | | rF |   | 1 |
                                  | gG | = | 0 3 0 | | gF | + | 0 |
                                  | bG |   | 0 0 1 | | bF |   | 0 |
           rG = a11·rF + b1,
            gG = a22·gF + b2,
            bG = a33·bF + b3
                     | rG |   | a11 a12 a13 | | rF |   | b1 |
                     | gG | = | a21 a22 a23 | | gF | + | b2 |
                     | bG |   | a31 a32 a33 | | bF |   | b3 |
Fu Jen University      Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 89



                     Example 2 (3/5)
         If G has noise
             F(x,y)                                    G1(x,y)
                                  G1 = AF+B

   •G1(x,y) is a random vector
     •It means G1 is subject to noise            | σ11 σ12 σ13 |
     •G1 ~ AF + B + N(0, ∑)                 ∑ =  | σ21 σ22 σ23 |
          = N(AF+B, ∑)                           | σ31 σ32 σ33 |
     •Or P(G1|F) = N(AF+B, ∑)
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 90



                      Example 2 (4/5)
         G1 ~ N(AF+B, ∑)
           N is a multivariate Gaussian (3-D)

           p(X) = 1 / ( (2π)3/2 |Σ|1/2 ) · exp( −½ (X−μ)T Σ−1 (X−μ) )

                    | a11·rF + a12·gF + a13·bF + b1 |        | σ11 σ12 σ13 |
           AF + B = | a21·rF + a22·gF + a23·bF + b2 |    Σ = | σ21 σ22 σ23 |
                    | a31·rF + a32·gF + a33·bF + b3 |        | σ31 σ32 σ33 |
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                         Unit - Uncertainty Inference (Continuous)                      p. 91



                     Example 2 (5/5)
         Assume a simple case
        aij = 0 if i≠j            | 2 0 0 |        | 4 0 0 |
        σij = 0 if i≠j        A = | 0 3 0 |    ∑ = | 0 8 0 |
                                  | 0 0 1 |        | 0 0 2 |
      Then
                     | a11·rF + a12·gF + a13·bF + b1 |   | a11·rF + b1 |
            AF + B = | a21·rF + a22·gF + a23·bF + b2 | = | a22·gF + b2 |
                     | a31·rF + a32·gF + a33·bF + b3 |   | a33·bF + b3 |
           P(rG|rF) = N(a11·rF + b1, σ1)
           P(gG|gF) = N(a22·gF + b2, σ2)
           P(bG|bF) = N(a33·bF + b3, σ3)

Fu Jen University    Department of Electrical Engineering               Wang, Yuan-Kai Copyright
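A minimal C sketch of this diagonal case: the transform decouples into three 1-D linear Gaussians, one per colour channel, and with independent channels the joint density is their product. The names (channel_density, rgb_density, the parameter arrays) are assumptions for illustration; sigma[i] is the standard deviation of channel i (for the ∑ above, √4, √8, √2).

    #include <math.h>

    #define PI 3.14159265358979323846

    /* Density of one colour channel of G1 given the matching channel of F:
       P(G1[i] = g | F[i] = f) = N(g; a[i]*f + b[i], sigma[i])  (diagonal A and Sigma) */
    double channel_density(double g, double f, double a, double b, double sigma)
    {
        double d = (g - (a * f + b)) / sigma;
        return exp(-0.5 * d * d) / (sqrt(2.0 * PI) * sigma);
    }

    /* With independent channels the joint density is just the product */
    double rgb_density(const double G[3], const double F[3],
                       const double a[3], const double b[3], const double sigma[3])
    {
        double p = 1.0;
        for (int i = 0; i < 3; i++)
            p *= channel_density(G[i], F[i], a[i], b[i], sigma[i]);
        return p;
    }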
王元凱                            Unit - Uncertainty Inference (Continuous)                      p. 92



                            4. Sampling
         Generate N samples S from P(X)
           S=(s1,s2, …, sN)
         X can be a random variable or a
          random vector
           If X=x, si=xi
           If X=(x1,x2,…,xn), si=(x1i,x2i,…,xni)
         Why generate N samples?
           Estimate probabilities by frequencies
             P(X = x) ≈ (# samples with X = x) / N
Fu Jen University       Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 93



                      Example (1/2)
           A simple example: coin toss
             Tossing the coin, get head or tail
             It is a Boolean random variable
               coin = head or tail
               Random variable, but not
                 random vector
             If it is an unbiased coin, head and tail
              have equal probability
               A prior probability distribution
                 P(Coin) = <0.5, 0.5>
               Uniform distribution
             But we do not know whether it is unbiased
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                            Unit - Uncertainty Inference (Continuous)                      p. 94



                          Example (2/2)
           Sampling in this example
            = flipping the coin many times N
             e.g., N=1000 times
             Ideally, 500 heads, 500 tails
                     P(head) = 500/1000=0.5
                      P(tail) = 500/1000=0.5
             Practically, 501 heads, 499 tails
                     P(head) = 501/1000=0.501
                      P(tail) = 499/1000=0.499
           By the sampling, we can
            estimate the probability
            distribution
Fu Jen University       Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                             Unit - Uncertainty Inference (Continuous)                      p. 95



                        Sampling (Math)
         For a Boolean random variable X
           P(X) is the prior distribution
            = <P(x), P(¬x)>
           Use a sampling algorithm to
            generate N samples
           Say N(x) is the number of samples
            in which x is true, and N(¬x) the
            number in which x is false

             N(x)/N ≈ P̂(x),     N(¬x)/N ≈ P̂(¬x)

             lim N(x)/N = P(x),     lim N(¬x)/N = P(¬x)
             N→∞                    N→∞
Fu Jen University        Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                          Unit - Uncertainty Inference (Continuous)                      p. 96



                    Sampling Algorithm
         It is the algorithm to
           Generate samples from a known
            probability distribution
           Estimate the approximate
            probability P̂
         How does a sampling algorithm
          generate a sample?
           C/C++: rand()
             Returns 0 ~ RAND_MAX (at least 32767)
           Java: random()
Fu Jen University     Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 97



                A Sampling Algorithm
                   of the Coin Toss
         Flip the coin 1000 times
           int i, coin_face, heads = 0;
            for (i = 0; i < 1000; i++)
            {  if (rand() > RAND_MAX/2) coin_face = 1;   /* head */
               else                     coin_face = 0;   /* tail */
               heads += coin_face;
            }
            /* P(head) ≈ heads/1000.0, P(tail) ≈ 1 - heads/1000.0 */
                       What kind of distribution?
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                           Unit - Uncertainty Inference (Continuous)                      p. 98



                    Sampling Algorithms
                     for Many R.V.s (1/2)
        3 Boolean random variables X, Y, Z
          (X=1, Y=0, Z=0) is called a sample
          int i, X, Y, Z;
           for (i = 0; i < 1000; i++)
           { if (rand() > RAND_MAX/2) X = 1;
                 else X = 0;
             if (rand() > RAND_MAX/2) Y = 1;
                 else Y = 0;
             if (rand() > RAND_MAX/2) Z = 1;
                 else Z = 0;
           }
                                  X, Y, Z all have a
                                            uniform distribution
Fu Jen University      Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                           Unit - Uncertainty Inference (Continuous)                      p. 99



                    Sampling Algorithms
                     for Many R.V.s (2/2)
           Y and Z do not have uniform distributions
             P(Y) = <P(Y=1), P(Y=0)> = <0.67, 0.33>,
              P(Z) = <P(Z=1), P(Z=0)> = <0.75, 0.25>
             int i, X, Y, Z;
              for (i = 0; i < 1000; i++)
              { if (rand() > RAND_MAX/2) X = 1;
                    else X = 0;
                if (rand() > RAND_MAX/3) Y = 1;   /* P(Y=1) ≈ 2/3 */
                    else Y = 0;
                if (rand() > RAND_MAX/4) Z = 1;   /* P(Z=1) ≈ 3/4 */
                    else Z = 0;
              }
Fu Jen University      Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 100



      Various Sampling Algorithms
         For more complex P(X), we need
          more complex sampling algorithms
        Stochastic simulation
          Direct Sampling
          Rejection sampling
            Reject samples disagreeing with
             evidence
          Likelihood weighting
            Use evidence to weight samples
        Markov chain Monte Carlo (MCMC)
           e.g., Gibbs sampling
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
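These algorithms all rely on drawing samples from given distributions. As a small building block, the C sketch below (an illustrative assumption, not code from the lecture) draws one value from a discrete distribution by inverting its cumulative distribution with a single rand() call.

    #include <stdlib.h>

    /* Draw an index i with probability p[i] (p[] sums to 1) by
       inverting the cumulative distribution */
    int sample_discrete(const double p[], int n)
    {
        double u = (double)rand() / ((double)RAND_MAX + 1.0);  /* u in [0,1) */
        double cum = 0.0;
        for (int i = 0; i < n; i++) {
            cum += p[i];
            if (u < cum) return i;
        }
        return n - 1;   /* guard against rounding */
    }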
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 101



                             Example
         Approximate reasoning for
          Bayesian networks
         TBU




Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                                p. 102



                    5. Markov Chain
         Markov Assumption
           Each state at time t only
            depends on the state                                             (Andrei Andreyevich Markov)

            at time t-1
           Ex. The weather today only
            depends on the weather of
            yesterday

                                              X1                        X2                     X3
                                             t=1                       t=2                    t=3


Fu Jen University   Department of Electrical Engineering                Wang, Yuan-Kai Copyright
王元凱                         Unit - Uncertainty Inference (Continuous)                      p. 103



                    Deterministic vs.
                    Non-Deterministic
        Deterministic patterns :
          Traffic light
          FSMs
          …

        Non-Deterministic patterns :
             Weather
             Speech
             Tracking
             …

Fu Jen University    Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                              Unit - Uncertainty Inference (Continuous)                      p. 104



                  Example –
             Weather Prediction (1/2)
         Only 3 possible weather states:
           Sunny, Cloudy, Rainy

         Transition Matrix:
           A = Pr(today | yesterday)
                                               Weather Today
                                           Sunny       Cloudy       Rainy
              Weather       Sunny           0.5         0.375       0.125
              Yesterday     Cloudy          0.25        0.125       0.625
                            Rainy           0.25        0.375       0.375
Fu Jen University         Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 105



                   Example –
              Weather Prediction (2/2)
         Suppose we know the weather of
          previous days
              t=1: rainy                     R            S           S        C
              t=2: sunny                     X1           X2          X3       X4
              t=3: sunny                    t=1          t=2          t=3     t=4
              t=4: cloudy
         Predict the weather of day t=5                                             X5
           ?                                                                       t=5




Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
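A minimal C sketch of this prediction, assuming the transition matrix above (rows = yesterday's weather, each row summing to 1): given that day 4 is cloudy, the Markov assumption makes the distribution over day 5 simply the Cloudy row of A. The array names and the enum are illustrative.

    #include <stdio.h>

    enum { SUNNY, CLOUDY, RAINY };

    /* A[yesterday][today] = Pr(today | yesterday), rows sum to 1 */
    static const double A[3][3] = {
        /* Sunny  Cloudy  Rainy   (today) */
        { 0.50,  0.375,  0.125 },   /* yesterday = Sunny  */
        { 0.25,  0.125,  0.625 },   /* yesterday = Cloudy */
        { 0.25,  0.375,  0.375 }    /* yesterday = Rainy  */
    };

    int main(void)
    {
        const char *name[3] = { "Sunny", "Cloudy", "Rainy" };
        int day4 = CLOUDY;                 /* observed weather at t=4 */
        /* Markov assumption: P(X5 | X1..X4) = P(X5 | X4) */
        for (int today = 0; today < 3; today++)
            printf("P(X5=%s | X4=Cloudy) = %.3f\n", name[today], A[day4][today]);
        return 0;
    }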
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 106



                6. Stochastic Process
         Also called Random Process
         It is a collection of random
          variables
           For each t in the index set T, X(t) is a
            random variable
           Usually t refers to time, and X(t) is
            the state of the process at time t
           X(t) can be discrete or continuous

Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 107



                    Graphical View
                of Stochastic Process




Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 108



              Statistics of Stochastic
                      Process
       Mean of X(t)
       Variance, standard deviation of X(t)
       Frequency distribution of X(t) : P(X)
       Conditional probability of X(t) :
        P(X(t) | X(t-1))




Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 109



                     Markov Chain
         A Markov chain is a stochastic
          process where
           P(X(t+1) | X(t), X(t-1), …, X(0))
            = P(X(t+1) | X(t)), or
           P(Xn+1 | Xn, Xn-1, …, X0)
            = P(Xn+1 | Xn)
           Next state depends only on current
            state
           Future and past are conditionally
            independent given current

Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 110



        Higher-order Markov Chain
         Second-order Markov chain
           P(X(t+1) | X(t), X(t-1), …, X(0))
            = P(X(t+1) | X(t), X(t-1))


            X1              X2              X3              X4
           t=1             t=2             t=3             t=4

         Third-order, n-order

Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                         Unit - Uncertainty Inference (Continuous)                      p. 111



                    Stationary Process
         The probability distribution of X is
          independent of t
             X1              X2              X3              X4
            t=1             t=2             t=3             t=4




Fu Jen University    Department of Electrical Engineering               Wang, Yuan-Kai Copyright
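For a Markov chain, one concrete way to see such a t-independent distribution is to propagate a day-0 distribution through the weather transition matrix until it stops changing; the fixed point π satisfies π = πA. The C sketch below is an illustration under that assumption (it reuses the weather matrix from the earlier example; all names are assumed here).

    #include <stdio.h>
    #include <math.h>

    static const double A[3][3] = {       /* Pr(today | yesterday), rows sum to 1 */
        { 0.50, 0.375, 0.125 },
        { 0.25, 0.125, 0.625 },
        { 0.25, 0.375, 0.375 }
    };

    int main(void)
    {
        double p[3] = { 1.0, 0.0, 0.0 };  /* day 0: certainly sunny */
        for (int t = 0; t < 100; t++) {
            double q[3] = { 0.0, 0.0, 0.0 };
            for (int j = 0; j < 3; j++)   /* q = p * A */
                for (int i = 0; i < 3; i++)
                    q[j] += p[i] * A[i][j];
            double diff = fabs(q[0]-p[0]) + fabs(q[1]-p[1]) + fabs(q[2]-p[2]);
            p[0] = q[0]; p[1] = q[1]; p[2] = q[2];
            if (diff < 1e-12) break;      /* distribution no longer depends on t */
        }
        printf("stationary distribution: %.3f %.3f %.3f\n", p[0], p[1], p[2]);
        return 0;
    }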
王元凱                             Unit - Uncertainty Inference (Continuous)                      p. 112



        Doubly Stochastic Process
         Hidden variable

                    X1                    X2                     X3




                    Y1                   Y2                     Y3




Fu Jen University        Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 113



                    Example - Video
         Facial expression recognition
         TBU




Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 114



                       7. Reference
         Moments
           Hu, M. K., [1962]. “Visual Pattern Recognition by
            Moment Invariants.” IRE Trans. Info. Theory,
            vol. IT-8, pp. 179-187.
           Gonzalez, R.C. and R. E. Woods. [2001]. Digital
            Image Processing, 2nd, Prentice Hall, pp.659-
            660, 672-675.
           L. Rocha, et al., “Image Moments-Based
            Structuring and Tracking of Objects,”
            Proceedings of the XV Brazilian Symposium on
            Computer Graphics and Image Processing,
            2002.



Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 115



                           Reference
         Gaussian mixture
           C. M. Bishop. Neural Networks for Pattern
            Recognition. Oxford University Press, 1995.
           McLachlan and Peel, Finite Mixture Models,
            John Wiley & Sons, 2000.
           Rennie, “A Short Tutorial on Using EM with
            Mixture Models,” MIT Tech. Report, 2004.
            http://www.ai.mit.edu/
            people/jrennie/writing/mixtureEM.pdf.
           Tomasi, “Estimating Gaussian Mixture Density
            with EM: a Tutorial,” Duke University.
           Y. Weiss. Motion segmentation using EM – a
            short tutorial. 1997.
            http://www.cs.huji.ac.il/~yweiss/emTutorial.pdf

Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
王元凱                        Unit - Uncertainty Inference (Continuous)                      p. 116



                           Reference
         Linear Gaussian
           Russell & Norvig, Artificial Intelligence: a
            modern approach, 2nd, Prentice Hall, 2003.
            Sec. 14.3, pp.501-503
         Sampling
           Russell & Norvig, Artificial Intelligence: a modern
            approach, 2nd, Prentice Hall, 2003.
            Sec. 14.5, pp.511-518
         Stochastic process
           Peebles, Probability, Random Variables and
            Random Signal Principles, 4th ed., McGraw-Hill,
            2001.
Fu Jen University   Department of Electrical Engineering               Wang, Yuan-Kai Copyright
04 Uncertainty inference(continuous)

Weitere ähnliche Inhalte

Was ist angesagt?

Tensor representations in signal processing and machine learning (tutorial ta...
Tensor representations in signal processing and machine learning (tutorial ta...Tensor representations in signal processing and machine learning (tutorial ta...
Tensor representations in signal processing and machine learning (tutorial ta...Tatsuya Yokota
 
Principal Component Analysis for Tensor Analysis and EEG classification
Principal Component Analysis for Tensor Analysis and EEG classificationPrincipal Component Analysis for Tensor Analysis and EEG classification
Principal Component Analysis for Tensor Analysis and EEG classificationTatsuya Yokota
 
Biased normalized cuts
Biased normalized cutsBiased normalized cuts
Biased normalized cutsirisshicat
 
Lesson 12: Linear Approximation (Section 41 handout)
Lesson 12: Linear Approximation (Section 41 handout)Lesson 12: Linear Approximation (Section 41 handout)
Lesson 12: Linear Approximation (Section 41 handout)Matthew Leingang
 
An efficient approach to wavelet image Denoising
An efficient approach to wavelet image DenoisingAn efficient approach to wavelet image Denoising
An efficient approach to wavelet image Denoisingijcsit
 
Introduction to Common Spatial Pattern Filters for EEG Motor Imagery Classifi...
Introduction to Common Spatial Pattern Filters for EEG Motor Imagery Classifi...Introduction to Common Spatial Pattern Filters for EEG Motor Imagery Classifi...
Introduction to Common Spatial Pattern Filters for EEG Motor Imagery Classifi...Tatsuya Yokota
 
11.solution of a singular class of boundary value problems by variation itera...
11.solution of a singular class of boundary value problems by variation itera...11.solution of a singular class of boundary value problems by variation itera...
11.solution of a singular class of boundary value problems by variation itera...Alexander Decker
 
Lesson 12: Linear Approximation and Differentials (Section 21 handout)
Lesson 12: Linear Approximation and Differentials (Section 21 handout)Lesson 12: Linear Approximation and Differentials (Section 21 handout)
Lesson 12: Linear Approximation and Differentials (Section 21 handout)Matthew Leingang
 
Solution of a singular class of boundary value problems by variation iteratio...
Solution of a singular class of boundary value problems by variation iteratio...Solution of a singular class of boundary value problems by variation iteratio...
Solution of a singular class of boundary value problems by variation iteratio...Alexander Decker
 
On Continuous Approximate Solution of Ordinary Differential Equations
On Continuous Approximate Solution of Ordinary Differential EquationsOn Continuous Approximate Solution of Ordinary Differential Equations
On Continuous Approximate Solution of Ordinary Differential EquationsWaqas Tariq
 
Computing Loops
Computing LoopsComputing Loops
Computing LoopsAntonini
 
Lesson 26: Integration by Substitution (slides)
Lesson 26: Integration by Substitution (slides)Lesson 26: Integration by Substitution (slides)
Lesson 26: Integration by Substitution (slides)Matthew Leingang
 
JAISTサマースクール2016「脳を知るための理論」講義01 Single neuron models
JAISTサマースクール2016「脳を知るための理論」講義01 Single neuron modelsJAISTサマースクール2016「脳を知るための理論」講義01 Single neuron models
JAISTサマースクール2016「脳を知るための理論」講義01 Single neuron modelshirokazutanaka
 
Lesson 13: Exponential and Logarithmic Functions (Section 021 handout)
Lesson 13: Exponential and Logarithmic Functions (Section 021 handout)Lesson 13: Exponential and Logarithmic Functions (Section 021 handout)
Lesson 13: Exponential and Logarithmic Functions (Section 021 handout)Matthew Leingang
 
C:\Documents And Settings\Administrator\سطح المكتب\Cbse\[P
C:\Documents And Settings\Administrator\سطح المكتب\Cbse\[PC:\Documents And Settings\Administrator\سطح المكتب\Cbse\[P
C:\Documents And Settings\Administrator\سطح المكتب\Cbse\[Pguest9168dc
 
11.final paper 0040www.iiste.org call-for_paper-46
11.final paper  0040www.iiste.org call-for_paper-4611.final paper  0040www.iiste.org call-for_paper-46
11.final paper 0040www.iiste.org call-for_paper-46Alexander Decker
 
Electromagnetic Scattering from Objects with Thin Coatings.2016.05.04.02
Electromagnetic Scattering from Objects with Thin Coatings.2016.05.04.02Electromagnetic Scattering from Objects with Thin Coatings.2016.05.04.02
Electromagnetic Scattering from Objects with Thin Coatings.2016.05.04.02Luke Underwood
 

Was ist angesagt? (20)

11
1111
11
 
Tensor representations in signal processing and machine learning (tutorial ta...
Tensor representations in signal processing and machine learning (tutorial ta...Tensor representations in signal processing and machine learning (tutorial ta...
Tensor representations in signal processing and machine learning (tutorial ta...
 
Principal Component Analysis for Tensor Analysis and EEG classification
Principal Component Analysis for Tensor Analysis and EEG classificationPrincipal Component Analysis for Tensor Analysis and EEG classification
Principal Component Analysis for Tensor Analysis and EEG classification
 
Biased normalized cuts
Biased normalized cutsBiased normalized cuts
Biased normalized cuts
 
Lesson 12: Linear Approximation (Section 41 handout)
Lesson 12: Linear Approximation (Section 41 handout)Lesson 12: Linear Approximation (Section 41 handout)
Lesson 12: Linear Approximation (Section 41 handout)
 
An efficient approach to wavelet image Denoising
An efficient approach to wavelet image DenoisingAn efficient approach to wavelet image Denoising
An efficient approach to wavelet image Denoising
 
Introduction to Common Spatial Pattern Filters for EEG Motor Imagery Classifi...
Introduction to Common Spatial Pattern Filters for EEG Motor Imagery Classifi...Introduction to Common Spatial Pattern Filters for EEG Motor Imagery Classifi...
Introduction to Common Spatial Pattern Filters for EEG Motor Imagery Classifi...
 
11.solution of a singular class of boundary value problems by variation itera...
11.solution of a singular class of boundary value problems by variation itera...11.solution of a singular class of boundary value problems by variation itera...
11.solution of a singular class of boundary value problems by variation itera...
 
Lesson 12: Linear Approximation and Differentials (Section 21 handout)
Lesson 12: Linear Approximation and Differentials (Section 21 handout)Lesson 12: Linear Approximation and Differentials (Section 21 handout)
Lesson 12: Linear Approximation and Differentials (Section 21 handout)
 
Solution of a singular class of boundary value problems by variation iteratio...
Solution of a singular class of boundary value problems by variation iteratio...Solution of a singular class of boundary value problems by variation iteratio...
Solution of a singular class of boundary value problems by variation iteratio...
 
Phy xii
Phy xiiPhy xii
Phy xii
 
On Continuous Approximate Solution of Ordinary Differential Equations
On Continuous Approximate Solution of Ordinary Differential EquationsOn Continuous Approximate Solution of Ordinary Differential Equations
On Continuous Approximate Solution of Ordinary Differential Equations
 
Fuzzy and nn
Fuzzy and nnFuzzy and nn
Fuzzy and nn
 
Computing Loops
Computing LoopsComputing Loops
Computing Loops
 
Lesson 26: Integration by Substitution (slides)
Lesson 26: Integration by Substitution (slides)Lesson 26: Integration by Substitution (slides)
Lesson 26: Integration by Substitution (slides)
 
JAISTサマースクール2016「脳を知るための理論」講義01 Single neuron models
JAISTサマースクール2016「脳を知るための理論」講義01 Single neuron modelsJAISTサマースクール2016「脳を知るための理論」講義01 Single neuron models
JAISTサマースクール2016「脳を知るための理論」講義01 Single neuron models
 
Lesson 13: Exponential and Logarithmic Functions (Section 021 handout)
Lesson 13: Exponential and Logarithmic Functions (Section 021 handout)Lesson 13: Exponential and Logarithmic Functions (Section 021 handout)
Lesson 13: Exponential and Logarithmic Functions (Section 021 handout)
 
C:\Documents And Settings\Administrator\سطح المكتب\Cbse\[P
C:\Documents And Settings\Administrator\سطح المكتب\Cbse\[PC:\Documents And Settings\Administrator\سطح المكتب\Cbse\[P
C:\Documents And Settings\Administrator\سطح المكتب\Cbse\[P
 
11.final paper 0040www.iiste.org call-for_paper-46
11.final paper  0040www.iiste.org call-for_paper-4611.final paper  0040www.iiste.org call-for_paper-46
11.final paper 0040www.iiste.org call-for_paper-46
 
Electromagnetic Scattering from Objects with Thin Coatings.2016.05.04.02
Electromagnetic Scattering from Objects with Thin Coatings.2016.05.04.02Electromagnetic Scattering from Objects with Thin Coatings.2016.05.04.02
Electromagnetic Scattering from Objects with Thin Coatings.2016.05.04.02
 

Andere mochten auch (7)

Introduction Pp 1
Introduction Pp 1Introduction Pp 1
Introduction Pp 1
 
Introduction Pp 1
Introduction Pp 1Introduction Pp 1
Introduction Pp 1
 
Nac tech test benefits presentation
Nac tech test benefits presentationNac tech test benefits presentation
Nac tech test benefits presentation
 
Towards Embedded Computer Vision - New @ 2013
Towards Embedded Computer Vision - New @ 2013Towards Embedded Computer Vision - New @ 2013
Towards Embedded Computer Vision - New @ 2013
 
老師與教學助理的互動經驗分享 1010217
老師與教學助理的互動經驗分享 1010217老師與教學助理的互動經驗分享 1010217
老師與教學助理的互動經驗分享 1010217
 
From engneering to edgineering
From engneering to edgineeringFrom engneering to edgineering
From engneering to edgineering
 
Towards Embedded Computer Vision邁向嵌入式電腦視覺
Towards Embedded Computer Vision邁向嵌入式電腦視覺Towards Embedded Computer Vision邁向嵌入式電腦視覺
Towards Embedded Computer Vision邁向嵌入式電腦視覺
 

Ähnlich wie 04 Uncertainty inference(continuous)

(研究会輪読) Facial Landmark Detection by Deep Multi-task Learning
(研究会輪読) Facial Landmark Detection by Deep Multi-task Learning(研究会輪読) Facial Landmark Detection by Deep Multi-task Learning
(研究会輪読) Facial Landmark Detection by Deep Multi-task LearningMasahiro Suzuki
 
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...ijceronline
 
Vibration energy harvesting under uncertainty
Vibration energy harvesting under uncertaintyVibration energy harvesting under uncertainty
Vibration energy harvesting under uncertaintyUniversity of Glasgow
 
11.final paper -0047www.iiste.org call-for_paper-58
11.final paper -0047www.iiste.org call-for_paper-5811.final paper -0047www.iiste.org call-for_paper-58
11.final paper -0047www.iiste.org call-for_paper-58Alexander Decker
 
generalformulationofFiniteelementofmodel
generalformulationofFiniteelementofmodelgeneralformulationofFiniteelementofmodel
generalformulationofFiniteelementofmodelPiyushDhuri1
 
generalformulation.ppt
generalformulation.pptgeneralformulation.ppt
generalformulation.pptRajuRaju183149
 

Ähnlich wie 04 Uncertainty inference(continuous) (14)

03 Uncertainty inference(discrete)
03 Uncertainty inference(discrete)03 Uncertainty inference(discrete)
03 Uncertainty inference(discrete)
 
02 Statistics review
02 Statistics review02 Statistics review
02 Statistics review
 
01 Probability review
01 Probability review01 Probability review
01 Probability review
 
07 approximate inference in bn
07 approximate inference in bn07 approximate inference in bn
07 approximate inference in bn
 
06 exact inference in bn
06 exact inference in bn06 exact inference in bn
06 exact inference in bn
 
(研究会輪読) Facial Landmark Detection by Deep Multi-task Learning
(研究会輪読) Facial Landmark Detection by Deep Multi-task Learning(研究会輪読) Facial Landmark Detection by Deep Multi-task Learning
(研究会輪読) Facial Landmark Detection by Deep Multi-task Learning
 
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
 
Havlin
Havlin Havlin
Havlin
 
Vibration energy harvesting under uncertainty
Vibration energy harvesting under uncertaintyVibration energy harvesting under uncertainty
Vibration energy harvesting under uncertainty
 
V4502136139
V4502136139V4502136139
V4502136139
 
Q.M.pptx
Q.M.pptxQ.M.pptx
Q.M.pptx
 
11.final paper -0047www.iiste.org call-for_paper-58
11.final paper -0047www.iiste.org call-for_paper-5811.final paper -0047www.iiste.org call-for_paper-58
11.final paper -0047www.iiste.org call-for_paper-58
 
generalformulationofFiniteelementofmodel
generalformulationofFiniteelementofmodelgeneralformulationofFiniteelementofmodel
generalformulationofFiniteelementofmodel
 
generalformulation.ppt
generalformulation.pptgeneralformulation.ppt
generalformulation.ppt
 

Mehr von IEEE International Conference on Intelligent Information Hiding and Multimedia Signal Processing

Mehr von IEEE International Conference on Intelligent Information Hiding and Multimedia Signal Processing (9)

Computer Vision in the Age of IoT
Computer Vision in the Age of IoTComputer Vision in the Age of IoT
Computer Vision in the Age of IoT
 
2014/07/17 Parallelize computer vision by GPGPU computing
2014/07/17 Parallelize computer vision by GPGPU computing2014/07/17 Parallelize computer vision by GPGPU computing
2014/07/17 Parallelize computer vision by GPGPU computing
 
Parallel Vision by GPGPU/CUDA
Parallel Vision by GPGPU/CUDAParallel Vision by GPGPU/CUDA
Parallel Vision by GPGPU/CUDA
 
Markov Random Field (MRF)
Markov Random Field (MRF)Markov Random Field (MRF)
Markov Random Field (MRF)
 
08 probabilistic inference over time
08 probabilistic inference over time08 probabilistic inference over time
08 probabilistic inference over time
 
05 probabilistic graphical models
05 probabilistic graphical models05 probabilistic graphical models
05 probabilistic graphical models
 
Monocular Human Pose Estimation with Bayesian Networks
Monocular Human Pose Estimation with Bayesian NetworksMonocular Human Pose Estimation with Bayesian Networks
Monocular Human Pose Estimation with Bayesian Networks
 
Intelligent Video Surveillance with Cloud Computing
Intelligent Video Surveillance with Cloud ComputingIntelligent Video Surveillance with Cloud Computing
Intelligent Video Surveillance with Cloud Computing
 
Intelligent Video Surveillance and Sousveillance
Intelligent Video Surveillance and SousveillanceIntelligent Video Surveillance and Sousveillance
Intelligent Video Surveillance and Sousveillance
 

Kürzlich hochgeladen

Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104misteraugie
 
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991RKavithamani
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphThiyagu K
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfJayanti Pande
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
04 Uncertainty Inference (Continuous)

  • 8. 1.1 Univariate Gaussian
    - A univariate Gaussian is a Gaussian with only one variable.
    - Unit-variance Gaussian: p(x) = \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{x^2}{2}\right), with E[X] = 0 and Var[X] = 1.
  • 9. General Univariate Gaussian
    - p(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), with E[X] = \mu and Var[X] = \sigma^2.
    - It is also called the Normal distribution; its graph is the bell-shaped curve.
    - [figure: bell curve with \mu = 100, \sigma = 15]
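  To make the formula concrete, here is a minimal C sketch (in the same C style as the sampling examples later in this unit) that evaluates the general univariate Gaussian density; the test values \mu = 100, \sigma = 15 are taken from the figure, everything else is illustrative.

    #include <math.h>
    #include <stdio.h>

    /* Univariate Gaussian density N(x; mu, sigma^2) */
    double gaussian_pdf(double x, double mu, double sigma)
    {
        static const double PI = 3.14159265358979323846;
        double z = (x - mu) / sigma;
        return exp(-0.5 * z * z) / (sigma * sqrt(2.0 * PI));
    }

    int main(void)
    {
        printf("p(100) = %f\n", gaussian_pdf(100.0, 100.0, 15.0)); /* peak: 1/(15*sqrt(2*pi)) */
        printf("p(115) = %f\n", gaussian_pdf(115.0, 100.0, 15.0)); /* one standard deviation away */
        return 0;
    }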
  • 10. Normal Distribution
    - X ~ N(\mu, \sigma^2): "X is distributed as a Gaussian with parameters \mu and \sigma^2".
    - In the figure, X ~ N(100, 15^2).
  • 11. A Live Demo
    - \mu and \sigma are the two parameters of the Gaussian: \mu is the position parameter, \sigma is the shape parameter.
  • 12. Cumulative Distribution Function
    - F(x) = \int_{-\infty}^{x} p(t)\,dt = \int_{-\infty}^{x} \frac{1}{\sigma\sqrt{2\pi}} e^{-(t-\mu)^2/2\sigma^2}\,dt
    - [figures: density function and cumulative distribution function of the standardized normal variate, x-axis in standard deviations]
  • 13. The Error Function
    - Assume X ~ N(0,1) and define ERF(x) = P(X < x), the cumulative distribution of X:
      ERF(x) = \int_{-\infty}^{x} p(z)\,dz = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{z^2}{2}\right) dz
  • 14. Using The Error Function
    - For X ~ N(\mu, \sigma^2): P(X < x \mid \mu, \sigma^2) = ERF\!\left(\frac{x-\mu}{\sigma}\right)
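  As a hedged sketch of how this is used in code: the C99 library function erf() is the mathematician's error function, not the ERF (standard-normal CDF) defined on the previous slide, so the CDF has to be written as 0.5*(1 + erf(z/\sqrt{2})).

    #include <math.h>
    #include <stdio.h>

    /* P(X < x) for X ~ N(mu, sigma^2), i.e. ERF((x - mu)/sigma) in the slides' notation */
    double normal_cdf(double x, double mu, double sigma)
    {
        double z = (x - mu) / sigma;
        return 0.5 * (1.0 + erf(z / sqrt(2.0)));
    }

    int main(void)
    {
        printf("P(X < 100 | 100, 15^2) = %f\n", normal_cdf(100.0, 100.0, 15.0)); /* 0.5   */
        printf("P(X < 115 | 100, 15^2) = %f\n", normal_cdf(115.0, 100.0, 15.0)); /* ~0.84 */
        return 0;
    }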
  • 15. The Central Limit Theorem
    - If X_1, X_2, \ldots, X_n are i.i.d. continuous random variables, define z = f(x_1, \ldots, x_n) = \frac{1}{n}\sum_{i=1}^{n} x_i.
    - For large n, p(z) approaches a Gaussian with mean E[X_i] and variance Var[X_i]/n.
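  A quick empirical check of the theorem, as a sketch: average n Uniform(0,1) samples many times; the sample mean and variance of the averages should be close to E[X_i] = 0.5 and Var[X_i]/n = 1/(12n). The sample sizes below are arbitrary.

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const int n = 30, trials = 100000;
        double sum = 0.0, sumsq = 0.0;
        for (int t = 0; t < trials; t++) {
            double z = 0.0;
            for (int i = 0; i < n; i++)
                z += rand() / (double)RAND_MAX;   /* X_i ~ Uniform(0,1) */
            z /= n;                               /* z = (1/n) * sum of X_i */
            sum += z;
            sumsq += z * z;
        }
        double mean = sum / trials;
        double var  = sumsq / trials - mean * mean;
        printf("mean of z = %f (expect 0.5)\n", mean);
        printf("var  of z = %f (expect 1/(12n) = %f)\n", var, 1.0 / (12.0 * n));
        return 0;
    }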
  • 16. Example – Zero Mean Gaussian & Noise
    - A zero-mean Gaussian N(0, \sigma) is usually used as the noise model in images.
    - An image f(x,y) with noise N(0, \sigma) means f(x,y) = g(x,y) + N(0, \sigma).
  • 17. 1.2 Bivariate Gaussian
  • 18. The Formula
    - For a random vector X = (X_1, X_2)^T, if X ~ N(\mu, \Sigma):
      p(X) = \frac{1}{2\pi |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(X-\mu)^T \Sigma^{-1} (X-\mu)\right)
    - \mu = (\mu_1, \mu_2)^T, \quad \Sigma = \begin{pmatrix} \sigma_1^2 & \sigma_{12} \\ \sigma_{21} & \sigma_2^2 \end{pmatrix}
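  A sketch in C of the bivariate density, spelling out the 2x2 determinant and inverse; the mean and covariance used in main() are illustrative only.

    #include <math.h>
    #include <stdio.h>

    /* Bivariate Gaussian density with mean (mu1, mu2) and symmetric positive-definite
       covariance [[s11, s12], [s12, s22]]. */
    double gaussian2_pdf(double x1, double x2, double mu1, double mu2,
                         double s11, double s12, double s22)
    {
        static const double PI = 3.14159265358979323846;
        double det = s11 * s22 - s12 * s12;           /* |Sigma| */
        double d1 = x1 - mu1, d2 = x2 - mu2;
        /* quadratic form (X - mu)^T Sigma^{-1} (X - mu) */
        double q = (s22 * d1 * d1 - 2.0 * s12 * d1 * d2 + s11 * d2 * d2) / det;
        return exp(-0.5 * q) / (2.0 * PI * sqrt(det));
    }

    int main(void)
    {
        /* illustrative parameters: mu = (0, 0), Sigma = [[4, 1], [1, 2]] */
        printf("p(0,0) = %f\n", gaussian2_pdf(0, 0, 0, 0, 4, 1, 2));
        printf("p(1,1) = %f\n", gaussian2_pdf(1, 1, 0, 0, 4, 1, 2));
        return 0;
    }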
  • 19. Gaussian Parameters
    - \mu and \Sigma are the Gaussian's parameters: \mu is the position parameter, \Sigma is the shape parameter.
  • 20. Graphical Illustration
    - [figure: density surface p(X) over (X_1, X_2) with its principal axes; \mu fixes the position, \sigma_1 and \sigma_2 fix the shape]
  • 21. General Gaussian
    - General \mu and full \Sigma = \begin{pmatrix} \sigma_1^2 & \sigma_{12} \\ \sigma_{21} & \sigma_2^2 \end{pmatrix}: the equal-density contours are rotated ellipses.
  • 22. Axis-Aligned Gaussian
    - X_1 and X_2 are independent (uncorrelated): \Sigma = \begin{pmatrix} \sigma_1^2 & 0 \\ 0 & \sigma_2^2 \end{pmatrix}
    - [figures: axis-aligned elliptical contours for \sigma_1 > \sigma_2 and \sigma_1 < \sigma_2]
  • 23. Spherical Gaussian
    - \Sigma = \begin{pmatrix} \sigma^2 & 0 \\ 0 & \sigma^2 \end{pmatrix}: circular contours.
  • 24. Degenerated Gaussians
    - |\Sigma| = 0 (or very close to 0): \Sigma is singular, so \Sigma^{-1} in p(X) = \frac{1}{2\pi |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(X-\mu)^T \Sigma^{-1} (X-\mu)\right) does not exist and the density is concentrated on a lower-dimensional set.
  • 25. Example – Clustering (1/4)
    - Given a set of data points in a 2D space, find the Gaussian distribution of those points.
  • 26. Example – Clustering (2/4)
    - A 2D example: face verification of a person using 2 features (size and length).
    - We get 1000 face images of the person; each image gives 2 features, i.e., one data point in the 2D space.
    - Find the mean and range of the 2 features.
  • 27. Example – Clustering (3/4)
    - [figure: scatter plot where x and y are dependent]
  • 28. Example – Clustering (4/4)
    - [figures: scatter plots where x and y are independent, and where x and y are almost dependent]
  • 29. 1.3 Multivariate Gaussian
    - For a random vector X = (X_1, X_2, \ldots, X_m)^T, if X ~ N(\mu, \Sigma):
      p(x) = \frac{1}{(2\pi)^{m/2} |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)
    - \mu = (\mu_1, \ldots, \mu_m)^T, \quad \Sigma = \begin{pmatrix} \sigma_1^2 & \sigma_{12} & \cdots & \sigma_{1m} \\ \sigma_{21} & \sigma_2^2 & \cdots & \sigma_{2m} \\ \vdots & & \ddots & \vdots \\ \sigma_{m1} & \sigma_{m2} & \cdots & \sigma_m^2 \end{pmatrix}
  • 30. Gaussian Parameters
    - \mu and \Sigma are the Gaussian's parameters: \mu is the position parameter, \Sigma is the shape parameter.
  • 31. Axis-Aligned Gaussians
    - \Sigma = \mathrm{diag}(\sigma_1^2, \sigma_2^2, \ldots, \sigma_m^2): the components are uncorrelated and the contours are axis-aligned ellipsoids.
  • 32. Spherical Gaussians
    - \Sigma = \sigma^2 I: all components share the same variance and the contours are spheres.
  • 33. Degenerate Gaussians
    - |\Sigma| = 0: \Sigma is singular and the density formula does not apply directly.
  • 34. Example – 3-Variate Gaussian (1/2)
    - p(x) = \frac{1}{(2\pi)^{3/2} |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right), with \mu = (\mu_1, \mu_2, \mu_3)^T and a full 3x3 covariance \Sigma.
  • 35. Example – 3-Variate Gaussian (2/2)
    - Assume the simple case \sigma_{ij} = 0 for i \neq j, i.e., \Sigma = \mathrm{diag}(\sigma_1^2, \sigma_2^2, \sigma_3^2).
    - What does p(x) become? (It factorizes into a product of three univariate Gaussians, one per component.)
  • 36. 2. Gaussian Mixture Model
    - What is a Gaussian mixture: two or more Gaussians are mixed to form one pdf.
    - Why a Gaussian mixture: a single Gaussian is not always enough. The distribution of your data is usually assumed to be one Gaussian (also called a unimodal Gaussian), but sometimes the data distribution is not unimodal.
  • 37. Why Is Unimodal Gaussian not Enough (1/3)
    - A univariate example: the histogram of an image, which typically has several peaks.
  • 38. Why Is Unimodal Gaussian not Enough (2/3)
    - A bivariate example: one Gaussian pdf cannot capture several separated clusters.
  • 39. Why Is Unimodal Gaussian not Enough (3/3)
    - To solve it: a mixture of three Gaussians.
  • 40. Gaussian Mixture Model (GMM)
    - 2.1 Combine Multiple Gaussians
    - 2.2 Formula of GMM
    - 2.3 Parameter Estimation of GMM
  • 41. 2.1 Combine Multiple Gaussians
    - Unimodal Gaussian (single Gaussian): p(x) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)
    - Multi-modal Gaussians (multiple Gaussians): a sum of such terms, one with (\mu_1, \Sigma_1), one with (\mu_2, \Sigma_2), and so on.
  • 42. Combine 2 Gaussians (1/4)
    - Suppose two Gaussians in one dimension: p(x) = p(x \mid \mu_1, \sigma_1) + p(x \mid \mu_2, \sigma_2), where
      p(x \mid \mu_i, \sigma_i) = \frac{1}{\sigma_i\sqrt{2\pi}} \exp\left(-\frac{(x-\mu_i)^2}{2\sigma_i^2}\right), \quad i = 1, 2.
    - Writing p(x) = p(x \mid C_1) + p(x \mid C_2), each component satisfies \int p(x \mid C_i)\,dx = 1.
  • 43. 1-D Example (2/4)
    - Parameters: \mu_1 = 4, \sigma_1 = 0.3 (\omega_1 = 0.6); \mu_2 = 6.4, \sigma_2 = 0.5 (\omega_2 = 0.4).
    - p(x) = \frac{1}{0.3\sqrt{2\pi}} \exp\left(-\frac{(x-4)^2}{2 \cdot 0.3^2}\right) + \frac{1}{0.5\sqrt{2\pi}} \exp\left(-\frac{(x-6.4)^2}{2 \cdot 0.5^2}\right)
    - Given x = 5: p(5) = \frac{1}{0.3\sqrt{2\pi}} \exp\left(-\frac{(5-4)^2}{2 \cdot 0.3^2}\right) + \frac{1}{0.5\sqrt{2\pi}} \exp\left(-\frac{(5-6.4)^2}{2 \cdot 0.5^2}\right)
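  A minimal C sketch that reproduces the computation on this slide, with the two components \mu_1 = 4, \sigma_1 = 0.3 and \mu_2 = 6.4, \sigma_2 = 0.5 summed without weights, exactly as in the slide's formula.

    #include <math.h>
    #include <stdio.h>

    static double gauss(double x, double mu, double sigma)
    {
        static const double PI = 3.14159265358979323846;
        double z = (x - mu) / sigma;
        return exp(-0.5 * z * z) / (sigma * sqrt(2.0 * PI));
    }

    int main(void)
    {
        double x = 5.0;
        double p = gauss(x, 4.0, 0.3) + gauss(x, 6.4, 0.5);  /* unweighted sum, as on the slide */
        printf("p(x=5) = %f\n", p);                          /* roughly 0.021 */
        return 0;
    }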
  • 44. Combine 2 Gaussians (3/4)
    - [figure: the two Gaussians N(0,1) and N(3,1), and their sum p(x) = p(x \mid 0,1) + p(x \mid 3,1)]
  • 45. Combine 2 Gaussians (4/4)
    - [figure: the two Gaussians N(0,1) and N(3,4), and their sum p(x) = p(x \mid 0,1) + p(x \mid 3,4)]
  • 46. Combine 2 Gaussians with Weights (1/3)
    - With p(x) = p(x \mid C_1) + p(x \mid C_2) and \int p(x \mid C_i)\,dx = 1, we get \int p(x)\,dx = 1 + 1 = 2, so p(x) is not a valid density.
    - If p(x) = \frac{1}{2} p(x \mid C_1) + \frac{1}{2} p(x \mid C_2), then \int p(x)\,dx = \frac{1}{2} + \frac{1}{2} = 1.
    - In general, if p(x) = \omega_1 p(x \mid C_1) + \omega_2 p(x \mid C_2) with \omega_1 + \omega_2 = 1, then \int p(x)\,dx = 1.
  • 47. Combine 2 Gaussians with Weights (2/3)
    - [figure: N(0,1), N(3,1) and the mixture p(x) = \frac{1}{2} p(x \mid 0,1) + \frac{1}{2} p(x \mid 3,1)]
  • 48. Combine 2 Gaussians with Weights (3/3)
    - [figure: N(0,1), N(3,4) and the mixture p(x) = \frac{1}{2} p(x \mid 0,1) + \frac{1}{2} p(x \mid 3,4)]
  • 49. Combine 2 Gaussians with Different Mean Distances (1/2)
    - Suppose two 1-D Gaussians, p(x) = p(x \mid \mu_1, \sigma_1) + p(x \mid \mu_2, \sigma_2), with \sigma_1 = \sigma_2 = 1, and let \Delta = |\mu_1 - \mu_2| be the distance between the means.
  • 50. Combine 2 Gaussians with Different Mean Distances (2/2)
    - [figures: the resulting mixtures for \Delta = 1, 2, 3, 4; the two modes separate as \Delta grows]
  • 51. Combine 2 Gaussians with Different Weights (1/2)
    - Suppose two 1-D Gaussians with unequal weights, p(x) = 0.75\, p(x \mid \mu_1, \sigma_1) + 0.25\, p(x \mid \mu_2, \sigma_2), again with \sigma_1 = \sigma_2 = 1 and \Delta = |\mu_1 - \mu_2|.
  • 52. Combine 2 Gaussians with Different Weights (2/2)
    - [figures: the resulting mixtures for \Delta = 1, 2, 3, 4]
  • 53. 2D Gaussian Combination (1/2)
    - p(x \mid C_1): \mu_1 = (0, 0), \Sigma_1 = \begin{pmatrix} 4 & 0 \\ 0 & 4 \end{pmatrix}; \quad p(x \mid C_2): \mu_2 = (0, 3), \Sigma_2 = \begin{pmatrix} 4 & 0 \\ 0 & 4 \end{pmatrix}
  • 54. 2D Gaussian Combination (2/2)
    - [figure: the combined surface p(x) = p(x \mid C_1) + p(x \mid C_2)]
  • 55. More Gaussians
    - As the number of Gaussians M increases, a mixture can represent essentially any density by adjusting M and the \omega_i, \mu_i, \Sigma_i of each Gaussian:
      p(x) = \sum_{i=1}^{M} \omega_i\, p(x \mid C_i) = \sum_{i=1}^{M} \omega_i\, p(x \mid \mu_i, \Sigma_i)
  • 56.-58. [figures: five Gaussian component models and the resulting mixture model p(x)]
  • 59. 2.2 Formula of GMM
    - A Gaussian mixture model (GMM) is a linear combination of M Gaussians: p(x) = \sum_{i=1}^{M} \omega_i\, p(x \mid C_i)
    - p(x) is the density at a point x; x may be, e.g., (Cb, Cr) or (R, G, B); \omega_i is the mixing parameter (weight); p(x \mid C_i) is a Gaussian function.
  • 60. Comparison of Formula
    - Gaussian: p(x) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)
    - GMM: p(x) = \sum_{i=1}^{M} \omega_i\, p(x \mid C_i) = \sum_{i=1}^{M} \frac{\omega_i}{(2\pi)^{n/2} |\Sigma_i|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu_i)^T \Sigma_i^{-1} (x-\mu_i)\right)
    - In a GMM, p(x \mid C_i) is the probability of x under the i-th Gaussian component.
  • 61. Two Constraints of GMM
    - \sum_{i=1}^{M} \omega_i = 1, and 0 \le \omega_i \le 1.
    - Each p(x \mid C_i) is normalized, i.e., \int p(x \mid C_i)\,dx = 1.
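  Putting the GMM formula and the two constraints together, a sketch of evaluating a 1-D GMM density in C; the three components below are illustrative (not taken from the slides), and their weights sum to 1 as required.

    #include <math.h>
    #include <stdio.h>

    #define M 3   /* number of Gaussian components */

    static double gauss(double x, double mu, double sigma)
    {
        static const double PI = 3.14159265358979323846;
        double z = (x - mu) / sigma;
        return exp(-0.5 * z * z) / (sigma * sqrt(2.0 * PI));
    }

    /* p(x) = sum_i w[i] * N(x; mu[i], sigma[i]^2), with sum_i w[i] = 1 */
    double gmm_pdf(double x, const double w[M], const double mu[M], const double sigma[M])
    {
        double p = 0.0;
        for (int i = 0; i < M; i++)
            p += w[i] * gauss(x, mu[i], sigma[i]);
        return p;
    }

    int main(void)
    {
        double w[M]     = {0.5, 0.3, 0.2};   /* mixing weights (sum to 1) */
        double mu[M]    = {0.0, 3.0, 6.0};
        double sigma[M] = {1.0, 0.5, 2.0};
        for (double x = -2.0; x <= 8.0; x += 2.0)
            printf("p(%4.1f) = %f\n", x, gmm_pdf(x, w, mu, sigma));
        return 0;
    }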
  • 62. The Problem (1/5)
    - We now know that any density can be obtained by a Gaussian mixture, and given the mixture function we can plot its density.
    - In practice the situation is reversed: we have a set of data points \Theta = \{x_j \in R^n, j = 1, \ldots, N\} with unknown density. Can we find the mixture function of these data points?
  • 63. The Problem (2/5)
    - [figure: histogram of \Theta = \{x_j\}, the fitted mixture model, and its five Gaussian component models]
  • 64. The Problem (3/5)
    - Finding the mixture function means estimating its parameters: the mixing parameters \omega_i, the Gaussian component densities (mean vectors \mu_i and covariance matrices \Sigma_i), and the number of components M.
  • 65. The Problem (4/5)
    - Number of parameters:

      A Gaussian             1D    2D    3D
      \mu                    1     2     3
      \Sigma                 1     3     6
      Total                  2     5     9

      GMM with M Gaussians   1D    2D    3D
      \omega                 M     M     M
      \mu                    1M    2M    3M
      \Sigma                 1M    3M    6M
      Total                  3M    6M    10M
  • 66. The Problem (5/5)
    - Given \{x_j \in R^n, j = 1, \ldots, N\}, the equations
      p(x_1) = \sum_{i=1}^{M} \omega_i\, p(x_1 \mid \mu_i, \Sigma_i), \quad p(x_2) = \sum_{i=1}^{M} \omega_i\, p(x_2 \mid \mu_i, \Sigma_i), \quad \ldots, \quad p(x_N) = \sum_{i=1}^{M} \omega_i\, p(x_N \mid \mu_i, \Sigma_i)
      must be solved for \omega_i, \mu_i, \Sigma_i. This is also called parameter estimation.
    - Usually \theta_i denotes (\mu_i, \Sigma_i), so p(x) = \sum_{i=1}^{M} \omega_i\, p(x \mid \theta_i).
  • 67. 2.3 Parameter Estimation
    - Given a fixed M and data \Theta = \{x_j \in R^n, j = 1, \ldots, N\}, we may compute the histogram of \Theta.
    - We want to find the parameters \theta = (\omega_1, \ldots, \omega_M, \mu_1, \ldots, \mu_M, \Sigma_1, \ldots, \Sigma_M) that best fit the histogram of the data.
    - Examples: a 1-D example (x_j \in R^1) and two 2-D examples (x_j \in R^2).
  • 68. 1-D Example
    - \Theta = \{1.5, -0.2, 1.4, 1.8, \ldots\}; histogram counts such as N_{x=-2.5} = 10, \ldots, N_{x=1.5} = 40, \ldots
    - Parameter estimation gives \mu = 1.5, \sigma = 1.3, i.e., p(x) = \frac{1}{1.3\sqrt{2\pi}} \exp\left(-\frac{(x-1.5)^2}{2 \cdot 1.3^2}\right).
  • 69. [figure: mixture fits of the same data with M = 7 (Izenman & Sommer), M = 7 (Basford et al.), M = 3, and M = 4 with equal variances]
  • 70. 2-D Example
    - \Theta = \{(3.45, 4.02), \ldots\}: anemia patients and controls.
    - [figure: scatter plot of red blood cell volume vs. red blood cell hemoglobin concentration]
  • 71.-76. [figures: EM iterations 1 (initialization), 3, 10, 15 and 25 on the anemia data; the two Gaussian components gradually separate the clusters]
  • 77. [figure: log-likelihood as a function of EM iterations, increasing monotonically from about 400 to 490 over 25 iterations]
  • 78. [figure: anemia data with labels, showing the control group and the anemia group]
  • 79. Parameter Estimation of GMM
    - Two methods: Maximum Likelihood Estimation (MLE) and Expectation Maximization (EM).
    - Discussed in Lecture Note 24.
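  EM itself is the subject of Lecture Note 24; purely as a hedged sketch (not that note's implementation), this is what one EM iteration looks like for a 1-D, two-component GMM: the E-step computes the responsibility of each component for each point, and the M-step re-estimates weights, means and standard deviations from the weighted data.

    #include <math.h>

    static double gauss(double x, double mu, double sigma)
    {
        static const double PI = 3.14159265358979323846;
        double z = (x - mu) / sigma;
        return exp(-0.5 * z * z) / (sigma * sqrt(2.0 * PI));
    }

    /* One EM iteration for a 2-component 1-D GMM on data x[0..n-1];
       w, mu, sigma (arrays of length 2) are updated in place. */
    void em_step(const double *x, int n, double w[2], double mu[2], double sigma[2])
    {
        double nk[2] = {0, 0}, sx[2] = {0, 0}, sxx[2] = {0, 0};
        for (int j = 0; j < n; j++) {
            /* E-step: responsibility of component 0 for point x[j] */
            double p0 = w[0] * gauss(x[j], mu[0], sigma[0]);
            double p1 = w[1] * gauss(x[j], mu[1], sigma[1]);
            double r  = p0 / (p0 + p1);
            /* accumulate weighted sufficient statistics */
            nk[0]  += r;            nk[1]  += 1.0 - r;
            sx[0]  += r * x[j];     sx[1]  += (1.0 - r) * x[j];
            sxx[0] += r * x[j] * x[j];
            sxx[1] += (1.0 - r) * x[j] * x[j];
        }
        for (int i = 0; i < 2; i++) {
            /* M-step: re-estimate weight, mean and standard deviation */
            w[i]     = nk[i] / n;
            mu[i]    = sx[i] / nk[i];
            sigma[i] = sqrt(sxx[i] / nk[i] - mu[i] * mu[i]);
        }
    }

  Starting from an initial guess and calling em_step repeatedly increases the log-likelihood, which is exactly the monotone curve shown in the earlier figure.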
  • 80. 3. Linear Gaussian
    - P(X) can follow a distribution (e.g., Gaussian, uniform, exponential, Gaussian mixture).
    - P(Y|X) is a conditional probability. Can the conditional probability follow a distribution as well?
    - A linear Gaussian describes the distribution of the conditional probability as a Gaussian, and thereby the dependence between random variables.
  • 81. From Gaussian to Linear Gaussian (1/2)
    - If two variables x and y have a linear relationship, we say y = ax + b, where a and b are parameters; y then becomes a random variable.
    - If y belongs to a Gaussian distribution, y ~ N(\mu, \sigma) = N(y; \mu, \sigma).
    - Since y = ax + b, we have ax + b ~ N(\mu, \sigma) = N(ax + b, \sigma), and we can write N(y; ax + b, \sigma).
  • 82. Linear Gaussian = N(y; ax+b, \sigma)
    - The meaning of the linear Gaussian N(y; ax + b, \sigma): when y = ax + b, N(y) takes its maximum; values y \neq ax + b occur with lower probability, decaying as in a Gaussian distribution around the mean \mu = ax + b.
  • 83. P(y|x) = N(y; ax+b, \sigma)
    - P(X_j = y \mid X_i = x) = P(y \mid x). If P(y \mid x) = N(y; ax + b, \sigma), then X_j varies linearly with X_i, with Gaussian uncertainty and a fixed standard deviation:
      P(X_j = y \mid X_i = x) = N(y; ax + b, \sigma) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(y - (ax + b))^2}{2\sigma^2}\right)
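  A sketch of the linear-Gaussian conditional density in C; the values of a, b and sigma in main() are placeholders, not taken from the slides.

    #include <math.h>
    #include <stdio.h>

    /* P(Xj = y | Xi = x) = N(y; a*x + b, sigma) */
    double linear_gaussian(double y, double x, double a, double b, double sigma)
    {
        static const double PI = 3.14159265358979323846;
        double d = y - (a * x + b);
        return exp(-0.5 * d * d / (sigma * sigma)) / (sigma * sqrt(2.0 * PI));
    }

    int main(void)
    {
        double a = 2.0, b = 1.0, sigma = 0.5;   /* illustrative parameters */
        /* the density peaks where y = a*x + b and decays like a Gaussian around that line */
        printf("P(y=7 | x=3) = %f\n", linear_gaussian(7.0, 3.0, a, b, sigma));  /* y = 2*3+1 exactly      */
        printf("P(y=8 | x=3) = %f\n", linear_gaussian(8.0, 3.0, a, b, sigma));  /* one unit off the line  */
        return 0;
    }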
  • 84. Example 1 (1/2)
    - Illumination change of an image: g(x,y) = a f(x,y) + b, or g = af + b.
    - Two kinds of illumination change with the same a, b: a real light change (g_1) and a change made by image processing software (g_2).
    - Is g_1 = g_2? No. Because of noise, g_1 is a random variable.
  • 85. Example 1 (2/2)
    - Since g_1 is a random variable, it undergoes noise: g_1 ~ af + b + N(0, \sigma) = N(af + b, \sigma), or equivalently P(g_1 \mid f) = N(af + b, \sigma).
  • 86. Extension: Linear Transform
    - X and Y are two vectors, Y = (y_1, y_2, \ldots, y_m)^T and X = (x_1, x_2, \ldots, x_m)^T, that are linearly dependent: Y = AX + B (a linear transform).
    - If Y becomes a random vector, P(Y \mid X) = N(Y; AX + B, \Sigma).
  • 87. Example 2 (1/5)
    - Illumination change of a color image: G = AF + B, with F(x,y) = (r_F, g_F, b_F)^T and G(x,y) = (r_G, g_G, b_G)^T:
      \begin{pmatrix} r_G \\ g_G \\ b_G \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix} \begin{pmatrix} r_F \\ g_F \\ b_F \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \\ b_3 \end{pmatrix}
  • 88. Example 2 (2/5)
    - A simple case of G = AF + B with a_{ij} = 0 for i \neq j, e.g., A = \mathrm{diag}(2, 3, 1), B = (1, 0, 0)^T.
    - Then r_G = a_{11} r_F + b_1, g_G = a_{22} g_F + b_2, b_G = a_{33} b_F + b_3.
  • 89. Example 2 (3/5)
    - If G has noise, G_1(x,y) is a random vector: G_1 ~ AF + B + N(0, \Sigma) = N(AF + B, \Sigma), or P(G_1 \mid F) = N(AF + B, \Sigma), with a full 3x3 covariance \Sigma = (\sigma_{ij}).
  • 90. Example 2 (4/5)
    - G_1 ~ N(AF + B, \Sigma), where N is a 3-variate Gaussian p(X) = \frac{1}{(2\pi)^{3/2} |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(X - \mu)^T \Sigma^{-1} (X - \mu)\right) with mean \mu = AF + B = (a_{11} r_F + a_{12} g_F + a_{13} b_F + b_1, \; a_{21} r_F + a_{22} g_F + a_{23} b_F + b_2, \; a_{31} r_F + a_{32} g_F + a_{33} b_F + b_3)^T.
  • 91. Example 2 (5/5)
    - Assume the simple case a_{ij} = 0 and \sigma_{ij} = 0 for i \neq j (both A and \Sigma diagonal). Then AF + B = (a_{11} r_F + b_1, \; a_{22} g_F + b_2, \; a_{33} b_F + b_3)^T and the channels decouple:
      P(r_G \mid r_F) = N(a_{11} r_F + b_1, \sigma_1), \quad P(g_G \mid g_F) = N(a_{22} g_F + b_2, \sigma_2), \quad P(b_G \mid b_F) = N(a_{33} b_F + b_3, \sigma_3)
  • 92. 4. Sampling
    - Generate N samples S from P(X), S = (s_1, s_2, \ldots, s_N).
    - X can be a random variable or a random vector: if X = x, then s_i = x_i; if X = (x_1, x_2, \ldots, x_n), then s_i = (x_1^i, x_2^i, \ldots, x_n^i).
    - Why generate N samples? To estimate probabilities by frequencies: P(X = x) \approx \frac{\#\text{samples with } X = x}{N}.
  • 93. Example (1/2)
    - A simple example: a coin toss. Tossing the coin gives head or tail, so Coin is a Boolean random variable (a random variable, not a random vector).
    - If the coin is unbiased, head and tail have equal probability: the prior distribution is P(Coin) = <0.5, 0.5> (uniform). But we do not know whether the coin is unbiased.
  • 94. Example (2/2)
    - Sampling in this example means flipping the coin N times, e.g., N = 1000.
    - Ideally: 500 heads and 500 tails, so P(head) = 500/1000 = 0.5 and P(tail) = 500/1000 = 0.5.
    - In practice we might get 501 heads and 499 tails: P(head) = 501/1000 = 0.501, P(tail) = 499/1000 = 0.499.
    - By sampling, we can estimate the probability distribution.
  • 95. Sampling (Math)
    - For a Boolean random variable X, P(X) is the prior distribution <P(x), P(\neg x)>.
    - Use a sampling algorithm to generate N samples; let N(x) be the number of samples in which x is true and N(\neg x) the number in which x is false. Then
      \frac{N(x)}{N} = \hat{P}(x), \quad \frac{N(\neg x)}{N} = \hat{P}(\neg x), \quad \lim_{N \to \infty} \frac{N(x)}{N} = P(x), \quad \lim_{N \to \infty} \frac{N(\neg x)}{N} = P(\neg x)
  • 96. Sampling Algorithm
    - A sampling algorithm generates samples from a known probability distribution and estimates the approximate probability \hat{P}.
    - How does a sampling algorithm generate a sample? In C/C++: rand(), which returns 0 ~ RAND_MAX (at least 32767); in Java: random().
  • 97. A Sampling Algorithm of the Coin Toss
    - Flip the coin 1000 times:
        int coin_face;
        for (i = 0; i < 1000; i++) {
            if (rand() > RAND_MAX/2) coin_face = 1;
            else coin_face = 0;
        }
    - What kind of distribution does this produce?
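  The loop above only flips the coin; to answer the question it also has to count the outcomes. A corrected sketch of the same idea (the same rand()-based flip, plus tallies and the frequency estimate of slide 95):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int heads = 0, tails = 0;
        for (int i = 0; i < 1000; i++) {
            if (rand() > RAND_MAX / 2)   /* head with probability about 0.5 */
                heads++;
            else
                tails++;
        }
        printf("P(head) ~ %f\n", heads / 1000.0);
        printf("P(tail) ~ %f\n", tails / 1000.0);
        return 0;
    }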
  • 98. Sampling Algorithms for Many R.V.s (1/2)
    - Three Boolean random variables X, Y, Z; (X=1, Y=0, Z=0) is called a sample.
        int X, Y, Z;
        for (i = 0; i < 1000; i++) {
            if (rand() > RAND_MAX/2) X = 1; else X = 0;
            if (rand() > RAND_MAX/2) Y = 1; else Y = 0;
            if (rand() > RAND_MAX/2) Z = 1; else Z = 0;
        }
    - X, Y, Z all have uniform distributions.
  • 99. Sampling Algorithms for Many R.V.s (2/2)
    - Now Y and Z are not uniform: P(Y) = <0.67, 0.33> and P(Z) = <0.75, 0.25>, since the thresholds RAND_MAX/3 and RAND_MAX/4 make Y true with probability 2/3 and Z true with probability 3/4.
        int X, Y, Z;
        for (i = 0; i < 1000; i++) {
            if (rand() > RAND_MAX/2) X = 1; else X = 0;
            if (rand() > RAND_MAX/3) Y = 1; else Y = 0;
            if (rand() > RAND_MAX/4) Z = 1; else Z = 0;
        }
  • 100. Various Sampling Algorithms
    - For a more complex P(X), we need more complex sampling algorithms (stochastic simulation):
    - Direct sampling
    - Rejection sampling: reject samples disagreeing with the evidence
    - Likelihood weighting: use the evidence to weight samples
    - Markov chain Monte Carlo (MCMC), also called Gibbs sampling
  • 101. Example
    - Approximate reasoning for Bayesian networks (TBU).
  • 102. 5. Markov Chain
    - Markov assumption: each state at time t depends only on the state at time t-1 (Andrei Andreyevich Markov).
    - Example: the weather today depends only on the weather of yesterday.
    - [figure: chain X1 -> X2 -> X3 at t = 1, 2, 3]
  • 103. Deterministic vs. Non-Deterministic
    - Deterministic patterns: traffic lights, FSMs, ...
    - Non-deterministic patterns: weather, speech, tracking, ...
  • 104. Example – Weather Prediction (1/2)
    - Only 3 possible weather states: Sunny, Cloudy, Rainy.
    - Transition matrix A = Pr(today | yesterday); each row sums to 1:

      Weather yesterday \ today   Sunny   Cloudy   Rainy
      Sunny                       0.5     0.375    0.125
      Cloudy                      0.25    0.125    0.625
      Rainy                       0.25    0.375    0.375
  • 105. Example – Weather Prediction (2/2)
    - Suppose we know the weather of the previous days: t=1 rainy, t=2 sunny, t=3 sunny, t=4 cloudy (R S S C).
    - Predict the weather of day t=5: X5 = ?
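  A sketch of the one-step prediction in C, using the transition matrix of the previous slide (row order Sunny, Cloudy, Rainy; treat the numbers as illustrative). Since the weather at t=4 is Cloudy, the predicted distribution P(X5) is simply the Cloudy row of A.

    #include <stdio.h>

    enum { SUNNY, CLOUDY, RAINY, NSTATES };

    int main(void)
    {
        /* A[yesterday][today] = Pr(today | yesterday); each row sums to 1 */
        double A[NSTATES][NSTATES] = {
            {0.500, 0.375, 0.125},   /* from Sunny  */
            {0.250, 0.125, 0.625},   /* from Cloudy */
            {0.250, 0.375, 0.375}    /* from Rainy  */
        };
        const char *name[NSTATES] = {"Sunny", "Cloudy", "Rainy"};

        int x4 = CLOUDY;   /* observed weather at t = 4 */
        printf("Prediction for t=5 given t=4 = %s:\n", name[x4]);
        for (int s = 0; s < NSTATES; s++)
            printf("  P(X5 = %-6s) = %.3f\n", name[s], A[x4][s]);
        return 0;
    }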
  • 106. 6. Stochastic Process
    - Also called a random process: a collection of random variables.
    - For each t in the index set T, X(t) is a random variable. Usually t refers to time, and X(t) is the state of the process at time t. X(t) can be discrete or continuous.
  • 107. Graphical View of Stochastic Process
    - [figure]
  • 108. Statistics of Stochastic Process
    - Mean of X(t); variance and standard deviation of X(t); frequency distribution of X(t), P(X); conditional probability of X(t), P(X(t) | X(t-1)).
  • 109. Markov Chain
    - A Markov chain is a stochastic process where P(X(t+1) | X(t), X(t-1), ..., X(0)) = P(X(t+1) | X(t)), i.e., P(X_{n+1} | X_n, X_{n-1}, ..., X_0) = P(X_{n+1} | X_n).
    - The next state depends only on the current state; the future and the past are conditionally independent given the current state.
  • 110. Higher-order Markov Chain
    - Second-order Markov chain: P(X(t+1) | X(t), X(t-1), ..., X(0)) = P(X(t+1) | X(t), X(t-1)).
    - Third-order and n-th-order chains are defined analogously.
  • 111. Stationary Process
    - The probability distribution of X is independent of t.
  • 112. Doubly Stochastic Process
    - There is a hidden variable: a hidden chain X1 -> X2 -> X3 with observations Y1, Y2, Y3.
  • 113. Example – Video
    - Facial expression recognition (TBU).
  • 114. 7. Reference
    - Moments
      - Hu, M. K. [1962]. "Visual Pattern Recognition by Moment Invariants," IRE Trans. Info. Theory, vol. IT-8, pp. 179-187.
      - Gonzalez, R. C. and R. E. Woods [2001]. Digital Image Processing, 2nd ed., Prentice Hall, pp. 659-660, 672-675.
      - L. Rocha, et al., "Image Moments-Based Structuring and Tracking of Objects," Proceedings of the XV Brazilian Symposium on Computer Graphics and Image Processing, 2002.
  • 115. Reference
    - Gaussian mixture
      - C. M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, 1995.
      - McLachlan and Peel, Finite Mixture Models, John Wiley & Sons.
      - Rennie, "A Short Tutorial on Using EM with Mixture Models," MIT Tech. Report, 2004. http://www.ai.mit.edu/people/jrennie/writing/mixtureEM.pdf
      - Tomasi, "Estimating Gaussian Mixture Density with EM: a tutorial," Duke University.
      - Y. Weiss, "Motion segmentation using EM – a short tutorial," 1997. http://www.cs.huji.ac.il/~yweiss/emTutorial.pdf
  • 116. Reference
    - Linear Gaussian
      - Russell & Norvig, Artificial Intelligence: A Modern Approach, 2nd ed., Prentice Hall, 2003, Sec. 14.3, pp. 501-503.
    - Sampling
      - Russell & Norvig, Artificial Intelligence: A Modern Approach, 2nd ed., Prentice Hall, 2003, Sec. 14.5, pp. 511-518.
    - Stochastic process
      - Peebles, Probability, Random Variables and Random Signal Principles, 4th ed., McGraw Hill, 2001.