Course Calendar
Class DATE Contents
1 Sep. 26 Course information & Course overview
2 Oct. 4 Bayes Estimation
3 〃 11 Classical Bayes Estimation - Kalman Filter -
4 〃 18 Simulation-based Bayesian Methods
5 〃 25 Modern Bayesian Estimation :Particle Filter
6 Nov. 1 HMM(Hidden Markov Model)
Nov. 8 No Class
7 〃 15 Bayesian Decision
8 〃 29 Nonparametric Approaches
9 Dec. 6 PCA(Principal Component Analysis)
10 〃 13 ICA(Independent Component Analysis)
11 〃 20 Applications of PCA and ICA
12 〃 27 Clustering, k-means et al.
13 Jan. 17 Other Topics 1: Kernel Machines
14 〃 22 (Tue) Other Topics 2
Lecture Plan
Independent Component Analysis
-1. Whitening by PCA
1. Introduction: Blind Source Separation (BSS)
2. Problem Formulation and Independence
3. Whitening + ICA Approach
4. Non-Gaussianity Measure
References:
[1] A. Hyvärinen et al., Independent Component Analysis, Wiley-Interscience, 2001
- 1. Whitening by PCA (Preparation for ICA approach)
Whitened := uncorrelatedness + unit variance*
PCA is a very useful tool for transforming a random vector x into an uncorrelated, or whitened, vector z.
The matrix V is not uniquely defined, so we have a free parameter (in the 2-D case: a rotation angle).
* Here we assume all random variables are zero mean.
PCA gives one solution to the whitening problem.
z = Vx, where x is an n-vector with n×n covariance matrix C_x, and the whitened n-vector z has covariance C_z = E[zz^T] = I (the n×n identity matrix). (Fig. 1)
PCA whitening method:
- Define the covariance matrix C_x = E[xx^T]. (1)
- Compute the {eigenvalue, eigenvector} pairs {λ_i, e_i}, i = 1, ..., n, of C_x.
- Representation of C_x: C_x = E Λ E^T, where E = [e_1 e_2 ... e_n] and Λ = diag(λ_1, ..., λ_n). (2) (*)
- Whitening matrix transformation: z = Vx with V = E Λ^{-1/2} E^T, where Λ^{-1/2} = diag(λ_1^{-1/2}, ..., λ_n^{-1/2}); then C_z = E[zz^T] = I. (3)
* The matrix E is an orthogonal matrix, satisfying EE^T = E^T E = I.
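A minimal NumPy sketch of this whitening recipe, eqs. (1)-(3); the synthetic data, variable names, and sample size are illustrative assumptions rather than part of the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative zero-mean data: n = 2 variables, N = 10000 samples (columns).
A_true = np.array([[2.0, 1.0], [0.5, 1.5]])
X = A_true @ rng.standard_normal((2, 10_000))

# (1) Sample covariance C_x = E[x x^T] (data are zero mean by construction).
C_x = X @ X.T / X.shape[1]

# (2) Eigen-decomposition C_x = E Lambda E^T (eigh returns an orthogonal E).
lam, E = np.linalg.eigh(C_x)

# (3) Whitening matrix V = E Lambda^{-1/2} E^T and whitened data z = V x.
V = E @ np.diag(lam ** -0.5) @ E.T
Z = V @ X

# C_z = E[z z^T] should be (numerically) the identity matrix.
print(np.round(Z @ Z.T / Z.shape[1], 3))
```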
1. Introduction
Blind Source Separation (BSS)
Ex. Source signals s_1(t), s_2(t), s_3(t) and mixed signals x_i(t) = Σ_{j=1}^{3} a_ij s_j(t), i = 1, 2, 3. (Fig. 2)
BSS Problem: Recover (separate) the source signals with no prior information about the mixing matrix [a_ij]. A typical real-world BSS problem is known as the "cocktail party problem."
- Independent Component Analysis (ICA) exploits the independence of the source signals to solve the BSS problem.
Fig. 3: ICA solution of BSS. Two sources s_1(t) and s_2(t) pass through an unknown mixing process to microphones mic1 and mic2; a separation process, adjusted according to the degree of independence of its outputs, produces the recovered signals y_1(t) and y_2(t).
2. Problem Formulation and Independence
- Source signals (zero mean): s_j(t), j = 1, ..., n
- Recorded signals: x_i(t), i = 1, ..., n
- Linear mixing process (no-delay model*):
  x_i(t) = Σ_{j=1}^{n} a_ij s_j(t), i = 1, ..., n (4)
  Vector-matrix form: x = As, where A = [a_ij] is an n×n constant matrix, x = (x_1, ..., x_n)^T, s = (s_1, ..., s_n)^T. (5)
(* In a real environment, the arrival-time differences between microphones should be included in the mixing model.)
Goal: Recover the sources s_j(t), j = 1, ..., n, from the mixed signals x_i(t), i = 1, ..., n.
- The a_ij are unknown.
- We want to estimate both a_ij and s_j(t) (note that (-a_ij, -s_j(t)) yields the same mixtures).
This is done under the following assumptions.
Assumption 1: The source waveforms are statistically independent.
Assumption 2: The sources have non-Gaussian distributions.
Assumption 3: The matrix A is square and invertible (for simplicity).
The estimated A is used to recover the original signals by the inverse (de-mixing) operation:
  s = Bx, where B = A^{-1}.
Ambiguities of the ICA solution:
- The variance (amplitude) of the recovered signals cannot be determined: if the pair (a_ij, s_j(t)) is a solution of the underlying BSS problem, then (K a_ij, (1/K) s_j(t)) is also a solution for any constant K ≠ 0. The variance of the source signals is therefore assumed to be unity: E[s_j^2] = 1. (6)
- The order of the recovered signals cannot be determined (permutation ambiguity).
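A short sketch of the mixing model of eqs. (4)-(5) together with the scaling ambiguity noted above; the concrete source waveforms and mixing matrix are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1000)

# Two illustrative zero-mean, roughly unit-variance, non-Gaussian sources s_j(t).
s = np.vstack([np.sign(np.sin(0.05 * t)),                      # square-ish wave
               rng.laplace(scale=1 / np.sqrt(2), size=t.size)]) # Laplacian noise

A = np.array([[1.0, 0.6],
              [0.4, 1.0]])           # unknown square mixing matrix (eq. (5))
x = A @ s                            # recorded signals x(t) = A s(t)

# Scaling ambiguity: (A K, K^{-1} s) produces exactly the same mixtures.
K = np.diag([3.0, 0.5])
x_alt = (A @ K) @ (np.linalg.inv(K) @ s)
print(np.allclose(x, x_alt))         # True
```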
Basics: Independence and Uncorrelatedness
Statistical independence of two random variables x and y:
  p(x, y) = p(x) p(y), or equivalently p(y|x) = p(y). (7)
Knowing the value of x provides no information about the distribution of y (example: Fig. 4).
Uncorrelatedness of two random variables x and y: their covariance is zero, i.e.
  E[(x - m_x)(y - m_y)] = 0 (for zero-mean variables, E[xy] = 0).
- If the variables x_1, ..., x_n are uncorrelated, then C_x = E[xx^T] is a diagonal matrix.
- Independence implies uncorrelatedness.
- In the Gaussian density case, independence and uncorrelatedness are equivalent.
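A small numerical check of this distinction, using a hand-picked pair of variables that are uncorrelated yet clearly dependent (the construction y = x^2 is mine, not from the slides).

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1.0, 1.0, size=1_000_000)
y = x ** 2                      # y is completely determined by x => dependent

# Covariance E[(x - m_x)(y - m_y)] is (numerically) zero: uncorrelated.
print(np.cov(x, y)[0, 1])       # ~0

# But the conditional distribution of y clearly depends on x:
print(y[np.abs(x) < 0.1].mean(),   # ~0.003
      y[np.abs(x) > 0.9].mean())   # ~0.9
```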
Examples of Sources and Mixed Signals
[Uniform densities (sub-Gaussian)]
- Two independent components s_1, s_2 that have the same uniform density:
  p(s_i) = 1/(2√3) for |s_i| ≤ √3 and 0 otherwise (so each s_i has variance 1), with p(s_1, s_2) = p(s_1) p(s_2). (8)
  (Fig. 5: joint distribution of the sources.)
- Mixing matrix:
  x = As with A = [5 10; 10 2], i.e. x = s_1 a_1 + s_2 a_2, where a_1 = (5, 10)^T and a_2 = (10, 2)^T are the columns of A.
  (Fig. 6: joint distribution of the mixed signals.)
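A sketch that reproduces this uniform-density example numerically with the mixing matrix given above; the sample size and random seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 50_000

# Independent uniform sources on [-sqrt(3), sqrt(3)]: zero mean, unit variance (eq. (8)).
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, N))
print(s.var(axis=1))                 # ~[1, 1]

A = np.array([[5.0, 10.0],
              [10.0, 2.0]])
x = A @ s                            # mixed signals (Fig. 6 shows their joint scatter)

# The sources are uncorrelated; the mixtures are strongly correlated.
print(np.round(np.corrcoef(s), 2))
print(np.round(np.corrcoef(x), 2))
```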
[Super-Gaussian densities]
- Two independent components s_1, s_2 with super-Gaussian densities (Fig. 7: joint distribution of the source signals).
- Mixed signals x = As = s_1 a_1 + s_2 a_2 (Fig. 8: joint distribution of the mixed signals).
3. Whitening (PCA) + ICA Approach
Observed signals: x --(whitening)--> z --(ICA)--> s
Whitening (Section -1): z = E Λ^{-1/2} E^T x = Vx.
Then z = Vx = VAs = Ãs, so the new mixing matrix Ã = VA is an orthogonal matrix**.
  ** E[zz^T] = Ã E[ss^T] Ã^T = Ã Ã^T = I.
Question*: Is this a unique solution?
Answer*: No. For any orthogonal matrix U, y = Uz gives C_y = E[yy^T] = U E[zz^T] U^T = U I U^T = I, which means y is also a whitened signal.
Conclusion: Whitening gives the independent components only up to an orthogonal transformation.
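A numerical check of the footnoted claim that Ã = VA is orthogonal, under the unit-variance independent-source model; the sources, mixing matrix, sample size, and seed are illustrative assumptions, and V is estimated from the sample covariance, so Ã Ã^T is only approximately the identity.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, N))   # unit-variance sources
A = np.array([[5.0, 10.0], [10.0, 2.0]])
x = A @ s

# Whitening matrix V = E Lambda^{-1/2} E^T estimated from the data.
lam, E = np.linalg.eigh(x @ x.T / N)
V = E @ np.diag(lam ** -0.5) @ E.T
z = V @ x

# The new mixing matrix A~ = V A is (approximately) orthogonal: A~ A~^T = I.
A_tilde = V @ A
print(np.round(A_tilde @ A_tilde.T, 2))
```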
Why Gaussian variables are forbidden (Assumption 2)
Suppose the two independent signals s_1 and s_2 are Gaussian with unit variance. Their joint density is
  p(s_1, s_2) = (1/(2π)) exp(-(s_1^2 + s_2^2)/2) = (1/(2π)) exp(-||s||^2 / 2). (9)
With z = As, where A is an orthogonal mixing matrix (A^{-1} = A^T, s = A^T z), we obtain
  p(z_1, z_2) = (1/(2π)) exp(-||A^T z||^2 / 2) = (1/(2π)) exp(-||z||^2 / 2).
The joint Gaussian distribution is invariant with respect to orthogonal transformations: the same density is observed, so no information arises from the orthogonal transformation. This means that we cannot find (identify) the orthogonal matrix A from the mixed data.
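A small numerical illustration of this invariance, with my own choice of rotation angle and sample size: orthogonally mixed independent N(0,1) sources are statistically indistinguishable from the unmixed ones, so nothing in the data identifies A.

```python
import numpy as np

rng = np.random.default_rng(5)
s = rng.standard_normal((2, 200_000))             # independent N(0,1) sources

theta = 0.7                                       # arbitrary rotation angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # orthogonal mixing matrix
z = A @ s

# The mixture is again a pair of (approximately) independent N(0,1) variables:
print(np.round(np.cov(z), 2))                     # ~identity
print(np.round(np.cov(s), 2))                     # ~identity, indistinguishable
```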
Maximization of Non-Gaussianity
For a given density p(x), we define a measure of non-Gaussianity NG(p(x)) that is non-negative, equals 0 when p(x) is Gaussian, and grows as p(x) departs from Gaussianity.
[Intuitive interpretation of ICA as non-Gaussianity maximization]
  x = As, where the s_i are non-Gaussian; s = A^{-1}x = Bx, with x known and B (row vector b) unknown.
  Consider y = b^T x = b^T A s = q^T s, where q^T = b^T A. (10)
  NG(p(y)), viewed as a function of b (or q), is maximized when y = q^T s reduces to a single source component q_i s_i. (11)
We therefore need to answer the following two questions:
1) How can the non-Gaussianity of y be measured?
2) How can we compute the value of B (row by row, b) that maximizes the measure?
Mixing reduces non-Gaussianity: NG(p_y) is smaller for y = q_1 s_1 + q_2 s_2 than for y = q_1 s_1 or y = q_2 s_2 alone. Maximizing NG by varying b therefore drives y = b^T x toward a single component q_i s_i.
4. Measure of Non-Gaussianity
Kurtosis is a classical measure of non-Gaussianity:
  Kurt(p(y)) := E[y^4] - 3 (E[y^2])^2. (12)
Kurt < 0 for sub-Gaussian densities, Kurt = 0 for the Gaussian density, and Kurt > 0 for super-Gaussian densities. The absolute value of the kurtosis can therefore be used as a measure of non-Gaussianity, and the optimization problem
  max_b J(b), with J(b) := |Kurt(p(y))|, y = b^T x, (13)
is solved as an ICA solution.
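A minimal sketch of eqs. (12)-(13) for the two-dimensional whitened case, reusing the uniform-source example from above: after whitening, b can be restricted to unit vectors b = (cos θ, sin θ)^T, and |Kurt| is maximized by a simple grid search over θ. This is only a toy projection-pursuit illustration, not the FastICA algorithm of [1].

```python
import numpy as np

def kurt(y):
    """Kurtosis of eq. (12): E[y^4] - 3 (E[y^2])^2 (zero for Gaussian y)."""
    return np.mean(y ** 4) - 3.0 * np.mean(y ** 2) ** 2

rng = np.random.default_rng(6)
N = 100_000

# Sub-Gaussian unit-variance sources, mixed and then whitened as in Section 3.
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, N))
x = np.array([[5.0, 10.0], [10.0, 2.0]]) @ s
lam, E = np.linalg.eigh(x @ x.T / N)
z = E @ np.diag(lam ** -0.5) @ E.T @ x        # whitened mixtures

# Eq. (13): maximize J(b) = |Kurt(b^T z)| over unit vectors b = (cos t, sin t)^T.
thetas = np.linspace(0.0, np.pi, 1_000)
J = [abs(kurt(np.cos(t) * z[0] + np.sin(t) * z[1])) for t in thetas]
t_best = thetas[int(np.argmax(J))]
b = np.array([np.cos(t_best), np.sin(t_best)])

y = b @ z                                     # one recovered component (up to sign/order)
print(round(abs(np.corrcoef(y, s[0])[0, 1]), 2),
      round(abs(np.corrcoef(y, s[1])[0, 1]), 2))   # one value near 1, the other near 0
```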