Eigen Decomposition and
Singular Value Decomposition
Introduction
◼ Eigenvalue decomposition
❑ Spectral decomposition theorem
◼ Physical interpretation of eigenvalue/eigenvectors
◼ Singular Value Decomposition
◼ Importance of SVD
❑ Matrix inversion
❑ Solution to linear system of equations
❑ Solution to a homogeneous system of equations
◼ SVD application
What are eigenvalues?
◼ Given a square matrix A, x is an eigenvector and λ is the corresponding eigenvalue if Ax = λx
❑ A must be square, and the determinant of A − λI must equal zero
Ax − λx = 0 ⇒ (A − λI)x = 0
◼ The trivial solution is x = 0
◼ A non-trivial solution exists only when det(A − λI) = 0
◼ Are eigenvectors unique?
❑ No: if x is an eigenvector, then cx is also an eigenvector for any scalar c, with the same eigenvalue λ
A(cx) = c(Ax) = c(λx) = λ(cx)
Calculating the Eigenvectors/values
◼ Expand det(A − λI) = 0 for a 2 x 2 matrix:

\[
\det(A - \lambda I)
= \det\!\left(\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} - \lambda \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}\right)
= \det\begin{pmatrix} a_{11}-\lambda & a_{12} \\ a_{21} & a_{22}-\lambda \end{pmatrix}
= (a_{11}-\lambda)(a_{22}-\lambda) - a_{12}a_{21}
= \lambda^2 - (a_{11}+a_{22})\lambda + (a_{11}a_{22} - a_{12}a_{21}) = 0
\]

◼ For a 2 x 2 matrix, this is a simple quadratic equation with two (possibly complex) solutions:

\[
\lambda = \frac{(a_{11}+a_{22}) \pm \sqrt{(a_{11}+a_{22})^2 - 4\,(a_{11}a_{22} - a_{12}a_{21})}}{2}
\]

◼ This "characteristic equation" is solved for λ; substituting each λ back into (A − λI)x = 0 yields the corresponding eigenvector x
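A small numerical sketch (not from the original slides) that evaluates this quadratic directly and cross-checks against NumPy's general eigenvalue routine:

```python
import numpy as np

def eig2x2(A):
    """Eigenvalues of a 2x2 matrix via the characteristic quadratic."""
    a11, a12 = A[0]
    a21, a22 = A[1]
    tr, det = a11 + a22, a11 * a22 - a12 * a21
    disc = np.sqrt(complex(tr * tr - 4 * det))  # may be complex
    return (tr + disc) / 2, (tr - disc) / 2

A = np.array([[1.0, 2.0], [2.0, 4.0]])
print(eig2x2(A))             # (5+0j) and 0j
print(np.linalg.eigvals(A))  # 0 and 5 (order may differ)
```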
Eigenvalue example
◼ Consider

\[
A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}
\quad\Rightarrow\quad
\lambda^2 - (1+4)\lambda + (1\cdot 4 - 2\cdot 2) = \lambda(\lambda - 5) = 0
\quad\Rightarrow\quad
\lambda = 0,\ 5
\]

◼ The corresponding eigenvectors can be computed by substituting each λ into (A − λI)x = 0:

\[
\lambda = 0:\ \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix};
\qquad
\lambda = 5:\ \begin{pmatrix} 1-5 & 2 \\ 2 & 4-5 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}
\]

❑ For λ = 0, one possible solution is x = (2, −1)
❑ For λ = 5, one possible solution is x = (1, 2)
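A short NumPy check of this example (illustrative, not from the slides); the columns returned by np.linalg.eig are unit-norm, so they are scalar multiples of (2, −1) and (1, 2):

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])
vals, vecs = np.linalg.eig(A)
print(vals)  # eigenvalues 0 and 5 (order may vary)
for lam, v in zip(vals, vecs.T):
    # Each column v satisfies A v = lam v up to floating-point error
    assert np.allclose(A @ v, lam * v)
```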
Physical interpretation
◼ Consider a covariance matrix A, i.e., A = (1/n) S Sᵀ for some data matrix S
◼ The error ellipse has its major axis along the eigenvector of the larger eigenvalue and its minor axis along the eigenvector of the smaller eigenvalue

\[
A = \begin{pmatrix} 1 & 0.75 \\ 0.75 & 1 \end{pmatrix}
\quad\Rightarrow\quad
\lambda_1 = 1.75,\quad \lambda_2 = 0.25
\]
Physical interpretation
◼ Orthogonal directions of greatest variance in data
◼ Projections along PC 1 (the first Principal Component) discriminate the data most along any single axis
[Figure: 2-D scatter plot over Original Variable A and Original Variable B, with the orthogonal directions PC 1 and PC 2 overlaid]
Physical interpretation
◼ First principal component is the direction of
greatest variability (covariance) in the data
◼ Second is the next orthogonal (uncorrelated)
direction of greatest variability
❑ So first remove all the variability along the first
component, and then find the next direction of greatest
variability
◼ And so on …
◼ Thus the eigenvectors provide the directions of data variance, ordered by decreasing eigenvalue
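The procedure described above amounts to an eigendecomposition of the data covariance matrix; a minimal PCA sketch along those lines (illustrative, assuming rows are samples):

```python
import numpy as np

def pca(X):
    """Principal components of data X (rows = samples, cols = variables)."""
    Xc = X - X.mean(axis=0)             # center the data
    C = (Xc.T @ Xc) / len(X)            # covariance matrix
    vals, vecs = np.linalg.eigh(C)      # eigh: for symmetric matrices
    order = np.argsort(vals)[::-1]      # sort by decreasing variance
    return vals[order], vecs[:, order]  # columns = PC directions

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1)) @ np.array([[2.0, 1.0]]) + 0.1 * rng.normal(size=(500, 2))
variances, components = pca(X)
print(variances)         # first eigenvalue much larger than the second
print(components[:, 0])  # PC 1 direction, +-(2, 1)/sqrt(5)
```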
Multivariate Gaussian
Bivariate Gaussian
Spherical, diagonal, full covariance
[Figure-only slides: bivariate Gaussian densities and contour shapes for spherical, diagonal, and full covariance matrices]
Eigen/diagonal Decomposition
◼ Let S be a square m x m matrix with m linearly independent eigenvectors (a "non-defective" matrix)
◼ Theorem: there exists an eigen decomposition S = UΛU⁻¹, with Λ diagonal (cf. the matrix diagonalization theorem); the decomposition is unique for distinct eigenvalues
◼ Columns of U are eigenvectors of S
◼ Diagonal elements of Λ are the eigenvalues of S
Diagonal decomposition: why/how
Let U have the eigenvectors as columns:

\[
U = \begin{pmatrix} v_1 & \cdots & v_n \end{pmatrix}
\]

Then, SU can be written

\[
SU = S\begin{pmatrix} v_1 & \cdots & v_n \end{pmatrix}
= \begin{pmatrix} \lambda_1 v_1 & \cdots & \lambda_n v_n \end{pmatrix}
= \begin{pmatrix} v_1 & \cdots & v_n \end{pmatrix}
\begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix}
= U\Lambda
\]

Thus SU = UΛ, or U⁻¹SU = Λ, and S = UΛU⁻¹.
Diagonal decomposition - example
Recall

\[
S = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix};
\qquad \lambda_1 = 1,\ \lambda_2 = 3
\]

The eigenvectors \(\begin{pmatrix} 1 \\ -1 \end{pmatrix}\) and \(\begin{pmatrix} 1 \\ 1 \end{pmatrix}\) form

\[
U = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}
\]

Inverting, we have

\[
U^{-1} = \begin{pmatrix} 1/2 & -1/2 \\ 1/2 & 1/2 \end{pmatrix}
\]

Then, S = UΛU⁻¹ =

\[
\begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}
\begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix}
\begin{pmatrix} 1/2 & -1/2 \\ 1/2 & 1/2 \end{pmatrix}
\]

Recall UU⁻¹ = I.
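A quick numerical check of this example (a sketch, not from the slides):

```python
import numpy as np

U = np.array([[1.0, 1.0], [-1.0, 1.0]])
Lam = np.diag([1.0, 3.0])
S = U @ Lam @ np.linalg.inv(U)
print(S)  # recovers [[2, 1], [1, 2]]
```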
Example continued
Let’s divide U (and multiply U–1) by 2





 −












− 2
/
1
2
/
1
2
/
1
2
/
1
3
0
0
1
2
/
1
2
/
1
2
/
1
2
/
1
Then, S=
Q (Q-1= QT )
Symmetric Eigen Decomposition
◼ If S is a symmetric matrix:
◼ Theorem: there exists a (unique) eigen decomposition

\[
S = Q \Lambda Q^T
\]

◼ where Q is orthogonal:
❑ Q⁻¹ = Qᵀ
❑ Columns of Q are normalized eigenvectors
❑ Columns are orthogonal
❑ (everything is real)
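In NumPy this case is handled by np.linalg.eigh, which is specialized for symmetric (Hermitian) matrices; a brief sketch using the example above:

```python
import numpy as np

S = np.array([[2.0, 1.0], [1.0, 2.0]])
vals, Q = np.linalg.eigh(S)  # eigh assumes S is symmetric
print(vals)                  # [1. 3.]
print(np.allclose(Q @ Q.T, np.eye(2)))          # Q is orthogonal
print(np.allclose(Q @ np.diag(vals) @ Q.T, S))  # S = Q Lambda Q^T
```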
Spectral Decomposition theorem
◼ If A is a symmetric and positive definite k x k matrix (xᵀAx > 0) with (λᵢ, eᵢ), λᵢ > 0, i = 1 … k, being its k eigenvalue-eigenvector pairs, then

\[
A = \lambda_1 e_1 e_1^T + \lambda_2 e_2 e_2^T + \cdots + \lambda_k e_k e_k^T = P \Lambda P^T
\]

where

\[
P = \begin{pmatrix} e_1 & e_2 & \cdots & e_k \end{pmatrix},
\qquad
\Lambda = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & & \vdots \\ \vdots & & \ddots & 0 \\ 0 & \cdots & 0 & \lambda_k \end{pmatrix}
\]

❑ This is also called the eigen decomposition theorem
◼ Any symmetric matrix can be reconstructed from its eigenvalues and eigenvectors
Example for spectral decomposition
◼ Let A be a symmetric, positive definite matrix:

\[
A = \begin{pmatrix} 2.2 & 0.4 \\ 0.4 & 2.8 \end{pmatrix}
\quad\Rightarrow\quad
\det(A - \lambda I) = \lambda^2 - 5\lambda + 6.16 - 0.16 = (\lambda - 3)(\lambda - 2) = 0
\]

◼ The eigenvectors for the corresponding eigenvalues λ₁ = 3, λ₂ = 2 are

\[
e_1 = \begin{pmatrix} 1/\sqrt{5} \\ 2/\sqrt{5} \end{pmatrix},
\qquad
e_2 = \begin{pmatrix} 2/\sqrt{5} \\ -1/\sqrt{5} \end{pmatrix}
\]

◼ Consequently,

\[
A = \begin{pmatrix} 2.2 & 0.4 \\ 0.4 & 2.8 \end{pmatrix}
= 3 \begin{pmatrix} 1/\sqrt{5} \\ 2/\sqrt{5} \end{pmatrix}\begin{pmatrix} 1/\sqrt{5} & 2/\sqrt{5} \end{pmatrix}
+ 2 \begin{pmatrix} 2/\sqrt{5} \\ -1/\sqrt{5} \end{pmatrix}\begin{pmatrix} 2/\sqrt{5} & -1/\sqrt{5} \end{pmatrix}
= \begin{pmatrix} 0.6 & 1.2 \\ 1.2 & 2.4 \end{pmatrix} + \begin{pmatrix} 1.6 & -0.8 \\ -0.8 & 0.4 \end{pmatrix}
\]
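The reconstruction can be verified numerically; a sketch (not from the slides) using the rank-1 sum form:

```python
import numpy as np

A = np.array([[2.2, 0.4], [0.4, 2.8]])
vals, vecs = np.linalg.eigh(A)
# Sum of rank-1 terms lambda_i * e_i e_i^T recovers A
A_rec = sum(lam * np.outer(e, e) for lam, e in zip(vals, vecs.T))
print(np.allclose(A_rec, A))  # True
```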
Singular Value Decomposition
◼ If A is a rectangular m x k matrix of real numbers, then there exists an m x m orthogonal matrix U and a k x k orthogonal matrix V such that

\[
A_{m \times k} = U_{m \times m}\, \Sigma_{m \times k}\, V_{k \times k}^T,
\qquad UU^T = VV^T = I
\]

❑ Σ is an m x k matrix whose (i, i)th entry is σᵢ ≥ 0 for i = 1 … min(m, k), and whose other entries are zero
◼ The positive constants σᵢ are the singular values of A
◼ If A has rank r, then there exist r positive constants σ₁, σ₂, …, σᵣ, r orthogonal m x 1 unit vectors u₁, u₂, …, uᵣ and r orthogonal k x 1 unit vectors v₁, v₂, …, vᵣ such that

\[
A = \sum_{i=1}^{r} \sigma_i u_i v_i^T
\]

❑ Similar to the spectral decomposition theorem
Singular Value Decomposition (contd.)
◼ If A is symmetric and positive definite, then
❑ SVD = Eigen decomposition
◼ EIG(λᵢ) = SVD(σᵢ²): the eigenvalues of AAᵀ are the squared singular values of A
◼ Here AAᵀ has the eigenvalue-eigenvector pairs (σᵢ², uᵢ):

\[
AA^T = (U\Sigma V^T)(U\Sigma V^T)^T = U\Sigma V^T V \Sigma^T U^T = U\Sigma^2 U^T
\]

◼ Alternatively, the vᵢ are the eigenvectors of AᵀA with the same non-zero eigenvalues σᵢ²:

\[
A^T A = V \Sigma^2 V^T
\]
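This relationship is easy to check numerically; a sketch (illustrative) comparing np.linalg.svd with the eigendecomposition of AAᵀ, using the matrix of the next example:

```python
import numpy as np

A = np.array([[3.0, 1.0, 1.0], [-1.0, 3.0, 1.0]])
U, s, Vt = np.linalg.svd(A)

# Eigenvalues of A A^T (and the nonzero ones of A^T A) are sigma_i^2
w_AAt = np.linalg.eigvalsh(A @ A.T)[::-1]  # descending order
print(np.allclose(w_AAt, s**2))            # True: 12 and 10
```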
Example for SVD
◼ Consider the rectangular 2 x 3 matrix

\[
A = \begin{pmatrix} 3 & 1 & 1 \\ -1 & 3 & 1 \end{pmatrix}
\]

❑ U can be computed from AAᵀ:

\[
AA^T = \begin{pmatrix} 3 & 1 & 1 \\ -1 & 3 & 1 \end{pmatrix}
\begin{pmatrix} 3 & -1 \\ 1 & 3 \\ 1 & 1 \end{pmatrix}
= \begin{pmatrix} 11 & 1 \\ 1 & 11 \end{pmatrix},
\qquad
\det(AA^T - \lambda I) = 0 \;\Rightarrow\; \lambda_1 = 12,\ \lambda_2 = 10
\]

\[
u_1 = \begin{pmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{pmatrix},
\qquad
u_2 = \begin{pmatrix} 1/\sqrt{2} \\ -1/\sqrt{2} \end{pmatrix}
\]

❑ V can be computed from AᵀA:

\[
A^T A = \begin{pmatrix} 10 & 0 & 2 \\ 0 & 10 & 4 \\ 2 & 4 & 2 \end{pmatrix},
\qquad
\det(A^T A - \lambda I) = 0 \;\Rightarrow\; \lambda_1 = 12,\ \lambda_2 = 10,\ \lambda_3 = 0
\]

\[
v_1 = \begin{pmatrix} 1/\sqrt{6} \\ 2/\sqrt{6} \\ 1/\sqrt{6} \end{pmatrix},
\qquad
v_2 = \begin{pmatrix} 2/\sqrt{5} \\ -1/\sqrt{5} \\ 0 \end{pmatrix},
\qquad
v_3 = \begin{pmatrix} 1/\sqrt{30} \\ 2/\sqrt{30} \\ -5/\sqrt{30} \end{pmatrix}
\]
Example for SVD
◼ Taking 2
1=12 and 2
2=10, the singular value
decomposition of A is
◼ Thus the U, V and  are computed by performing eigen
decomposition of AAT and ATA
◼ Any matrix has a singular value decomposition but only
symmetric, positive definite matrices have an eigen
decomposition





 −










−
+
















=






−
=
0
,
5
1
,
5
2
2
1
2
1
10
6
1
,
6
2
,
6
1
2
1
2
1
12
1
3
1
1
1
3
A
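Verifying with NumPy (a sketch; note that np.linalg.svd may flip the signs of paired singular vectors):

```python
import numpy as np

A = np.array([[3.0, 1.0, 1.0], [-1.0, 3.0, 1.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s**2)                                 # [12. 10.]
print(np.allclose(U @ np.diag(s) @ Vt, A))  # True
```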
Applications of SVD in Linear Algebra
◼ Inverse of an n x n square matrix A
❑ If A is non-singular, then A⁻¹ = (UΣVᵀ)⁻¹ = VΣ⁻¹Uᵀ, where Σ⁻¹ = diag(1/σ₁, 1/σ₂, …, 1/σₙ)
❑ If A is singular, use the pseudo-inverse A⁺ = VΣ₀⁻¹Uᵀ, where Σ₀⁻¹ = diag(1/σ₁, 1/σ₂, …, 1/σᵢ, 0, 0, …, 0) inverts only the non-zero singular values
◼ Least squares solution of an m x n system
❑ Ax = b (A is m x n, m ≥ n) ⇒ (AᵀA)x = Aᵀb ⇒ x = (AᵀA)⁻¹Aᵀb = A⁺b
❑ If AᵀA is singular, x = A⁺b ≈ (VΣ₀⁻¹Uᵀ)b, where Σ₀⁻¹ = diag(1/σ₁, 1/σ₂, …, 1/σᵢ, 0, 0, …, 0)
◼ Condition of a matrix
❑ The condition number σ₁/σₙ measures the degree of singularity of A
◼ The larger the value of σ₁/σₙ, the closer A is to being singular
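A minimal sketch of the least-squares recipe above (the pseudo-inverse path, with a small tolerance standing in for "exactly zero" singular values):

```python
import numpy as np

def lstsq_svd(A, b, tol=1e-12):
    """Least-squares solution x = A^+ b via the SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Invert only the non-negligible singular values
    s_inv = np.divide(1.0, s, out=np.zeros_like(s), where=s > tol * s[0])
    return Vt.T @ (s_inv * (U.T @ b))

A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])
print(lstsq_svd(A, b))  # matches np.linalg.lstsq(A, b)[0]
```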
Applications of SVD in Linear Algebra
◼ Homogeneous equations, Ax = 0
❑ The minimum-norm solution is x = 0 (the trivial solution)
❑ Impose a constraint: ‖x‖ = 1
❑ "Constrained" optimization problem: min ‖Ax‖ subject to ‖x‖ = 1
❑ Special Case
◼ If rank(A) = n − 1 (m ≥ n − 1, σ_n = 0), then x = αv_n (α is a constant)
❑ General Case
◼ If rank(A) = n − k (m ≥ n − k, σ_{n-k+1} = … = σ_n = 0), then x = α₁v_{n-k+1} + … + α_k v_n with α₁² + … + α_k² = 1
For the proof, see Johnson and Wichern, "Applied Multivariate Statistical Analysis", pg 79
◼ This has appeared before
❑ Homogeneous solution of a linear system of equations
❑ Computation of Homography using DLT
❑ Estimation of the Fundamental matrix
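The standard recipe (as used in DLT-style problems) is to take the right singular vector for the smallest singular value. A sketch:

```python
import numpy as np

def solve_homogeneous(A):
    """Unit-norm x minimizing ||Ax||: last row of V^T from the SVD."""
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1]  # right singular vector of the smallest singular value

A = np.array([[3.0, 1.0, 1.0], [-1.0, 3.0, 1.0]])  # rank 2, n = 3
x = solve_homogeneous(A)
print(np.allclose(A @ x, 0))  # True: x spans the null space
```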
What is the use of SVD?
◼ SVD can be used to compute
optimal low-rank approximations
of arbitrary matrices.
◼ Face recognition
❑ Represent the face images as eigenfaces and compute the distance between the query face image and the stored faces in the principal component space
◼ Data mining
❑ Latent Semantic Indexing for
document extraction
◼ Image compression
❑ The Karhunen-Loève (KL) transform performs the best image compression
◼ In MPEG, the Discrete Cosine Transform (DCT) is the closest approximation to the KL transform in terms of PSNR
Singular Value Decomposition
◼ Illustration of SVD dimensions and
sparseness
SVD example
Let

\[
A = \begin{pmatrix} 1 & -1 \\ 0 & 1 \\ 1 & 0 \end{pmatrix}
\]

Thus m = 3, n = 2. Its SVD is

\[
A = \begin{pmatrix} 0 & 2/\sqrt{6} & 1/\sqrt{3} \\ 1/\sqrt{2} & -1/\sqrt{6} & 1/\sqrt{3} \\ 1/\sqrt{2} & 1/\sqrt{6} & -1/\sqrt{3} \end{pmatrix}
\begin{pmatrix} 1 & 0 \\ 0 & \sqrt{3} \\ 0 & 0 \end{pmatrix}
\begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix}
\]

Typically, the singular values are arranged in decreasing order (here they are not).
Low-rank Approximation
◼ SVD can be used to compute optimal low-rank approximations.
◼ Approximation problem: find A_k of rank k such that

\[
A_k = \arg\min_{X:\, \mathrm{rank}(X) = k} \lVert A - X \rVert_F
\qquad \text{(Frobenius norm)}
\]

◼ A_k and X are both m x n matrices. Typically, we want k << r (the rank of A).
Low-rank Approximation
◼ Solution via SVD: set the smallest r − k singular values to zero

\[
A_k = U\, \mathrm{diag}(\sigma_1, \ldots, \sigma_k, 0, \ldots, 0)\, V^T
\]

◼ In column notation, this is a sum of k rank-1 matrices:

\[
A_k = \sum_{i=1}^{k} \sigma_i u_i v_i^T
\]
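A sketch of rank-k truncation in NumPy, including a check of the error formula discussed on the next slide:

```python
import numpy as np

def low_rank(A, k):
    """Best rank-k approximation of A (Eckart-Young), via truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 4))
k = 2
Ak = low_rank(A, k)
s = np.linalg.svd(A, compute_uv=False)
# Frobenius error equals the root-sum-square of the discarded sigma_i
print(np.isclose(np.linalg.norm(A - Ak, "fro"), np.sqrt(np.sum(s[k:] ** 2))))
```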
Approximation error
◼ How good (bad) is this approximation?
◼ It’s the best possible, measured by the
Frobenius norm of the error:
\[
\min_{X:\,\mathrm{rank}(X)=k} \lVert A - X \rVert_F
= \lVert A - A_k \rVert_F
= \sqrt{\sigma_{k+1}^2 + \cdots + \sigma_r^2}
\]

where the σᵢ are ordered such that σᵢ ≥ σᵢ₊₁ (in the spectral norm, the error is exactly σ_{k+1}). This suggests why the Frobenius error drops as k increases.