Although eigenvalues are one of the most important concepts in linear algebra, some of us eigen-struggle with them without understanding their usefulness and beauty. In this talk I'll briefly review the definition of eigenvalues, emphasizing the associated geometric idea, and I'll show how they can be used in some applications.
From the Un-Distinguished Lecture Series (http://ws.cs.ubc.ca/~udls/). The talk was given Mar. 16, 2007
Eigenvalues in a Nutshell
1. Eigenvalues in a nutshell
Eigenvalues in a nutshell
Mariquita Flores Garrido
UDLS, March 16th 2007
2. Just in case…
• Scalar multiple of a vector: λx points along the same line as x, scaled by λ.
[Figure: λx compared with x in four cases — 0 ≤ λ ≤ 1, 1 ≤ λ, −1 ≤ λ ≤ 0, λ ≤ −1]
• Addition of vectors: v1 + v2.
[Figure: v1, v2 and v1 + v2 (parallelogram rule)]
3. Linear Transformations
Ax = b: transformation of x by A.
• Rectangular matrices
A ∈ R^{m×n} ⇒ f : R^n → R^m
A (m×n) · x (n×1) = Ax (m×1)
E.g.
$\begin{pmatrix}1 & 4\\ 2 & 5\\ 3 & 6\end{pmatrix}\begin{pmatrix}1\\ 1\end{pmatrix} = \begin{pmatrix}5\\ 7\\ 9\end{pmatrix}$
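A minimal numpy version of this example (my own illustration, not part of the slides), just to show the matrix acting as a map from R^2 to R^3:

```python
import numpy as np

# 3 x 2 matrix: a linear map from R^2 to R^3.
A = np.array([[1, 4],
              [2, 5],
              [3, 6]])
x = np.array([1, 1])   # a vector in R^2

print(A @ x)           # [5 7 9], a vector in R^3
```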
4. Linear Transformations
• Square matrices: A ∈ R^{n×n} ⇒ f : R^n → R^n (*endomorphism)
*Stretch/Compression  *Rotation  *Reflection
$\begin{pmatrix}2 & 0\\ 0 & 2\end{pmatrix} \qquad \begin{pmatrix}\cos\varphi & \sin\varphi\\ -\sin\varphi & \cos\varphi\end{pmatrix} \qquad \begin{pmatrix}0 & 1\\ 1 & 0\end{pmatrix}$
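As a side note (my addition, not on the slide), the eigenvalues of these three maps already hint at the properties listed later in the talk: the stretch has λ = 2, 2, the rotation has complex eigenvalues of absolute value 1, and the reflection has λ = ±1. A quick numpy check, taking φ = π/4 as an example angle:

```python
import numpy as np

phi = np.pi / 4
stretch    = np.array([[2.0, 0.0], [0.0, 2.0]])
rotation   = np.array([[np.cos(phi),  np.sin(phi)],
                       [-np.sin(phi), np.cos(phi)]])
reflection = np.array([[0.0, 1.0], [1.0, 0.0]])

print(np.linalg.eigvals(stretch))            # [2. 2.]
print(np.abs(np.linalg.eigvals(rotation)))   # [1. 1.]  (complex, on the unit circle)
print(np.linalg.eigvals(reflection))         # eigenvalues 1 and -1
```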
5. Bonus: Shear
*Shear in x-direction  *Shear in y-direction
$\begin{pmatrix}1 & k\\ 0 & 1\end{pmatrix} \qquad \begin{pmatrix}1 & 0\\ k & 1\end{pmatrix}$
E.g., shear in x-direction:
$\begin{pmatrix}x\\ y\end{pmatrix} \mapsto \begin{pmatrix}x + ky\\ y\end{pmatrix}$
[Figure: a region in the xy-plane before and after the shear]
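The shear is also a nice eigen-example (again my own remark, not on the slide): the x-direction shear leaves every vector on the x-axis fixed, so λ = 1 is its only eigenvalue, but for k ≠ 0 it has only one independent eigenvector. A sketch with k = 1:

```python
import numpy as np

k = 1.0
shear_x = np.array([[1.0, k],
                    [0.0, 1.0]])

print(shear_x @ np.array([3.0, 2.0]))   # [5. 2.]: x becomes x + k*y, y is unchanged

lam, vecs = np.linalg.eig(shear_x)
print(lam)    # [1. 1.]: lambda = 1, twice
print(vecs)   # columns are (numerically) parallel: only one independent
              # eigenvector, pointing along the x-axis
```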
6. Basis for a Subspace
A basis for R^n is a set of n linearly independent vectors.
$\begin{pmatrix}1\\ 1\\ 2\end{pmatrix} = 1\begin{pmatrix}1\\ 0\\ 0\end{pmatrix} + 1\begin{pmatrix}0\\ 1\\ 0\end{pmatrix} + 2\begin{pmatrix}0\\ 0\\ 1\end{pmatrix}$
[Figure: the vector (1, 1, 2) drawn in terms of e1, e2 and 2e3]
7. Basis for a Subspace
Any set of n linearly independent vectors can be a basis for R^n.
Using the canonical basis e1, e2:
$\begin{pmatrix}a_1\\ a_2\end{pmatrix} = \begin{pmatrix}-2\\ 1\end{pmatrix}$
Using V1, V2 … ?
$\begin{pmatrix}a_1\\ a_2\end{pmatrix} = \;??$
[Figure: the same vector drawn against the bases {e1, e2} and {V1, V2}]
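A minimal numpy sketch of answering the "??" above: the coordinates c in a new basis satisfy V c = a, where the columns of V are the basis vectors. The particular V1, V2 used here are made-up values for illustration; the slide does not give them.

```python
import numpy as np

# Coordinates of the vector in the canonical basis (from the slide).
a = np.array([-2.0, 1.0])

# Hypothetical new basis vectors V1, V2 (not given on the slide).
V1 = np.array([1.0, 1.0])
V2 = np.array([-1.0, 1.0])
V = np.column_stack([V1, V2])   # basis vectors as columns

# Coordinates c in the new basis satisfy V @ c = a.
c = np.linalg.solve(V, a)
print(c)                        # [-0.5  1.5]
print(np.allclose(V @ c, a))    # True: c1*V1 + c2*V2 reproduces the vector
```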
8. EIGENVALUES
• "Eigen" - "own", "peculiar to", "characteristic" or "individual"; "proper value".
• An invariant subspace under an endomorphism.
• If A is an n × n matrix, x ≠ 0 is called an eigenvector of A if
Ax = λx
and λ is called an eigenvalue of A.
10. Eigen – slang
• Characteristic polynomial: A degree n polynomial in λ:
det(λI - A) = 0
Scalars satisfying the equation are the eigenvalues of A.
E.g.
$\begin{pmatrix}1 & 2\\ 3 & 4\end{pmatrix} \longrightarrow \begin{vmatrix}1-\lambda & 2\\ 3 & 4-\lambda\end{vmatrix} = \lambda^2 - 5\lambda - 2 = 0$
(a quick numerical check of this example appears below)
• Spectrum (of A) : { λ1, λ2 , …, λn}
• Algebraic multiplicity (of λi): number of roots equal to λi.
• Eigenspace (of λi): Eigenvectors never come alone!
Ax = λx
k ⋅ Ax = k ⋅ λx
A(kx) = λ (kx)
• Geometric multiplicity (of λi): number of lin. independent eigenvectors
associated with λi.
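A minimal numpy check of the 2×2 example above (my own sketch, not from the slides): the computed eigenvalues satisfy λ² − 5λ − 2 = 0, and each eigenvector satisfies Av = λv.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Eigenvalues/eigenvectors via LAPACK (what you'd use in practice, instead of
# finding the roots of det(lambda*I - A) by hand).
lam, V = np.linalg.eig(A)

# Each eigenvalue is a root of the characteristic polynomial lambda^2 - 5*lambda - 2.
print(lam**2 - 5 * lam - 2)          # ~[0. 0.]

# And each column of V satisfies A v = lambda v.
for i in range(2):
    print(np.allclose(A @ V[:, i], lam[i] * V[:, i]))   # True, True
```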
11. Eigen – slang
• Eigen – something: Something that doesn’t change under some
transformation.
$\frac{d}{dx}\left[e^x\right] = e^x$
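A quick numerical illustration of this (my own, not on the slide): differentiating e^x on a grid gives back, approximately, e^x itself, i.e. e^x is an "eigenfunction" of d/dx with eigenvalue 1.

```python
import numpy as np

x = np.linspace(0.0, 2.0, 2001)
f = np.exp(x)

# Finite-difference derivative of e^x on the grid.
df = np.gradient(f, x)

# The derivative is (approximately) the function itself: "eigenvalue" 1.
print(np.max(np.abs(df - f) / f))   # small, limited only by finite-difference accuracy
```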
12. FAQ (yeah, sure)
• How old are the eigenvalues?
They arose before matrix theory, in the context of differential equations.
Bernoulli, Euler, 18th Century.
Hilbert, 20th century.
• Do all matrices have eigenvalues?
Yes. Every n × n matrix has n eigenvalues (counted with multiplicity, possibly complex).
13. • Why are the eigenvalues important?
- Physical meaning (e.g. vibrating strings, molecular orbitals).
- There are other concepts relying on eigenvalues (e.g. singular values, condition number).
- They tell almost everything about a matrix.
14. Properties of a matrix reflected in its eigenvalues:
1. A singular ↔ λ = 0 is an eigenvalue.
2. A and A^T have the same λ's.
3. A symmetric ⇒ real λ's.
4. A skew-symmetric ⇒ purely imaginary λ's.
5. A symmetric positive definite ⇒ λ's > 0.
6. A diagonalizable ⇒ eigenvectors form a basis for R^n.
7. A symmetric ⇒ eigenvectors can be chosen orthonormal.
8. A real ⇒ complex eigenvalues and eigenvectors come in conjugate pairs.
9. A symmetric ⇒ number of positive eigenvalues equals the number of positive pivots.
15. Properties of a matrix reflected in its eigenvalues:
10. A and M^{-1}AM have the same λ's.
11. A orthogonal ⇒ all |λ| = 1.
12. A projector ⇒ λ = 1, 0.
13. A Markov ⇒ λmax = 1.
14. A reflection ⇒ λ = -1, 1, …, 1.
15. A rank one (A = uv^T) ⇒ nonzero λ = v^T u.
16. A^{-1} ⇒ eigenvalues 1/λ(A).
17. A + cI ⇒ eigenvalues λ(A) + c.
18. A diagonal ⇒ λi = aii.
19. Eigenvectors of AA^T ⇒ basis for Col(A).
20. Eigenvectors of A^T A ⇒ basis for Row(A).
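A few of these properties are easy to confirm numerically; here is a minimal numpy sketch (my own check, not from the slides) of properties 2, 3, 17 and 18.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

eig = np.linalg.eigvals

# 2. A and A^T have the same eigenvalues.
print(np.allclose(np.sort_complex(eig(A)), np.sort_complex(eig(A.T))))

# 3. A symmetric => real eigenvalues.
S = A + A.T
print(np.allclose(eig(S).imag, 0))

# 17. A + cI has eigenvalues lambda(A) + c.
c = 3.0
print(np.allclose(np.sort_complex(eig(A + c * np.eye(4))),
                  np.sort_complex(eig(A)) + c))

# 18. A diagonal => eigenvalues are the diagonal entries.
D = np.diag([1.0, 2.0, 3.0, 4.0])
print(np.allclose(np.sort(eig(D).real), np.diag(D)))
```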
16. What’s the worst thing about eigenvalues?
Finding them is painful; they are roots of the characteristic polynomial.
* How long does it take to calculate the determinant of a 25 x 25 matrix?
(By cofactor expansion, about 25! ≈ 1.6 × 10^25 terms.)
* How do we find roots of polynomials?
(Numerically; for degree ≥ 5 there is no general closed-form formula.)
17. WARNING:
The following examples have been
simplified to be presented in a short
talk about eigenvalues. Attendee
discretion is advised.
18. Example 1: Face Identification
Eigenfaces: face identification technique.
(There are also eigeneyes, eigennoses, eigenmouths, eigenears, eigenvoices, …)
19. EIGENFACES
Given a set of images and a "target face", identify the "owner" of the face.
[Figure: a 256 × 256 test image and a training set of 128 face images]
20. 1. Preprocessing stage: linear transformations, morphing, warping, …
2. Representing faces: vectors (Γj) in a very high-dimensional space.
E.g., training set: a 65536 × 128 matrix.
3. Centering data: take the "average" image Ψ and define every Φj:
$\Psi = \frac{1}{n}\sum_{j=1}^{n}\Gamma_j \qquad \Phi_j = \Gamma_j - \Psi \qquad A = [\Phi_1, \Phi_2, \ldots, \Phi_n]$
21. 4. Eigenvectors of AA^T are a basis for Col(A) (what's the size of this matrix?), so instead of working with A, I can express every image in another basis.
* 5. PCA: reducing the dimension of the space. To solve the problem, the work is done in a smaller subspace, SL, using projections of each image onto SL.
6. It's possible to get eigenvectors of AA^T using eigenvectors of A^T A:
AA^T is 65536 × 65536, while A^T A is only 128 × 128.
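A minimal numpy sketch of step 6, the small-matrix trick: if A^T A v = λv, then AA^T (Av) = λ(Av), so eigenvectors of the huge AA^T are obtained as Av. The array sizes match the slide (65536 pixels, 128 images), but the random data here just stands in for real face images.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_images = 65536, 128

# Columns of A are the centered images Phi_j = Gamma_j - Psi.
Gamma = rng.standard_normal((n_pixels, n_images))   # stand-in for face images
Psi = Gamma.mean(axis=1, keepdims=True)
A = Gamma - Psi

# Work with the small 128 x 128 matrix A^T A instead of the
# 65536 x 65536 matrix A A^T.
small = A.T @ A
lam, v = np.linalg.eigh(small)           # eigenpairs of A^T A

# Map each small eigenvector v to an eigenvector u = A v of A A^T,
# then normalize: these u's are the "eigenfaces".
U = A @ v
U /= np.linalg.norm(U, axis=0)

# Check one of them: (A A^T) u = lambda u, without ever forming A A^T.
u, l = U[:, -1], lam[-1]
print(np.allclose(A @ (A.T @ u), l * u))   # True
```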
23. ITERATIVE METHODS
Âx=b
• Gauss-Jordan
• If Â is 10^5 × 10^5, Gauss-Jordan would take approx. 290 years.
• Iterative methods: find some "good" matrix A and apply it repeatedly to some initial vector until you get convergence.
• Choosing a different A determines a different method (e.g. Jacobi, Gauss-Seidel, Krylov subspace methods, …).
24. Example 2: ITERATIVE METHODS
• Iteration:
$x_1 = Ax_0, \quad x_2 = Ax_1 = A(Ax_0) = A^2 x_0, \quad \ldots, \quad x_n = A^n x_0$
A: huge matrix (10^6 × 10^6), x_0: initial guess.
• If A is diagonalizable, its eigenvectors v_1, …, v_m form a basis for R^m:
$A^n x_0 = A^n(\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_m v_m) = \alpha_1 A^n v_1 + \cdots + \alpha_m A^n v_m = \alpha_1 \lambda_1^n v_1 + \alpha_2 \lambda_2^n v_2 + \cdots + \alpha_m \lambda_m^n v_m$
$|\lambda_i| < 1 \Rightarrow$ convergence.
Convergence, number of iterations: it's all about eigenvalues…
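A minimal numpy sketch of the last point (my own illustration, at a much smaller size than the 10^6 × 10^6 on the slide): when all |λi| < 1 the iterates A^n x0 shrink to zero, and the rate is set by the largest |λi|.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a matrix with known eigenvalues, all |lambda| < 1.
lams = np.array([0.9, 0.5, 0.1])
V = rng.standard_normal((3, 3))           # random (invertible) eigenvector matrix
A = V @ np.diag(lams) @ np.linalg.inv(V)

x = rng.standard_normal(3)                # initial guess x0
norms = []
for _ in range(50):
    x = A @ x                             # x_{k+1} = A x_k
    norms.append(np.linalg.norm(x))

# ||x_k|| decays like (0.9)^k: the dominant eigenvalue controls the rate.
print(norms[-1] / norms[-2])              # ~0.9
```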
26. Example 3: Dynamical Systems
( Eigenvalues don’t have the main role here, but, who are
you going to complain to?)
27. Arnold’s Cat
• Poincaré recurrence theorem:
“ A system having a finite amount of energy and confined to a
finite spatial volume will, after a sufficiently long time, return
to an arbitrarily small neighborhood of its initial state.”
• Vladimir I. Arnold, Russian mathematician.
$A = \begin{pmatrix}1 & 1\\ 1 & 2\end{pmatrix}$
Each pixel can be assigned to a unique pair of coordinates (a two-dimensional vector).
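A minimal numpy sketch of applying Arnold's cat map, under the assumption of a square N × N image stored as an array: each pixel at integer coordinates (x, y) is sent to A(x, y)^T mod N. Iterating scrambles the image, and because the map is a bijection on the pixel grid, the original eventually reappears, as the Poincaré recurrence quote promises.

```python
import numpy as np

def cat_map(img):
    """One step of Arnold's cat map on a square N x N image."""
    N = img.shape[0]
    x, y = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    # New coordinates: (x, y) -> (x + y, x + 2y) mod N, i.e. A @ (x, y) mod N.
    new_x = (x + y) % N
    new_y = (x + 2 * y) % N
    out = np.empty_like(img)
    out[new_x, new_y] = img[x, y]
    return out

# Tiny example: iterate until the image comes back to its initial state.
img = np.arange(25).reshape(5, 5)
scrambled = img
for k in range(1, 100):
    scrambled = cat_map(scrambled)
    if np.array_equal(scrambled, img):
        print("image recurred after", k, "steps")
        break
```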
31. More Applications
• Graph theory
• Differential Equations
• PageRank
• Physics
32. REFERENCES
• Chen Greif. CPSC 517 Notes, UBC/CS, Spring 2007.
• Howard Anton and Chris Rorres. Elementary Linear Algebra, Applications Version, 9th Ed. John Wiley & Sons, Inc. 2005.
• Humberto Madrid de la Vega. Eigenfaces: Reconocimiento digital de facciones mediante SVD. Memorias del XXXVII Congreso SMM, 2005.
• Wikipedia: Eigenvalue, eigenvector and eigenspace. http://en.wikipedia.org/wiki/Eigenvalue