A nonzero vector x is an eigenvector (or characteristic
vector) of a square matrix A if there exists a scalar λ
such that Ax = λx. Then λ is an eigenvalue (or
characteristic value) of A.
Note: The zero vector cannot be an eigenvector, even though A0 = λ0. But λ = 0 can be an eigenvalue.
Example: Show that

    x = [ 2 ]    is an eigenvector of    A = [ 2  −4 ]
        [ 1 ]                                [ 3  −6 ]

Solution:

    Ax = [ 2  −4 ] [ 2 ]  =  [ 0 ]
         [ 3  −6 ] [ 1 ]     [ 0 ]

But for λ = 0,

    λx = 0 [ 2 ]  =  [ 0 ]
           [ 1 ]     [ 0 ]

Thus Ax = λx, so x is an eigenvector of A, and λ = 0 is an eigenvalue.
An n×n matrix A multiplied by n×1 vector x results in another
n×1 vector y=Ax. Thus A can be considered as a
transformation matrix.
In general, a matrix acts on a vector by changing both its
magnitude and its direction. However, a matrix may act on
certain vectors by changing only their magnitude, and leaving
their direction unchanged (or possibly reversing it). These
vectors are the eigenvectors of the matrix.
A matrix acts on an eigenvector by multiplying its magnitude by
a factor, which is positive if its direction is unchanged and
negative if its direction is reversed. This factor is the eigenvalue
associated with that eigenvector.
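This scaling action is easy to check numerically. A minimal NumPy sketch, using the matrix and eigenvector from the example above (where the scaling factor happens to be λ = 0):

```python
import numpy as np

# A maps the eigenvector x = (2, 1) to λx with λ = 0, so the
# vector's direction is unchanged and only its magnitude is scaled.
A = np.array([[2.0, -4.0],
              [3.0, -6.0]])
x = np.array([2.0, 1.0])
lam = 0.0

Ax = A @ x
print(Ax)                         # the zero vector
print(np.allclose(Ax, lam * x))   # Ax = λx holds
```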
Example 1: Find the eigenvalues of

    A = [ 2  −12 ]
        [ 1   −5 ]

    |λI − A| = | λ−2   12  |
               |  −1   λ+5 |  =  (λ−2)(λ+5) + 12  =  λ² + 3λ + 2  =  (λ+1)(λ+2)

two eigenvalues: −1, −2

Note: The roots of the characteristic equation can be repeated. That is, λ1 = λ2 = … = λk. If that happens, the eigenvalue is said to be of multiplicity k.

Example 2: Find the eigenvalues of

    A = [ 2  1  0 ]
        [ 0  2  0 ]
        [ 0  0  2 ]

    |λI − A| = | λ−2   −1    0  |
               |  0   λ−2    0  |  =  (λ−2)³  =  0
               |  0    0   λ−2  |

λ = 2 is an eigenvalue of multiplicity 3.
Example 1 (cont.):

λ1 = −1:

    (−1)I − A = [ −3  12 ]  ⇒  [ 1  −4 ]
                [ −1   4 ]     [ 0   0 ]

    x1 − 4x2 = 0  ⇒  x1 = 4x2. With x2 = t,

    x = [ x1 ]  =  t [ 4 ] ,   t ≠ 0.
        [ x2 ]       [ 1 ]

λ2 = −2:

    (−2)I − A = [ −4  12 ]  ⇒  [ 1  −3 ]
                [ −1   3 ]     [ 0   0 ]

    x1 = 3x2. With x2 = s,

    x = [ x1 ]  =  s [ 3 ] ,   s ≠ 0.
        [ x2 ]       [ 1 ]
To each distinct eigenvalue of a matrix A there corresponds at least one eigenvector, which can be found by solving the appropriate set of homogeneous equations. If λi is an eigenvalue, then the corresponding eigenvector xi is the solution of (A − λiI)xi = 0.
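This solve-for-each-eigenvalue procedure is what np.linalg.eig performs numerically. A quick sketch, assuming the Example 1 matrix reconstructed above:

```python
import numpy as np

# Eigenpairs of the Example 1 matrix. np.linalg.eig returns unit-norm
# eigenvector columns, i.e. scalar multiples of (4, 1) and (3, 1).
A = np.array([[2.0, -12.0],
              [1.0,  -5.0]])
vals, vecs = np.linalg.eig(A)
print(np.sort(vals.real))   # eigenvalues -2 and -1

# Each column x satisfies (A - λI)x = 0:
for lam, x in zip(vals, vecs.T):
    assert np.allclose((A - lam * np.eye(2)) @ x, 0)
```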
Example 2 (cont.): Find the eigenvectors of

    A = [ 2  1  0 ]
        [ 0  2  0 ]
        [ 0  0  2 ]

Recall that λ = 2 is an eigenvalue of multiplicity 3. Solve the homogeneous linear system represented by (2I − A)x = 0:

    [ 0  −1  0 ] [ x1 ]   [ 0 ]
    [ 0   0  0 ] [ x2 ] = [ 0 ]
    [ 0   0  0 ] [ x3 ]   [ 0 ]

Let x1 = s, x3 = t. The eigenvectors of λ = 2 are of the form

    x = [ x1 ]   [ s ]       [ 1 ]       [ 0 ]
        [ x2 ] = [ 0 ]  =  s [ 0 ]  +  t [ 0 ] ,
        [ x3 ]   [ t ]       [ 0 ]       [ 1 ]

s and t not both zero.
Definition: The trace of a matrix A, designated by tr(A), is the sum of the elements on the main diagonal.

Property 1: The sum of the eigenvalues of a matrix equals the trace of the matrix.

Property 2: A matrix is singular if and only if it has a zero eigenvalue.

Property 3: The eigenvalues of an upper (or lower) triangular matrix are the elements on the main diagonal.

Property 4: If λ is an eigenvalue of A and A is invertible, then 1/λ is an eigenvalue of the matrix A⁻¹.

Property 5: If λ is an eigenvalue of A, then kλ is an eigenvalue of kA, where k is any arbitrary scalar.

Property 6: If λ is an eigenvalue of A, then λᵏ is an eigenvalue of Aᵏ for any positive integer k.

Property 8: If λ is an eigenvalue of A, then λ is an eigenvalue of Aᵀ.

Property 9: The product of the eigenvalues (counting multiplicity) of a matrix equals the determinant of the matrix.
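Several of these properties can be spot-checked numerically; a minimal NumPy sketch (the random 4×4 test matrix is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
vals = np.linalg.eigvals(A)   # may be complex for a general real matrix

# Property 1: the eigenvalues sum to tr(A).
assert np.isclose(vals.sum().real, np.trace(A))

# Property 9: the eigenvalues multiply to det(A).
assert np.isclose(np.prod(vals).real, np.linalg.det(A))

# Property 8: A and A^T have the same eigenvalues.
valsT = np.linalg.eigvals(A.T)
assert np.allclose(np.sort_complex(vals), np.sort_complex(valsT))
```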
The algebraic multiplicity of an eigenvalue λ is defined as the order of the eigenvalue as a root of the characteristic equation and is denoted by multa(λ) (or Mλ).

The geometric multiplicity of λ is defined as the number of linearly independent eigenvectors corresponding to λ and is denoted by multg(λ) (or mλ).
Theorem: If A is a square matrix, then for every eigenvalue of A the algebraic multiplicity is greater than or equal to the geometric multiplicity.
The Cayley–Hamilton theorem states that every square matrix A satisfies its own characteristic equation; that is, if

    λⁿ + kₙ₋₁λⁿ⁻¹ + kₙ₋₂λⁿ⁻² + … + k₁λ + k₀ = 0

is the characteristic equation of an n×n matrix A, then

    Aⁿ + kₙ₋₁Aⁿ⁻¹ + kₙ₋₂Aⁿ⁻² + … + k₁A + k₀I = 0.
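The theorem can be verified numerically for a concrete matrix: np.poly builds the characteristic polynomial coefficients of a square array, and evaluating that polynomial at A should give the zero matrix. A sketch using the Example 1 matrix:

```python
import numpy as np

# Cayley-Hamilton check: A satisfies its own characteristic equation.
A = np.array([[2.0, -12.0],
              [1.0,  -5.0]])   # characteristic polynomial: λ² + 3λ + 2

coeffs = np.poly(A)            # [1, 3, 2], leading coefficient first
n = A.shape[0]

# Evaluate p(A) = A² + 3A + 2I.
result = sum(c * np.linalg.matrix_power(A, n - i) for i, c in enumerate(coeffs))
assert np.allclose(result, 0)
```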
If P = [ p1 p2 ], then AP = PD even if A is not diagonalizable. Since

    AP = [ −3  −2 ] [ 1  −2 ]  =  [ Ap1  Ap2 ]  =  [ −5  −4 ]
         [ −5   0 ] [ 1   5 ]                      [ −5  10 ]

       = [ −5(1)  2(−2) ]  =  [ λ1p1  λ2p2 ]  =  [ p1  p2 ] [ λ1  0  ]
         [ −5(1)  2(5)  ]                                   [ 0   λ2 ]

       = [ 1  −2 ] [ −5  0 ]  =  PD.
         [ 1   5 ] [  0  2 ]
So, our two distinct eigenvalues both have algebraic multiplicity 1 and geometric multiplicity 1. This ensures that p1 and p2 are not scalar multiples of each other; thus, p1 and p2 are linearly independent eigenvectors of A.

Since A is 2×2 and there are two linearly independent eigenvectors from the solution of the eigenvalue problem, A is diagonalizable and P⁻¹AP = D.

We can now construct P, P⁻¹, and D. Let

    P = [ p1  p2 ]  =  [ 1  −2 ]
                       [ 1   5 ]

Then

    P⁻¹ = [ 5/7   2/7 ]    and    D = [ λ1  0  ]  =  [ −5  0 ] .
          [ −1/7  1/7 ]               [ 0   λ2 ]     [  0  2 ]
Note that, if we multiply both sides of AP = PD on the left by P⁻¹, then AP = PD becomes

    P⁻¹AP = [ 5/7   2/7 ] [ −3  −2 ] [ 1  −2 ]
            [ −1/7  1/7 ] [ −5   0 ] [ 1   5 ]

          = [ 5/7   2/7 ] [ −5  −4 ]
            [ −1/7  1/7 ] [ −5  10 ]

          = [ −35/7   0/7  ]  =  [ −5  0 ]  =  D.
            [  0/7    14/7 ]     [  0  2 ]
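The whole diagonalization can be confirmed in a few lines of NumPy, using the P and D built above:

```python
import numpy as np

# Verify AP = PD and P^-1 A P = D for the 2x2 example:
# eigenvectors p1 = (1, 1), p2 = (-2, 5), eigenvalues -5 and 2.
A = np.array([[-3.0, -2.0],
              [-5.0,  0.0]])
P = np.array([[1.0, -2.0],
              [1.0,  5.0]])
D = np.diag([-5.0, 2.0])

assert np.allclose(A @ P, P @ D)                 # AP = PD
assert np.allclose(np.linalg.inv(P) @ A @ P, D)  # P^-1 A P = D
```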
Example 2: Find the eigenvalues and eigenvectors for A.

Step 1: Find the eigenvalues of A.

Recall: The determinant of a triangular matrix is the product of the elements on the diagonal. Thus, for

    A = [ 3  −4  0 ]
        [ 0   3  0 ]
        [ 0   0  1 ]

the characteristic equation of A is

    p(λ) = det(λI − A) = | λ−3   4    0  |
                         |  0   λ−3   0  |  =  (λ−3)²(λ−1)  =  0.
                         |  0    0   λ−1 |

λ1 = 1 has algebraic multiplicity 1 and λ2 = 3 has algebraic multiplicity 2.
Step 2: Use Gaussian elimination with back-substitution to solve (λI − A)x = 0. For λ1 = 1, the augmented matrix for the system is

    [ I − A | 0 ] = [ −2   4  0 | 0 ]
                    [  0  −2  0 | 0 ]
                    [  0   0  0 | 0 ]

    −(1/2)r1 → r1:   [ 1  −2  0 | 0 ]
                     [ 0  −2  0 | 0 ]
                     [ 0   0  0 | 0 ]

    −(1/2)r2 → r2:   [ 1  −2  0 | 0 ]
                     [ 0   1  0 | 0 ]
                     [ 0   0  0 | 0 ]

Column 3 is not a leading column and x3 = t is a free variable. The geometric multiplicity of λ1 = 1 is one, since there is only one free variable. x2 = 0 and x1 = 2x2 = 0.
The eigenvector corresponding to λ1 = 1 is

    x = [ x1 ]   [ 0 ]       [ 0 ]
        [ x2 ] = [ 0 ]  =  t [ 0 ] .
        [ x3 ]   [ t ]       [ 1 ]

If we choose t = 1, then

    p1 = [ 0 ]
         [ 0 ]
         [ 1 ]

is our choice for the eigenvector. B1 = {p1} is a basis for the eigenspace Eλ1, with dim(Eλ1) = 1.

The dimension of the eigenspace is 1 because the eigenvalue has only one linearly independent eigenvector. Thus, the geometric multiplicity is 1 and the algebraic multiplicity is 1 for λ1 = 1.
The augmented matrix for the system with λ2 = 3 is

    [ 3I − A | 0 ] = [ 0  4  0 | 0 ]
                     [ 0  0  0 | 0 ]
                     [ 0  0  2 | 0 ]

    (1/4)r1 → r1:   [ 0  1  0 | 0 ]
                    [ 0  0  0 | 0 ]
                    [ 0  0  2 | 0 ]

    r3 → r2, r2 → r3:   [ 0  1  0 | 0 ]
                        [ 0  0  2 | 0 ]
                        [ 0  0  0 | 0 ]

    (1/2)r2 → r2:   [ 0  1  0 | 0 ]
                    [ 0  0  1 | 0 ]
                    [ 0  0  0 | 0 ]

Column 1 is not a leading column and x1 = t is a free variable. Since there is only one free variable, the geometric multiplicity of λ2 is one.
x2 = x3 = 0, and the eigenvector corresponding to λ2 = 3 is

    x = [ x1 ]   [ t ]       [ 1 ]
        [ x2 ] = [ 0 ]  =  t [ 0 ] .
        [ x3 ]   [ 0 ]       [ 0 ]

We choose t = 1, and

    p2 = [ 1 ]
         [ 0 ]
         [ 0 ]

is our choice for the eigenvector. B2 = {p2} is a basis for the eigenspace Eλ2, with dim(Eλ2) = 1.

The dimension of the eigenspace is 1 because the eigenvalue has only one linearly independent eigenvector. Thus, the geometric multiplicity is 1 while the algebraic multiplicity is 2 for λ2 = 3. This means there are not enough linearly independent eigenvectors for A to be diagonalizable: A is not diagonalizable whenever the geometric multiplicity is less than the algebraic multiplicity for any eigenvalue.
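This defect is visible numerically: np.linalg.eig still returns three eigenvector columns, but the two columns for the repeated eigenvalue λ = 3 coincide (up to rounding), so the eigenvector matrix P is singular. A sketch:

```python
import numpy as np

# A has eigenvalue 3 with algebraic multiplicity 2 but geometric
# multiplicity 1, so no invertible eigenvector matrix exists.
A = np.array([[3.0, -4.0, 0.0],
              [0.0,  3.0, 0.0],
              [0.0,  0.0, 1.0]])
vals, P = np.linalg.eig(A)
print(vals)   # eigenvalues 3, 3, 1 (in some order)

# The columns of P are not linearly independent: rank(P) < 3,
# so P has no inverse and A is not diagonalizable.
assert np.linalg.matrix_rank(P) < 3
```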
This time,

    AP = A[ p1  p2  p2 ] = [ Ap1  Ap2  Ap2 ] = [ λ1p1  λ2p2  λ2p2 ]

       = [ p1  p2  p2 ] [ λ1  0   0  ]
                        [ 0   λ2  0  ]  =  PD.
                        [ 0   0   λ2 ]

P⁻¹ does not exist since the columns of P are not linearly independent. It is not possible to solve for D = P⁻¹AP, so A is not diagonalizable.
For A, an n×n matrix with characteristic polynomial roots λ1, λ2, …, λn for eigenvalues λi of A with corresponding eigenvectors pi,

    AP = A[ p1  p2  …  pn ] = [ Ap1  Ap2  …  Apn ] = [ λ1p1  λ2p2  …  λnpn ]

       = PD = [ p1  p2  …  pn ] [ λ1       0  ]
                                [     ⋱       ]
                                [ 0       λn  ]

P is invertible iff the eigenvectors that form its columns are linearly independent, iff dim(Eλi) = geometric multiplicity = algebraic multiplicity for each distinct λi.
Then

    P⁻¹AP = P⁻¹PD = D = [ λ1       0  ]
                        [     ⋱       ]
                        [ 0       λn  ]

This gives us n linearly independent eigenvectors for P, so P⁻¹ exists and A is diagonalizable.

The square matrices S and T are similar iff there exists a nonsingular P such that S = P⁻¹TP or PSP⁻¹ = T. Since A is similar to a diagonal matrix, A is diagonalizable.
Example 4: Solve the eigenvalue problem Ax = λx and find the eigenspace, algebraic multiplicity, and geometric multiplicity for each eigenvalue.

Step 1: Write down the characteristic equation of A and solve for its eigenvalues.

    A = [ −4  −3  6 ]
        [  0  −1  0 ]
        [ −3  −3  5 ]

Expanding |λI − A| along the second row, the cofactor is (−1)²⁺²(λ+1):

    p(λ) = |λI − A| = | λ+4   3   −6  |
                      |  0   λ+1   0  |  =  (λ+1) | λ+4  −6  |
                      |  3    3   λ−5 |           |  3   λ−5 |

    p(λ) = (λ+1)(λ+4)(λ−5) + 18(λ+1) = 0
         = (λ² + 5λ + 4)(λ−5) + 18(λ+1) = 0
         = (λ³ + 5λ² + 4λ − 5λ² − 25λ − 20) + 18λ + 18 = 0
         = λ³ − 3λ − 2 = (λ+1)(λ² − λ − 2) = (λ+1)(λ−2)(λ+1) = 0
         = (λ−2)(λ+1)² = 0.

So the eigenvalues are λ1 = 2, λ2 = −1.

Since the factor (λ−2) is first power, λ1 = 2 is not a repeated root; λ1 = 2 has an algebraic multiplicity of 1. On the other hand, the factor (λ+1) is squared, so λ2 = −1 is a repeated root, and it has an algebraic multiplicity of 2.
Step 2: Use Gaussian elimination with back-substitution to solve (λI − A)x = 0 for λ1 and λ2.

For λ1 = 2, the augmented matrix for the system is

    [ 2I − A | 0 ] = [ 6  3  −6 | 0 ]
                     [ 0  3   0 | 0 ]
                     [ 3  3  −3 | 0 ]

    (1/6)r1 → r1, (1/3)r2 → r2:   [ 1  1/2  −1 | 0 ]
                                  [ 0   1    0 | 0 ]
                                  [ 3   3   −3 | 0 ]

    −3r1 + r3 → r3:   [ 1  1/2  −1 | 0 ]
                      [ 0   1    0 | 0 ]
                      [ 0  3/2   0 | 0 ]

    −(3/2)r2 + r3 → r3:   [ 1  1/2  −1 | 0 ]
                          [ 0   1    0 | 0 ]
                          [ 0   0    0 | 0 ]

In this case, x3 = r, x2 = 0, and x1 = −(1/2)(0) + r = r.
Thus, the eigenvector corresponding to λ1 = 2 is

    x = [ x1 ]   [ r ]       [ 1 ]
        [ x2 ] = [ 0 ]  =  r [ 0 ] ,   r ≠ 0.
        [ x3 ]   [ r ]       [ 1 ]

If we choose

    p1 = [ 1 ]
         [ 0 ]
         [ 1 ]

then B1 = {p1} is a basis for the eigenspace of λ1 = 2. Eλ1 = span({p1}) and dim(Eλ1) = 1, so the geometric multiplicity is 1.

Check Ax = 2x, i.e. (2I − A)x = 0:

    [ −4  −3  6 ] [ 1 ]   [ −4 + 6 ]   [ 2 ]       [ 1 ]
    [  0  −1  0 ] [ 0 ] = [    0   ] = [ 0 ]  =  2 [ 0 ] .
    [ −3  −3  5 ] [ 1 ]   [ −3 + 5 ]   [ 2 ]       [ 1 ]
For λ2 = −1, the augmented matrix for the system is

    [ (−1)I − A | 0 ] = [ 3  3  −6 | 0 ]
                        [ 0  0   0 | 0 ]
                        [ 3  3  −6 | 0 ]

    (1/3)r1 → r1:   [ 1  1  −2 | 0 ]
                    [ 0  0   0 | 0 ]
                    [ 3  3  −6 | 0 ]

    −3r1 + r3 → r3:   [ 1  1  −2 | 0 ]
                      [ 0  0   0 | 0 ]
                      [ 0  0   0 | 0 ]

x3 = t, x2 = s, and x1 = −s + 2t. Thus, the solution has two linearly independent eigenvectors for λ2 = −1, with

    x = [ x1 ]   [ −s + 2t ]       [ −1 ]       [ 2 ]
        [ x2 ] = [    s    ]  =  s [  1 ]  +  t [ 0 ] ,   s and t not both zero.
        [ x3 ]   [    t    ]       [  0 ]       [ 1 ]
If we choose

    p2 = [ −1 ]              [ 2 ]
         [  1 ]   and   p3 = [ 0 ]
         [  0 ]              [ 1 ]

then B2 = {p2, p3} is a basis for Eλ2 = span({p2, p3}) and dim(Eλ2) = 2, so the geometric multiplicity is 2.
Thus, we have AP = PD as follows:

    [ −4  −3  6 ] [ 1  −1  2 ]   [ 1  −1  2 ] [ 2   0   0 ]
    [  0  −1  0 ] [ 0   1  0 ] = [ 0   1  0 ] [ 0  −1   0 ]
    [ −3  −3  5 ] [ 1   0  1 ]   [ 1   0  1 ] [ 0   0  −1 ]

    [ 2   1  −2 ]   [ 2   1  −2 ]
    [ 0  −1   0 ] = [ 0  −1   0 ] .
    [ 2   0  −1 ]   [ 2   0  −1 ]

Since the geometric multiplicity is equal to the algebraic multiplicity for each distinct eigenvalue, we found three linearly independent eigenvectors. The matrix A is diagonalizable since P = [p1 p2 p3] is nonsingular.
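The AP = PD identity for this 3×3 example can be confirmed directly, using the eigenvectors found above as the columns of P:

```python
import numpy as np

# Example 4: verify AP = PD with P = [p1 p2 p3] and D = diag(2, -1, -1).
A = np.array([[-4.0, -3.0, 6.0],
              [ 0.0, -1.0, 0.0],
              [-3.0, -3.0, 5.0]])
P = np.array([[1.0, -1.0, 2.0],
              [0.0,  1.0, 0.0],
              [1.0,  0.0, 1.0]])
D = np.diag([2.0, -1.0, -1.0])

assert np.allclose(A @ P, P @ D)                 # AP = PD
assert np.allclose(np.linalg.inv(P) @ A @ P, D)  # P is nonsingular, so D = P^-1 A P
```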
We can find P⁻¹ as follows:

    [ P | I ] = [ 1  −1  2 | 1  0  0 ]
                [ 0   1  0 | 0  1  0 ]
                [ 1   0  1 | 0  0  1 ]

    →  [ 1  −1   2 |  1  0  0 ]
       [ 0   1   0 |  0  1  0 ]
       [ 0   1  −1 | −1  0  1 ]

    →  [ 1  0  2 | 1  1   0 ]
       [ 0  1  0 | 0  1   0 ]
       [ 0  0  1 | 1  1  −1 ]

    →  [ 1  0  0 | −1  −1   2 ]
       [ 0  1  0 |  0   1   0 ]
       [ 0  0  1 |  1   1  −1 ]

So

    P⁻¹ = [ −1  −1   2 ]
          [  0   1   0 ] .
          [  1   1  −1 ]
AP = PD gives us A = APP⁻¹ = PDP⁻¹. Thus,

    PDP⁻¹ = [ 1  −1  2 ] [ 2   0   0 ] [ −1  −1   2 ]
            [ 0   1  0 ] [ 0  −1   0 ] [  0   1   0 ]
            [ 1   0  1 ] [ 0   0  −1 ] [  1   1  −1 ]

          = [ 2   1  −2 ] [ −1  −1   2 ]     [ −4  −3  6 ]
            [ 0  −1   0 ] [  0   1   0 ]  =  [  0  −1  0 ]  =  A.
            [ 2   0  −1 ] [  1   1  −1 ]     [ −3  −3  5 ]

Note that A and D are similar matrices.
Also,

    D = P⁻¹AP = [ −1  −1   2 ] [ −4  −3  6 ] [ 1  −1  2 ]
                [  0   1   0 ] [  0  −1  0 ] [ 0   1  0 ]
                [  1   1  −1 ] [ −3  −3  5 ] [ 1   0  1 ]

              = [ −2  −2  4 ] [ 1  −1  2 ]     [ 2   0   0 ]
                [  0  −1  0 ] [ 0   1  0 ]  =  [ 0  −1   0 ] .
                [ −1  −1  1 ] [ 1   0  1 ]     [ 0   0  −1 ]

So, A and D are similar with D = P⁻¹AP and A = PDP⁻¹.
If P is an orthogonal matrix, its inverse is its transpose, P⁻¹ = Pᵀ, since

    PᵀP = [ p1ᵀ ]
          [ p2ᵀ ] [ p1  p2  p3 ]
          [ p3ᵀ ]

        = [ p1ᵀp1  p1ᵀp2  p1ᵀp3 ]     [ p1·p1  p1·p2  p1·p3 ]
          [ p2ᵀp1  p2ᵀp2  p2ᵀp3 ]  =  [ p2·p1  p2·p2  p2·p3 ]  =  [ pi·pj ]  =  I,
          [ p3ᵀp1  p3ᵀp2  p3ᵀp3 ]     [ p3·p1  p3·p2  p3·p3 ]

because pi·pj = 0 for i ≠ j and pi·pj = 1 for i = j (i, j = 1, 2, 3). So P⁻¹ = Pᵀ.
A is a symmetric matrix if A = Aᵀ. Let A be diagonalizable, so that A = PDP⁻¹, with P orthogonal so that A = PDPᵀ. Then

    Aᵀ = (PDP⁻¹)ᵀ = (PDPᵀ)ᵀ = (Pᵀ)ᵀDᵀPᵀ = PDPᵀ = A,

consistent with A = Aᵀ. This shows that a symmetric matrix A can be diagonalized with an orthogonal P.

If P⁻¹ ≠ Pᵀ, the eigenvectors of A are mutually orthogonal but not orthonormal. This means the eigenvectors must be scaled to unit vectors so that P is orthogonal, composed of orthonormal columns.
Example 5: Determine if the symmetric matrix A is diagonalizable; if it is, then find the orthogonal matrix P that orthogonally diagonalizes the symmetric matrix A.

Let

    A = [  5  −1   0 ]
        [ −1   5   0 ]
        [  0   0  −2 ]

then, expanding along the third row (the (3,3) cofactor is (−1)³⁺³(λ+2)),

    det(λI − A) = | λ−5   1    0  |
                  |  1   λ−5   0  |  =  (−1)³⁺³(λ+2) | λ−5   1  |
                  |  0    0   λ+2 |                  |  1   λ−5 |

                = (λ+2)(λ−5)² − (λ+2)
                = λ³ − 8λ² + 4λ + 48 = (λ−4)(λ² − 4λ − 12) = (λ−4)(λ+2)(λ−6) = 0.

Thus, λ1 = 4, λ2 = −2, λ3 = 6.

Since we have three distinct eigenvalues, we will see that we are guaranteed to have three linearly independent eigenvectors.
Since λ1 = 4, λ2 = −2, and λ3 = 6 are distinct eigenvalues, each of the eigenvalues has algebraic multiplicity 1. An eigenvalue must have geometric multiplicity of at least one; otherwise, we would have only the trivial solution. Thus, we have three linearly independent eigenvectors.

We will use Gaussian elimination with back-substitution as follows:
For λ1 = 4,

    [ λ1I − A | 0 ] = [ −1   1  0 | 0 ]     [ 1  −1  0 | 0 ]     [ 1  −1  0 | 0 ]
                      [  1  −1  0 | 0 ]  →  [ 0   0  0 | 0 ]  →  [ 0   0  1 | 0 ]
                      [  0   0  6 | 0 ]     [ 0   0  1 | 0 ]     [ 0   0  0 | 0 ]

    x2 = s, x3 = 0, x1 = s:

    x = [ x1 ]       [ 1 ]              [ 1/√2 ]
        [ x2 ] =  s  [ 1 ]   or   p1 =  [ 1/√2 ] .
        [ x3 ]       [ 0 ]              [  0   ]
For λ2 = −2,

    [ λ2I − A | 0 ] = [ −7   1  0 | 0 ]     [ 1  −7  0 | 0 ]
                      [  1  −7  0 | 0 ]  →  [ 0   1  0 | 0 ]
                      [  0   0  0 | 0 ]     [ 0   0  0 | 0 ]

    x3 = s, x2 = 0, x1 = 0:

    x = [ x1 ]       [ 0 ]              [ 0 ]
        [ x2 ] =  s  [ 0 ]   or   p2 =  [ 0 ] .
        [ x3 ]       [ 1 ]              [ 1 ]
For λ3 = 6,

    [ λ3I − A | 0 ] = [ 1  1  0 | 0 ]     [ 1  1  0 | 0 ]
                      [ 1  1  0 | 0 ]  →  [ 0  0  1 | 0 ]
                      [ 0  0  8 | 0 ]     [ 0  0  0 | 0 ]

    x2 = s, x3 = 0, x1 = −s:

    x = [ x1 ]       [ −1 ]              [ −1/√2 ]
        [ x2 ] =  s  [  1 ]   or   p3 =  [  1/√2 ] .
        [ x3 ]       [  0 ]              [   0   ]
As we can see, the eigenvectors of A belong to distinct eigenvalues, so {p1, p2, p3} is linearly independent, P⁻¹ exists for P = [p1 p2 p3], and AP = PD ⇔ A = PDP⁻¹. Thus A is diagonalizable.

Since A = Aᵀ (A is a symmetric matrix) and P is orthogonal with appropriate scaling of p1, p2, p3, P⁻¹ = Pᵀ:

    PP⁻¹ = PPᵀ = [ 1/√2  0  −1/√2 ] [  1/√2  1/√2  0 ]     [ 1  0  0 ]
                 [ 1/√2  0   1/√2 ] [   0     0    1 ]  =  [ 0  1  0 ]  =  I.
                 [  0    1    0   ] [ −1/√2  1/√2  0 ]     [ 0  0  1 ]
    PDPᵀ = [ 1/√2  0  −1/√2 ] [ 4   0  0 ] [  1/√2  1/√2  0 ]
           [ 1/√2  0   1/√2 ] [ 0  −2  0 ] [   0     0    1 ]
           [  0    1    0   ] [ 0   0  6 ] [ −1/√2  1/√2  0 ]

         = [ 4/√2   0  −6/√2 ] [  1/√2  1/√2  0 ]     [  5  −1   0 ]
           [ 4/√2   0   6/√2 ] [   0     0    1 ]  =  [ −1   5   0 ]  =  A.
           [  0    −2    0   ] [ −1/√2  1/√2  0 ]     [  0   0  −2 ]

Note that A and D are similar matrices.
Also, D = P⁻¹AP = PᵀAP:

    PᵀAP = [  1/√2  1/√2  0 ] [  5  −1   0 ] [ 1/√2  0  −1/√2 ]
           [   0     0    1 ] [ −1   5   0 ] [ 1/√2  0   1/√2 ]
           [ −1/√2  1/√2  0 ] [  0   0  −2 ] [  0    1    0   ]

         = [  4/√2  4/√2   0 ] [ 1/√2  0  −1/√2 ]     [ 4   0  0 ]
           [   0     0    −2 ] [ 1/√2  0   1/√2 ]  =  [ 0  −2  0 ]  =  D.
           [ −6/√2  6/√2   0 ] [  0    1    0   ]     [ 0   0  6 ]

So, A and D are similar with D = PᵀAP and A = PDPᵀ.
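For symmetric matrices, NumPy's eigh routine performs this orthogonal diagonalization directly, returning an orthonormal eigenvector matrix. A sketch using the Example 5 matrix:

```python
import numpy as np

# np.linalg.eigh is specialized for symmetric (Hermitian) matrices and
# returns eigenvalues in ascending order with orthonormal eigenvectors.
A = np.array([[ 5.0, -1.0,  0.0],
              [-1.0,  5.0,  0.0],
              [ 0.0,  0.0, -2.0]])
vals, P = np.linalg.eigh(A)

assert np.allclose(np.sort(vals), [-2.0, 4.0, 6.0])
assert np.allclose(P.T @ P, np.eye(3))          # P is orthogonal: P^T P = I
assert np.allclose(P.T @ A @ P, np.diag(vals))  # P^T A P = D
```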
Checking the transpose directly (Dᵀ = D since D is diagonal):

    Aᵀ = (PDP⁻¹)ᵀ = (PDPᵀ)ᵀ = (Pᵀ)ᵀDᵀPᵀ = PDᵀPᵀ

       = [ 1/√2  0  −1/√2 ] [ 4   0  0 ]ᵀ [  1/√2  1/√2  0 ]
         [ 1/√2  0   1/√2 ] [ 0  −2  0 ]  [   0     0    1 ]
         [  0    1    0   ] [ 0   0  6 ]  [ −1/√2  1/√2  0 ]

       = [ 4/√2   0  −6/√2 ] [  1/√2  1/√2  0 ]     [  5  −1   0 ]
         [ 4/√2   0   6/√2 ] [   0     0    1 ]  =  [ −1   5   0 ]  =  A.
         [  0    −2    0   ] [ −1/√2  1/√2  0 ]     [  0   0  −2 ]

This shows that if A is a symmetric matrix, P must be orthogonal with P⁻¹ = Pᵀ.
From Example 1, the diagonal matrix for matrix A is:

    D = [ λ1  0  ]  =  [ −5  0 ]  =  P⁻¹AP   ⇒   A = PDP⁻¹,
        [ 0   λ2 ]     [  0  2 ]

and

    A³ = PDP⁻¹PDP⁻¹PDP⁻¹ = PD³P⁻¹

       = [ 1  −2 ] [ (−5)³   0  ] [ 5/7   2/7 ]
         [ 1   5 ] [   0     2³ ] [ −1/7  1/7 ]

       = [ −125  −16 ] [ 5/7   2/7 ]     [ −609/7  −266/7 ]     [ −87  −38 ]
         [ −125   40 ] [ −1/7  1/7 ]  =  [ −665/7  −210/7 ]  =  [ −95  −30 ] .

For A³, the eigenvalues are λ1³ = −125 and λ2³ = 8.

In general, the power of a matrix is Aᵏ = PDᵏP⁻¹, and its eigenvalues are λiᵏ, where λi is on the main diagonal of D.
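The Aᵏ = PDᵏP⁻¹ shortcut can be verified against direct repeated multiplication:

```python
import numpy as np

# Compute A^3 via the eigendecomposition: A^k = P D^k P^-1.
A = np.array([[-3.0, -2.0],
              [-5.0,  0.0]])
P = np.array([[1.0, -2.0],
              [1.0,  5.0]])
D = np.diag([-5.0, 2.0])

A3 = P @ np.linalg.matrix_power(D, 3) @ np.linalg.inv(P)

# Agrees with multiplying A out directly: A^3 = [[-87, -38], [-95, -30]].
assert np.allclose(A3, np.linalg.matrix_power(A, 3))
assert np.allclose(A3, [[-87.0, -38.0], [-95.0, -30.0]])
```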
 
Work Experience-Dalton Park.pptxfvvvvvvv
Work Experience-Dalton Park.pptxfvvvvvvvWork Experience-Dalton Park.pptxfvvvvvvv
Work Experience-Dalton Park.pptxfvvvvvvvLewisJB
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 

Kürzlich hochgeladen (20)

young call girls in Green Park🔝 9953056974 🔝 escort Service
young call girls in Green Park🔝 9953056974 🔝 escort Serviceyoung call girls in Green Park🔝 9953056974 🔝 escort Service
young call girls in Green Park🔝 9953056974 🔝 escort Service
 
Concrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptxConcrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptx
 
Arduino_CSE ece ppt for working and principal of arduino.ppt
Arduino_CSE ece ppt for working and principal of arduino.pptArduino_CSE ece ppt for working and principal of arduino.ppt
Arduino_CSE ece ppt for working and principal of arduino.ppt
 
Correctly Loading Incremental Data at Scale
Correctly Loading Incremental Data at ScaleCorrectly Loading Incremental Data at Scale
Correctly Loading Incremental Data at Scale
 
Past, Present and Future of Generative AI
Past, Present and Future of Generative AIPast, Present and Future of Generative AI
Past, Present and Future of Generative AI
 
Introduction-To-Agricultural-Surveillance-Rover.pptx
Introduction-To-Agricultural-Surveillance-Rover.pptxIntroduction-To-Agricultural-Surveillance-Rover.pptx
Introduction-To-Agricultural-Surveillance-Rover.pptx
 
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsyncWhy does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
 
🔝9953056974🔝!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
🔝9953056974🔝!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...🔝9953056974🔝!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
🔝9953056974🔝!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
 
8251 universal synchronous asynchronous receiver transmitter
8251 universal synchronous asynchronous receiver transmitter8251 universal synchronous asynchronous receiver transmitter
8251 universal synchronous asynchronous receiver transmitter
 
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile service
 
POWER SYSTEMS-1 Complete notes examples
POWER SYSTEMS-1 Complete notes  examplesPOWER SYSTEMS-1 Complete notes  examples
POWER SYSTEMS-1 Complete notes examples
 
Indian Dairy Industry Present Status and.ppt
Indian Dairy Industry Present Status and.pptIndian Dairy Industry Present Status and.ppt
Indian Dairy Industry Present Status and.ppt
 
Solving The Right Triangles PowerPoint 2.ppt
Solving The Right Triangles PowerPoint 2.pptSolving The Right Triangles PowerPoint 2.ppt
Solving The Right Triangles PowerPoint 2.ppt
 
Electronically Controlled suspensions system .pdf
Electronically Controlled suspensions system .pdfElectronically Controlled suspensions system .pdf
Electronically Controlled suspensions system .pdf
 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girls
 
Risk Assessment For Installation of Drainage Pipes.pdf
Risk Assessment For Installation of Drainage Pipes.pdfRisk Assessment For Installation of Drainage Pipes.pdf
Risk Assessment For Installation of Drainage Pipes.pdf
 
lifi-technology with integration of IOT.pptx
lifi-technology with integration of IOT.pptxlifi-technology with integration of IOT.pptx
lifi-technology with integration of IOT.pptx
 
Work Experience-Dalton Park.pptxfvvvvvvv
Work Experience-Dalton Park.pptxfvvvvvvvWork Experience-Dalton Park.pptxfvvvvvvv
Work Experience-Dalton Park.pptxfvvvvvvv
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
 

Eigenvalues and eigenvectors

  • 1.
  • 2. A nonzero vector x is an eigenvector (or characteristic vector) of a square matrix A if there exists a scalar λ such that Ax = λx. Then λ is an eigenvalue (or characteristic value) of A. Note: the zero vector cannot be an eigenvector even though A0 = λ0, but λ = 0 can be an eigenvalue. Example: show x = [2; 1] is an eigenvector of A = [2 −4; 3 −6]. Solution: Ax = [2 −4; 3 −6][2; 1] = [0; 0], and for λ = 0, λx = 0·[2; 1] = [0; 0]. Thus Ax = 0·x, so x is an eigenvector of A, and λ = 0 is an eigenvalue.
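The definition can be checked numerically; a minimal sketch with numpy, using the matrix and vector from the slide's example:

```python
import numpy as np

# A and x from the slide's example; λ = 0 is the claimed eigenvalue.
A = np.array([[2, -4],
              [3, -6]])
x = np.array([2, 1])
lam = 0

# Ax should equal λx = 0·x, i.e. the zero vector.
print(A @ x)                         # [0 0]
print(np.allclose(A @ x, lam * x))   # True
```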
  • 3. An n×n matrix A multiplied by an n×1 vector x yields another n×1 vector y = Ax, so A can be viewed as a transformation matrix. In general, a matrix acts on a vector by changing both its magnitude and its direction. However, a matrix may act on certain vectors by changing only their magnitude and leaving their direction unchanged (or possibly reversing it). These vectors are the eigenvectors of the matrix. A matrix acts on an eigenvector by multiplying its magnitude by a factor, which is positive if the direction is unchanged and negative if the direction is reversed. This factor is the eigenvalue associated with that eigenvector.
  • 4. Example 1: find the eigenvalues of A = [2 −12; 1 −5]. |λI − A| = |λ−2 12; −1 λ+5| = (λ−2)(λ+5) + 12 = λ² + 3λ + 2 = (λ+1)(λ+2), giving two eigenvalues: −1 and −2. Note: the roots of the characteristic equation can be repeated, that is, λ1 = λ2 = … = λk. If that happens, the eigenvalue is said to be of multiplicity k. Example 2: find the eigenvalues of A = [2 1 0; 0 2 0; 0 0 2]. |λI − A| = |λ−2 −1 0; 0 λ−2 0; 0 0 λ−2| = (λ−2)³ = 0, so λ = 2 is an eigenvalue of multiplicity 3.
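Both characteristic-equation results can be confirmed with numpy's eigenvalue routine (a sketch; np.linalg.eigvals returns the roots of det(λI − A) = 0):

```python
import numpy as np

# Example 1's matrix: characteristic polynomial λ² + 3λ + 2.
A1 = np.array([[2., -12.],
               [1.,  -5.]])
print(np.sort(np.linalg.eigvals(A1)))   # the two eigenvalues, -2 and -1

# Example 2's matrix: (λ - 2)³ = 0.
A2 = np.array([[2., 1., 0.],
               [0., 2., 0.],
               [0., 0., 2.]])
print(np.linalg.eigvals(A2))            # 2 repeated three times (multiplicity 3)
```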
  • 5. 5
  • 6. 6
  • 7. 7
  • 8. Example 1 (cont.): λ1 = −1: (−1)I − A = [−3 12; −1 4] → [1 −4; 0 0]. Then x1 − 4x2 = 0 ⇒ x1 = 4t, x2 = t, so x = [x1; x2] = t[4; 1], t ≠ 0. λ2 = −2: (−2)I − A = [−4 12; −1 3] → [1 −3; 0 0], so x = [x1; x2] = s[3; 1], s ≠ 0. To each distinct eigenvalue of a matrix A there corresponds at least one eigenvector, which can be found by solving the appropriate set of homogeneous equations: if λi is an eigenvalue, then the corresponding eigenvector xi is a solution of (A − λiI)xi = 0.
  • 9. Example 2 (cont.): find the eigenvectors of A = [2 1 0; 0 2 0; 0 0 2]. Recall that λ = 2 is an eigenvalue of multiplicity 3. Solve the homogeneous linear system (2I − A)x = [0 −1 0; 0 0 0; 0 0 0][x1; x2; x3] = [0; 0; 0]. Let x1 = s and x3 = t. The eigenvectors of λ = 2 are of the form x = [x1; x2; x3] = [s; 0; t] = s[1; 0; 0] + t[0; 0; 1], with s and t not both zero.
  • 10. 10
  • 11. 11
  • 12. Definition: the trace of a matrix A, designated by tr(A), is the sum of the elements on its main diagonal. Property 1: the sum of the eigenvalues of a matrix equals the trace of the matrix. Property 2: a matrix is singular if and only if it has a zero eigenvalue. Property 3: the eigenvalues of an upper (or lower) triangular matrix are the elements on its main diagonal. Property 4: if λ is an eigenvalue of A and A is invertible, then 1/λ is an eigenvalue of A⁻¹.
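Properties 1 and 4 above, plus Property 9 from the next slide, are easy to spot-check numerically; a sketch (the 2×2 test matrix is an arbitrary illustration, not from the slides):

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])                        # arbitrary invertible test matrix
w = np.linalg.eigvals(A)                        # eigenvalues of A

print(np.isclose(w.sum(), np.trace(A)))         # Property 1: Σλi = tr(A)
print(np.isclose(w.prod(), np.linalg.det(A)))   # Property 9: Πλi = det(A)

w_inv = np.linalg.eigvals(np.linalg.inv(A))     # Property 4: the eigenvalues
print(np.allclose(np.sort(w_inv), np.sort(1 / w)))  # of A⁻¹ are 1/λi
```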
  • 13. Property 5: if λ is an eigenvalue of A, then kλ is an eigenvalue of kA for any scalar k. Property 6: if λ is an eigenvalue of A, then λᵏ is an eigenvalue of Aᵏ for any positive integer k. Property 8: if λ is an eigenvalue of A, then λ is an eigenvalue of Aᵀ. Property 9: the product of the eigenvalues (counting multiplicity) of a matrix equals the determinant of the matrix.
  • 14. The algebraic multiplicity of an eigenvalue λ is defined as the order of λ as a root of the characteristic equation and is denoted by multa(λ) (or Mλ). The geometric multiplicity of λ is defined as the number of linearly independent eigenvectors corresponding to λ and is denoted by multg(λ) (or mλ). Theorem: if A is a square matrix, then for every eigenvalue of A the algebraic multiplicity is greater than or equal to the geometric multiplicity.
  • 15. This theorem (the Cayley–Hamilton theorem) states that every square matrix A satisfies its own characteristic equation; that is, if λⁿ + kₙ₋₁λⁿ⁻¹ + kₙ₋₂λⁿ⁻² + … + k₁λ + k₀ = 0 is the characteristic equation of an n×n matrix A, then Aⁿ + kₙ₋₁Aⁿ⁻¹ + kₙ₋₂Aⁿ⁻² + … + k₁A + k₀I = 0.
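A numerical sanity check of this statement (a sketch): the 2×2 matrix of Example 1 has characteristic polynomial λ² + 3λ + 2, so substituting the matrix into it should give the zero matrix.

```python
import numpy as np

A = np.array([[2., -12.],
              [1.,  -5.]])          # characteristic polynomial: λ² + 3λ + 2

# Substitute A into its own characteristic polynomial.
residual = A @ A + 3 * A + 2 * np.eye(2)
print(residual)                     # the 2×2 zero matrix
```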
  • 16. If P = [p1 p2], then AP = PD even if A is not diagonalizable, since AP = [−3 −2; −5 0][1 −2; 1 5] = [Ap1 Ap2] = [−5 −4; −5 10] = [−5(1) 2(−2); −5(1) 2(5)] = [λ1p1 λ2p2] = [p1 p2][λ1 0; 0 λ2] = [1 −2; 1 5][−5 0; 0 2] = PD.
  • 17. So our two distinct eigenvalues both have algebraic multiplicity 1 and geometric multiplicity 1. This ensures that p1 and p2 are not scalar multiples of each other; thus p1 and p2 are linearly independent eigenvectors of A. Since A is 2×2 and there are two linearly independent eigenvectors from the solution of the eigenvalue problem, A is diagonalizable and P⁻¹AP = D. We can now construct P, P⁻¹ and D: P = [p1 p2] = [1 −2; 1 5], P⁻¹ = [5/7 2/7; −1/7 1/7], and D = [λ1 0; 0 λ2] = [−5 0; 0 2].
  • 18. Note that if we multiply both sides of AP = PD on the left by P⁻¹, it becomes P⁻¹AP = [5/7 2/7; −1/7 1/7][−5 −4; −5 10] = [−35/7 0/7; 0/7 14/7] = [−5 0; 0 2] = D.
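The whole computation can be reproduced with numpy (a sketch). np.linalg.eig normalizes eigenvectors to unit length, so its P differs from the slide's P by column scaling, but P⁻¹AP yields the same diagonal matrix:

```python
import numpy as np

A = np.array([[-3., -2.],
              [-5.,  0.]])
w, P = np.linalg.eig(A)           # eigenvalues and eigenvector columns
D = np.linalg.inv(P) @ A @ P      # similarity transform P⁻¹AP

print(np.sort(w))                 # eigenvalues -5 and 2
print(np.round(D, 10))            # diagonal, with w on the main diagonal
```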
  • 19. Example 2: find the eigenvalues and eigenvectors of A = [3 −4 0; 0 3 0; 0 0 1]. Step 1: find the eigenvalues of A. Recall that the determinant of a triangular matrix is the product of the elements on its diagonal. Thus the characteristic equation of A is p(λ) = det(λI − A) = |λ−3 4 0; 0 λ−3 0; 0 0 λ−1| = (λ−3)²(λ−1) = 0. λ1 = 1 has algebraic multiplicity 1 and λ2 = 3 has algebraic multiplicity 2.
  • 20. Step 2: use Gaussian elimination with back-substitution to solve (λI − A)x = 0. For λ1 = 1, the augmented matrix for the system is [I − A | 0] = [−2 4 0 | 0; 0 −2 0 | 0; 0 0 0 | 0] → [1 −2 0 | 0; 0 1 0 | 0; 0 0 0 | 0]. Column 3 is not a leading column, so x3 = t is a free variable. The geometric multiplicity of λ1 = 1 is one, since there is only one free variable. Back-substitution gives x2 = 0 and x1 = 2x2 = 0.
  • 21. The eigenvector corresponding to λ1 = 1 is x = [x1; x2; x3] = [0; 0; t] = t[0; 0; 1]. If we choose t = 1, then p1 = [0; 0; 1] is our choice for the eigenvector, and B1 = {p1} is a basis for the eigenspace Eλ1, with dim(Eλ1) = 1. The dimension of the eigenspace is 1 because the eigenvalue has only one linearly independent eigenvector. Thus the geometric multiplicity is 1 and the algebraic multiplicity is 1 for λ1 = 1.
  • 22. The augmented matrix for the system with λ2 = 3 is [3I − A | 0] = [0 4 0 | 0; 0 0 0 | 0; 0 0 2 | 0] → [0 1 0 | 0; 0 0 1 | 0; 0 0 0 | 0]. Column 1 is not a leading column, so x1 = t is a free variable. Since there is only one free variable, the geometric multiplicity of λ2 is one.
  • 23. x2 = x3 = 0, and the eigenvector corresponding to λ2 = 3 is x = [x1; x2; x3] = [t; 0; 0] = t[1; 0; 0]. We choose t = 1, so p2 = [1; 0; 0] is our choice for the eigenvector, and B2 = {p2} is a basis for the eigenspace Eλ2, with dim(Eλ2) = 1. The dimension of the eigenspace is 1 because the eigenvalue has only one linearly independent eigenvector. Thus the geometric multiplicity is 1 while the algebraic multiplicity is 2 for λ2 = 3. This means there are not enough linearly independent eigenvectors for A to be diagonalizable: A is not diagonalizable whenever the geometric multiplicity is less than the algebraic multiplicity for any eigenvalue.
  • 24. This time, AP = A[p1 p2 p2] = [Ap1 Ap2 Ap2] = [λ1p1 λ2p2 λ2p2] = [p1 p2 p2][λ1 0 0; 0 λ2 0; 0 0 λ2] = PD, but P⁻¹ does not exist since the columns of P are not linearly independent. It is not possible to solve for D = P⁻¹AP, so A is not diagonalizable.
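The failure can also be seen numerically (a sketch): the geometric multiplicity of λ = 3 is n − rank(A − 3I) = 1, which is less than its algebraic multiplicity 2, so A is defective.

```python
import numpy as np

A = np.array([[3., -4., 0.],
              [0.,  3., 0.],
              [0.,  0., 1.]])

# Geometric multiplicity of λ = 3: dimension of the null space of A - 3I.
gm = 3 - np.linalg.matrix_rank(A - 3 * np.eye(3))
print(gm)   # 1 — but the algebraic multiplicity is 2, so A is not diagonalizable
```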
  • 25. For A, an n×n matrix with characteristic polynomial roots λ1, λ2, …, λn as its eigenvalues and corresponding eigenvectors p1, p2, …, pn: AP = A[p1 p2 … pn] = [Ap1 Ap2 … Apn] = [λ1p1 λ2p2 … λnpn] = [p1 p2 … pn] diag(λ1, …, λn) = PD. P is invertible iff the eigenvectors that form its columns are linearly independent, which holds iff dim(Eλi) = geometric multiplicity = algebraic multiplicity for each distinct λi.
  • 26. This gives us n linearly independent eigenvectors for P, so P⁻¹ exists. Therefore A is diagonalizable, since P⁻¹AP = P⁻¹PD = D = diag(λ1, …, λn). The square matrices S and T are similar iff there exists a nonsingular P such that S = P⁻¹TP, or PSP⁻¹ = T. Since A is similar to a diagonal matrix, A is diagonalizable.
  • 27. Example 4: solve the eigenvalue problem Ax = λx and find the eigenspace, algebraic multiplicity, and geometric multiplicity for each eigenvalue of A = [−4 −3 6; 0 −1 0; −3 −3 5]. Step 1: write down the characteristic equation of A and solve for its eigenvalues: p(λ) = |λI − A| = |λ+4 3 −6; 0 λ+1 0; 3 3 λ−5| = (λ+1)|λ+4 −6; 3 λ−5| (expanding along the second row).
  • 28. p(λ) = (λ+1)(λ+4)(λ−5) + 18(λ+1) = (λ² + 5λ + 4)(λ−5) + 18(λ+1) = (λ³ + 5λ² + 4λ − 5λ² − 25λ − 20) + 18λ + 18 = λ³ − 3λ − 2 = (λ+1)(λ² − λ − 2) = (λ+1)(λ−2)(λ+1) = (λ−2)(λ+1)² = 0. So the eigenvalues are λ1 = 2 and λ2 = −1. Since the factor (λ−2) appears to the first power, λ1 = 2 is not a repeated root; it has algebraic multiplicity 1. On the other hand, the factor (λ+1) is squared, so λ2 = −1 is a repeated root with algebraic multiplicity 2.
  • 29. Step 2: use Gaussian elimination with back-substitution to solve (λI − A)x = 0 for λ1 and λ2. For λ1 = 2, the augmented matrix for the system is [2I − A | 0] = [6 3 −6 | 0; 0 3 0 | 0; 3 3 −3 | 0] ~ [1 1/2 −1 | 0; 0 1 0 | 0; 3 3 −3 | 0] ~ [1 1/2 −1 | 0; 0 1 0 | 0; 0 3/2 0 | 0] ~ [1 1/2 −1 | 0; 0 1 0 | 0; 0 0 0 | 0]. In this case x3 = r, x2 = 0, and x1 = −(1/2)(0) + r = r.
  • 30. Thus the eigenvector corresponding to λ1 = 2 is x = [x1; x2; x3] = [r; 0; r] = r[1; 0; 1], r ≠ 0. If we choose p1 = [1; 0; 1], then B1 = {[1; 0; 1]} is a basis for the eigenspace of λ1 = 2: Eλ1 = span({p1}) and dim(Eλ1) = 1, so the geometric multiplicity is 1. Check Ax = 2x: [−4 −3 6; 0 −1 0; −3 −3 5][1; 0; 1] = [−4+6; 0; −3+5] = [2; 0; 2] = 2[1; 0; 1].
  • 31. For λ2 = −1, the augmented matrix for the system is [(−1)I − A | 0] = [3 3 −6 | 0; 0 0 0 | 0; 3 3 −6 | 0] ~ [1 1 −2 | 0; 0 0 0 | 0; 0 0 0 | 0]. So x3 = t, x2 = s, and x1 = −s + 2t. Thus the solution has two linearly independent eigenvectors for λ2 = −1, with x = [x1; x2; x3] = [−s + 2t; s; t] = s[−1; 1; 0] + t[2; 0; 1], s and t not both zero.
  • 32. If we choose p2 = [−1; 1; 0] and p3 = [2; 0; 1], then B2 = {[−1; 1; 0], [2; 0; 1]} is a basis for Eλ2 = span({p2, p3}) and dim(Eλ2) = 2, so the geometric multiplicity is 2.
  • 33. Thus we have AP = PD as follows: [−4 −3 6; 0 −1 0; −3 −3 5][1 −1 2; 0 1 0; 1 0 1] = [2 1 −2; 0 −1 0; 2 0 −1], and [1 −1 2; 0 1 0; 1 0 1][2 0 0; 0 −1 0; 0 0 −1] = [2 1 −2; 0 −1 0; 2 0 −1]. Since the geometric multiplicity equals the algebraic multiplicity for each distinct eigenvalue, we found three linearly independent eigenvectors. The matrix A is diagonalizable, since P = [p1 p2 p3] is nonsingular.
  • 34. We can find P⁻¹ as follows: [P | I] = [1 −1 2 | 1 0 0; 0 1 0 | 0 1 0; 1 0 1 | 0 0 1] → [1 −1 2 | 1 0 0; 0 1 0 | 0 1 0; 0 1 −1 | −1 0 1] → [1 0 2 | 1 1 0; 0 1 0 | 0 1 0; 0 0 1 | 1 1 −1] → [1 0 0 | −1 −1 2; 0 1 0 | 0 1 0; 0 0 1 | 1 1 −1]. So P⁻¹ = [−1 −1 2; 0 1 0; 1 1 −1].
  • 35. AP = PD gives us A = APP⁻¹ = PDP⁻¹. Thus PDP⁻¹ = [1 −1 2; 0 1 0; 1 0 1][2 0 0; 0 −1 0; 0 0 −1][−1 −1 2; 0 1 0; 1 1 −1] = [2 1 −2; 0 −1 0; 2 0 −1][−1 −1 2; 0 1 0; 1 1 −1] = [−4 −3 6; 0 −1 0; −3 −3 5] = A. Note that A and D are similar matrices.
  • 36. Also, D = P⁻¹AP = [−1 −1 2; 0 1 0; 1 1 −1][−4 −3 6; 0 −1 0; −3 −3 5][1 −1 2; 0 1 0; 1 0 1] = [−2 −2 4; 0 −1 0; −1 −1 1][1 −1 2; 0 1 0; 1 0 1] = [2 0 0; 0 −1 0; 0 0 −1]. So A and D are similar, with D = P⁻¹AP and A = PDP⁻¹.
  • 37. If P is an orthogonal matrix, its inverse is its transpose: P⁻¹ = Pᵀ. Since PᵀP = [p1ᵀ; p2ᵀ; p3ᵀ][p1 p2 p3] = [pi·pj] = I, because pi·pj = 0 for i ≠ j and pi·pj = 1 for i = j (i, j = 1, 2, 3), we have P⁻¹ = Pᵀ.
  • 38. A is a symmetric matrix if A = Aᵀ. Let A be diagonalizable, so that A = PDP⁻¹. If P is orthogonal, then Aᵀ = (PDP⁻¹)ᵀ = (PDPᵀ)ᵀ = (Pᵀ)ᵀDᵀPᵀ = PDPᵀ = A. This shows that for a symmetric matrix A to be diagonalizable in this way, P must be orthogonal; if P⁻¹ ≠ Pᵀ, then A ≠ Aᵀ. The eigenvectors of A are mutually orthogonal but not necessarily orthonormal. This means the eigenvectors must be scaled to unit vectors so that P is orthogonal, composed of orthonormal columns.
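For symmetric matrices, numpy offers np.linalg.eigh, which returns an eigenvector matrix that is already orthonormal; a sketch (the test matrix is the symmetric A of Example 5 on the next slide):

```python
import numpy as np

A = np.array([[ 5., -1.,  0.],
              [-1.,  5.,  0.],
              [ 0.,  0., -2.]])   # symmetric: A == A.T
w, P = np.linalg.eigh(A)          # eigh returns orthonormal eigenvector columns

print(np.allclose(P.T @ P, np.eye(3)))       # P is orthogonal: Pᵀ = P⁻¹
print(np.allclose(P @ np.diag(w) @ P.T, A))  # A = P D Pᵀ
print(w)                                     # eigenvalues in ascending order
```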
  • 39. Example 5: determine whether the symmetric matrix A is diagonalizable; if it is, find the orthogonal matrix P that orthogonally diagonalizes A. Let A = [5 −1 0; −1 5 0; 0 0 −2]. Then det(λI − A) = |λ−5 1 0; 1 λ−5 0; 0 0 λ+2| = (λ+2)|λ−5 1; 1 λ−5| = (λ+2)(λ−5)² − (λ+2) = λ³ − 8λ² + 4λ + 48 = (λ−4)(λ² − 4λ − 12) = (λ−4)(λ+2)(λ−6) = 0. Thus λ1 = 4, λ2 = −2, λ3 = 6. Since we have three distinct eigenvalues, we are guaranteed three linearly independent eigenvectors.
  • 40. Since λ1 = 4, λ2 = −2, and λ3 = 6 are distinct eigenvalues, each has algebraic multiplicity 1. An eigenvalue must have geometric multiplicity of at least one; otherwise we would have only the trivial solution. Thus we have three linearly independent eigenvectors. We find them using Gaussian elimination with back-substitution, as follows:
  • 41. For λ1 = 4: [λ1I − A | 0] = [−1 1 0 | 0; 1 −1 0 | 0; 0 0 6 | 0] → [1 −1 0 | 0; 0 0 1 | 0; 0 0 0 | 0]. x2 = s, x3 = 0, x1 = s, so x = s[1; 1; 0], and after normalizing, p1 = [1/√2; 1/√2; 0].
  • 42. For λ2 = −2: [λ2I − A | 0] = [−7 1 0 | 0; 1 −7 0 | 0; 0 0 0 | 0] → [1 −7 0 | 0; 0 1 0 | 0; 0 0 0 | 0]. x3 = s, x2 = 0, x1 = 0, so x = s[0; 0; 1] and p2 = [0; 0; 1].
  • 43. For λ3 = 6: [λ3I − A | 0] = [1 1 0 | 0; 1 1 0 | 0; 0 0 8 | 0] → [1 1 0 | 0; 0 0 1 | 0; 0 0 0 | 0]. x2 = s, x3 = 0, x1 = −s, so x = s[−1; 1; 0], and after normalizing, p3 = [−1/√2; 1/√2; 0].
  • 44. As we can see, the eigenvalues of A are distinct, so {p1, p2, p3} is linearly independent, P⁻¹ exists for P = [p1 p2 p3], and AP = PD ⇔ A = PDP⁻¹. Thus A is diagonalizable. Since A = Aᵀ (A is a symmetric matrix) and P is orthogonal after appropriate scaling of p1, p2, p3, we have P⁻¹ = Pᵀ: PP⁻¹ = PPᵀ = [1/√2 0 −1/√2; 1/√2 0 1/√2; 0 1 0][1/√2 1/√2 0; 0 0 1; −1/√2 1/√2 0] = [1 0 0; 0 1 0; 0 0 1] = I.
  • 46. PDP⁻¹ = PDPᵀ = [1/√2 0 −1/√2; 1/√2 0 1/√2; 0 1 0][4 0 0; 0 −2 0; 0 0 6][1/√2 1/√2 0; 0 0 1; −1/√2 1/√2 0] = [4/√2 0 −6/√2; 4/√2 0 6/√2; 0 −2 0][1/√2 1/√2 0; 0 0 1; −1/√2 1/√2 0] = [5 −1 0; −1 5 0; 0 0 −2] = A. Note that A and D are similar matrices.
  • 47. Also, D = P⁻¹AP = PᵀAP = [1/√2 1/√2 0; 0 0 1; −1/√2 1/√2 0][5 −1 0; −1 5 0; 0 0 −2][1/√2 0 −1/√2; 1/√2 0 1/√2; 0 1 0] = [4/√2 4/√2 0; 0 0 −2; −6/√2 6/√2 0][1/√2 0 −1/√2; 1/√2 0 1/√2; 0 1 0] = [4 0 0; 0 −2 0; 0 0 6]. So A and D are similar, with D = PᵀAP and A = PDPᵀ.
  • 48. Aᵀ = (PDP⁻¹)ᵀ = (PDPᵀ)ᵀ = (Pᵀ)ᵀDᵀPᵀ = PDᵀPᵀ = [1/√2 0 −1/√2; 1/√2 0 1/√2; 0 1 0][4 0 0; 0 −2 0; 0 0 6][1/√2 1/√2 0; 0 0 1; −1/√2 1/√2 0] = [4/√2 0 −6/√2; 4/√2 0 6/√2; 0 −2 0][1/√2 1/√2 0; 0 0 1; −1/√2 1/√2 0] = [5 −1 0; −1 5 0; 0 0 −2] = A. This shows that if A is a symmetric matrix, P must be orthogonal with P⁻¹ = Pᵀ.
  • 49. From Example 1, the diagonal matrix for matrix A is D = [λ1 0; 0 λ2] = [−5 0; 0 2], with P⁻¹AP = D ⇒ A = PDP⁻¹, and A³ = PDP⁻¹PDP⁻¹PDP⁻¹ = PD³P⁻¹ = [1 −2; 1 5][(−5)³ 0; 0 2³][5/7 2/7; −1/7 1/7] = [−125 −16; −125 40][5/7 2/7; −1/7 1/7] = [−609/7 −266/7; −665/7 −210/7] = [−87 −38; −95 −30]. For A³ the eigenvalues are λ1³ = −125 and λ2³ = 8. In general, the power of a matrix is Aᵏ = PDᵏP⁻¹, and its eigenvalues are λiᵏ, where λi is on the main diagonal of D.
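A sketch of the Aᵏ = PDᵏP⁻¹ shortcut in numpy, using the matrix whose diagonalization was worked out earlier; the result is checked against direct multiplication:

```python
import numpy as np

A = np.array([[-3., -2.],
              [-5.,  0.]])
w, P = np.linalg.eig(A)            # eigenvalues -5 and 2, eigenvector columns

# Cube A by cubing only the diagonal entries, then undoing the similarity.
A3 = P @ np.diag(w ** 3) @ np.linalg.inv(P)

print(np.allclose(A3, A @ A @ A))  # True: matches the direct product A·A·A
print(np.round(A3))
```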
  • 50. 50