J.Baskar Babujee
Department of Mathematics
Anna University, Chennai-600 025.
MA6151
CONTENT
1.1 INTRODUCTION
1.2 TYPES OF MATRICES
1.3 CHARACTERISTIC EQUATION
1.4 EIGEN VECTORS
1.5 PROBLEMS
1.6 CAYLEY HAMILTON THEOREM
1.7 DIAGONALISATION OF A MATRIX
1.8 REDUCTION OF A MATRIX TO DIAGONAL FORM
1.9 ORTHOGONAL TRANSFORMATION OF A SYMMETRIC MATRIX TO DIAGONAL FORM
1.10 QUADRATIC FORMS
1.11 NATURE OF A QUADRATIC FORM
1.12 REDUCTION OF QUADRATIC FORM TO CANONICAL FORM
1.13 INDEX AND SIGNATURE OF THE QUADRATIC FORM
1.14 LINEAR TRANSFORMATION OF A QUADRATIC FORM
1.15 REDUCTION OF QUADRATIC FORM TO CANONICAL FORM BY ORTHOGONAL TRANSFORMATION
Unit 1 MATRICES

1.1 INTRODUCTION:-
A matrix is defined as a rectangular array (or arrangement in rows and columns) of scalars subject to certain rules of operations.
If mn numbers (real or complex) or functions are arranged in m rows (horizontal lines) and n columns (vertical lines), then A is called an m × n matrix. Each of the mn numbers is called an element of the matrix.
A matrix is usually denoted by a single capital letter A, B or C, etc. An m × n matrix is usually written as
A = \begin{bmatrix} a_{11} & a_{12} & a_{13} & \dots & a_{1n} \\ a_{21} & a_{22} & a_{23} & \dots & a_{2n} \\ a_{31} & a_{32} & a_{33} & \dots & a_{3n} \\ \dots & \dots & \dots & \dots & \dots \\ a_{m1} & a_{m2} & a_{m3} & \dots & a_{mn} \end{bmatrix}
Thus, an m × n matrix A may be written as A = [a_{ij}], where i = 1, 2, 3, …, m and j = 1, 2, 3, …, n.
In algebra, matrices have their largest application in the theory of simultaneous equations and linear transformations.
E.g., the set of simultaneous equations
a_{11}x1 + a_{12}x2 + a_{13}x3 = b1
a_{21}x1 + a_{22}x2 + a_{23}x3 = b2
a_{31}x1 + a_{32}x2 + a_{33}x3 = b3
may be symbolically represented by the equation A X = B, where
A = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}, X = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}, B = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}.
1.2 TYPES OF MATRICES
(1) Real Matrix:-
A matrix is said to be real if all its elements are real numbers.
E.g., \begin{bmatrix} 5 & 3 & -1 \\ 0 & 2 & -7 \end{bmatrix} is a real matrix.
(2) Square Matrix:-
A matrix in which the number of rows is equal to the number of columns is called a square matrix; otherwise, it is said to be a rectangular matrix.
i.e., a matrix A = [a_{ij}]_{m×n} is a square matrix if m = n and a rectangular matrix if m ≠ n.
A square matrix having n rows and n columns is called an "n-rowed square matrix".
\begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} is a 3-rowed square matrix.
The elements a_{11}, a_{22}, a_{33} of a square matrix are called its diagonal elements, and the diagonal along these elements is called the principal or leading diagonal.
The sum of the diagonal elements of a square matrix is called its trace or spur.
Thus, the trace of the n-rowed square matrix A = [a_{ij}] is
a_{11} + a_{22} + a_{33} + … + a_{nn} = \sum_{i=1}^{n} a_{ii}.
(3) Row Matrix:-
A matrix having only one row and any number of columns, i.e., a matrix of order 1 × n, is called a row matrix.
Example:- [2 5 −3 0] is a row matrix.
(4) Column Matrix:-
A matrix having only one column and any number of rows, i.e., a matrix of order m × 1, is called a column matrix.
Example:- \begin{bmatrix} 2 \\ 0 \\ -1 \end{bmatrix} is a column matrix.
(5) Null Matrix:-
A matrix in which each element is zero is called a null matrix or void matrix or zero matrix. A null matrix of order m × n is denoted by O_{m×n}.
Example:- O_{2×4} = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}.
(6) Sub-matrix:-
A matrix obtained from a given matrix A by deleting some of its rows or columns or both is called a sub-matrix of A.
Example:- \begin{bmatrix} 3 & 0 \\ 1 & 4 \end{bmatrix} is a sub-matrix of \begin{bmatrix} 0 & 1 & 2 & 5 \\ 3 & 5 & 0 & -7 \\ 1 & 6 & 4 & -2 \end{bmatrix}.
(7) Diagonal Matrix:-
A square matrix in which all non-diagonal elements are zero is called a diagonal matrix,
i.e., A = [a_{ij}]_{n×n} is a diagonal matrix if a_{ij} = 0 for i ≠ j.
Example:- \begin{bmatrix} 2 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 0 \end{bmatrix} is a diagonal matrix.
(8) Scalar Matrix:-
A diagonal matrix in which all the diagonal elements are equal to a scalar, say k, is called a scalar matrix,
i.e., A = [a_{ij}]_{n×n} is a scalar matrix if a_{ij} = 0 when i ≠ j and a_{ij} = k when i = j.
Example:- \begin{bmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{bmatrix} is a scalar matrix.
(9) Unit Matrix or Identity Matrix:-
A scalar matrix in which each diagonal element is 1 is called a unit or identity matrix. It is denoted by I_n,
i.e., A = [a_{ij}]_{n×n} is a unit matrix if a_{ij} = 0 when i ≠ j and a_{ij} = 1 when i = j.
Example:- \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} is a unit matrix.
(10) Upper Triangular Matrix.
A square matrix in which all the elements below the principal diagonal are zero is called an upper triangular matrix,
i.e., A = [a_{ij}]_{n×n} is an upper triangular matrix if a_{ij} = 0 for i > j.
Example:- \begin{bmatrix} 2 & 3 & 4 \\ 0 & 1 & 5 \\ 0 & 0 & -3 \end{bmatrix} is an upper triangular matrix.
(11) Lower Triangular Matrix.
A square matrix in which all the elements above the principal diagonal are zero is called a lower triangular matrix,
i.e., A = [a_{ij}]_{n×n} is a lower triangular matrix if a_{ij} = 0 for i < j.
Example:- \begin{bmatrix} 1 & 0 & 0 \\ 5 & 6 & 0 \\ -3 & 2 & 0 \end{bmatrix} is a lower triangular matrix.
(12) Triangular Matrix:-
A triangular matrix is either upper
triangular or lower triangular.
(13) Single Element Matrix:-
A matrix having only one element is called a single element matrix, e.g., [3] is a single element matrix.
(14) Equal Matrices:-
Two matrices A and B are said to be equal iff they have the same order and their corresponding elements are equal,
i.e., if A = [a_{ij}]_{m×n} and B = [b_{ij}]_{p×q}, then A = B iff
a) m = p and n = q,
b) a_{ij} = b_{ij} for all i and j.
(15) Singular and Non-Singular Matrices:-
A square matrix A is said to be singular if |A| = 0 and non-singular if |A| ≠ 0.
Example:- A = \begin{bmatrix} 2 & -3 & 4 \\ 2 & -3 & 4 \\ 0 & 1 & -1 \end{bmatrix} is a singular matrix, since |A| = 0.
1.3 CHARACTERISTIC EQUATION
If A is any square matrix of order n, we can form the matrix A − λI, where I is the nth order unit matrix. The determinant of this matrix equated to zero, i.e.,
|A − λI| = \begin{vmatrix} a_{11}-λ & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22}-λ & \dots & a_{2n} \\ \dots & \dots & \dots & \dots \\ a_{n1} & a_{n2} & \dots & a_{nn}-λ \end{vmatrix} = 0
is called the characteristic equation of A.
On expanding the determinant, we get
(−1)^n λ^n + k_1 λ^{n-1} + k_2 λ^{n-2} + … + k_n = 0,
where the k's are expressible in terms of the elements a_{ij}.
The roots of this equation are called characteristic roots or latent roots or eigen values of the matrix A.
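As a quick numerical cross-check of these definitions (a sketch only, not part of the original notes, and assuming NumPy is available), the characteristic coefficients and the latent roots of a small matrix can be computed directly:

```python
import numpy as np

# An arbitrary 3x3 matrix chosen only to illustrate the definitions above.
A = np.array([[2.0, -1.0, 1.0],
              [-1.0, 2.0, -1.0],
              [1.0, -1.0, 2.0]])

coeffs = np.poly(A)              # coefficients of det(lambda*I - A): [1, -6, 9, -4]
latent_roots = np.roots(coeffs)  # characteristic roots / eigen values

print("characteristic coefficients:", coeffs)
print("latent roots:", np.sort(latent_roots))       # 1, 1, 4
print("check:", np.sort(np.linalg.eigvals(A)))      # same values
```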
1.4 EIGEN VECTORS
Consider the linear transformation Y = AX … (1), which transforms the column vector X into the column vector Y. We are often required to find those vectors X which transform into scalar multiples of themselves.
Let X be such a vector which transforms into λX under the transformation (1).
Then Y = λX … (2)
From (1) and (2), AX = λX ⇒ AX − λIX = 0 ⇒ (A − λI)X = 0 … (3)
This matrix equation gives n homogeneous linear equations
(a_{11} − λ)x1 + a_{12}x2 + … + a_{1n}xn = 0
a_{21}x1 + (a_{22} − λ)x2 + … + a_{2n}xn = 0
… … …
a_{n1}x1 + a_{n2}x2 + … + (a_{nn} − λ)xn = 0   … (4)
These equations will have a non-trivial solution only if the coefficient matrix A − λI is singular, i.e., if |A − λI| = 0 … (5).
Corresponding to each root λ of (5), the homogeneous system (3) has a non-zero solution
X = [x1, x2, …, xn]',
which is called an eigen vector or latent vector.
Properties of Eigen Values:-
1. The sum of the eigen values of a matrix is the sum of the elements of its principal diagonal.
2. The product of the eigen values of a matrix A is equal to its determinant.
3. If λ is an eigen value of a matrix A, then 1/λ is an eigen value of A^{-1}.
4. If λ is an eigen value of an orthogonal matrix, then 1/λ is also its eigen value.
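The first two properties can be checked numerically in a few lines (a sketch, assuming NumPy; the matrix is arbitrary):

```python
import numpy as np

# Property check: sum of eigen values = trace, product of eigen values = |A|.
A = np.array([[-2.0, 2.0, -3.0],
              [2.0, 1.0, -6.0],
              [-1.0, -2.0, 0.0]])
lam = np.linalg.eigvals(A)

print(np.isclose(lam.sum(), np.trace(A)))         # True
print(np.isclose(lam.prod(), np.linalg.det(A)))   # True
```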
PROPERTY 1:- If λ1, λ2, …, λn are the eigen values of A, then
i. kλ1, kλ2, …, kλn are the eigen values of the matrix kA, where k is a non-zero scalar.
ii. 1/λ1, 1/λ2, …, 1/λn are the eigen values of the inverse matrix A^{-1}.
iii. λ1^p, λ2^p, …, λn^p are the eigen values of A^p, where p is any positive integer.
Proof:-
i. Let λr be an eigen value of A and Xr the corresponding eigen vector.
Then, by definition, AXr = λrXr.
Multiplying both sides by k, (kA)Xr = (kλr)Xr.
Thus kλr is an eigen value of kA and the corresponding eigen vector is the same as that of λr, namely Xr.
ii. Pre-multiplying both sides of AXr = λrXr by A^{-1},
A^{-1}(AXr) = A^{-1}(λrXr)
⇒ Xr = λr(A^{-1}Xr)
⇒ A^{-1}Xr = (1/λr)Xr.
Hence 1/λr is an eigen value of A^{-1} and the corresponding eigen vector is the same as that of λr, namely Xr.
iii. Pre-multiplying both sides of AXr = λrXr by A,
A(AXr) = A(λrXr)
⇒ A²Xr = λr(AXr) = λr(λrXr) = λr²Xr.
Similarly, we can prove that A³Xr = λr³Xr, …, A^pXr = λr^pXr, where p is any positive integer. Hence λr^p is an eigen value of A^p and the corresponding eigen vector is the same as that of λr, namely Xr.
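Property 1 can also be verified numerically (a sketch, assuming NumPy; k = 3 and p = 2 are arbitrary choices):

```python
import numpy as np

A = np.array([[2.0, 7.0, 5.0],
              [0.0, -1.0, 2.0],
              [0.0, 0.0, 4.0]])
lam = np.linalg.eigvals(A)          # eigen values of A: 2, -1, 4
k, p = 3.0, 2

print(np.sort(np.linalg.eigvals(k * A)), np.sort(k * lam))                # (i)  kA
print(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1.0 / lam))   # (ii) A^-1
print(np.sort(np.linalg.eigvals(np.linalg.matrix_power(A, p))),
      np.sort(lam ** p))                                                  # (iii) A^p
```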
THEOREM :-
A matrix A is singular if and only if 0 is an eigen
value of A.
1.5 PROBLEMS
1. Find the sum and product of the eigen values of the matrix
A = \begin{bmatrix} -2 & 2 & -3 \\ 2 & 1 & -6 \\ -1 & -2 & 0 \end{bmatrix}
without finding the eigen values.
Solution:-
Sum of the eigen values of A = sum of its diagonal elements = −2 + 1 + 0 = −1.
Product of the eigen values of A = |A| = \begin{vmatrix} -2 & 2 & -3 \\ 2 & 1 & -6 \\ -1 & -2 & 0 \end{vmatrix} = 45.
2. Two eigen values of the matrix
A = \begin{bmatrix} 2 & 2 & 1 \\ 1 & 3 & 1 \\ 1 & 2 & 2 \end{bmatrix}
are equal to 1 each. Find the third eigen value.
Solution:- Let a be the third eigen value of A.
Since the sum of the eigen values = the sum of the diagonal elements,
1 + 1 + a = 2 + 3 + 2 ⇒ a = 5.
Therefore, the third eigen value of A is 5.
3. The product of two eigen values of the matrix
A = \begin{bmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{bmatrix}
is 16. Find the third eigen value.
Solution:- Let a be the third eigen value of A.
Since the product of the eigen values = |A|,
16a = \begin{vmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{vmatrix} = 32.
Therefore, a = 2.
4. Find the sum of the eigen values of the inverse of
A = \begin{bmatrix} 1 & 0 & 0 \\ 2 & -3 & 0 \\ 0 & 5 & 2 \end{bmatrix}
Solution:-
The eigen values of the lower triangular matrix A are 1, −3, 2. Then the eigen values of A^{-1} are 1, −1/3, 1/2.
Sum of the eigen values of A^{-1} = 1 − 1/3 + 1/2 = 7/6.
5. If A = \begin{bmatrix} 2 & 7 & 5 \\ 0 & -1 & 2 \\ 0 & 0 & 4 \end{bmatrix}, find the eigen values of 3A, A^{-1} and −2A^{-1}.
Solution:-
The eigen values of A are 2, −1, 4.
The eigen values of 3A are 3×2, 3×(−1), 3×4, i.e., 6, −3, 12.
The eigen values of A^{-1} are 1/2, 1/(−1), 1/4, i.e., 1/2, −1, 1/4.
The eigen values of −2A^{-1} are (−2)(1/2), (−2)(−1), (−2)(1/4), i.e., −1, 2, −1/2.
6. Find the eigen values and eigen vectors of the matrix
A = \begin{bmatrix} 1 & -2 \\ -5 & 4 \end{bmatrix}
Solution:- The characteristic equation of the given matrix is |A − λI| = 0,
or \begin{vmatrix} 1-λ & -2 \\ -5 & 4-λ \end{vmatrix} = 0 ⇒ λ² − 5λ − 6 = 0 ⇒ λ = 6, −1.
Thus, the eigen values of A are 6, −1.
Corresponding to λ = 6, the eigen vectors are given by (A − 6I)X1 = 0,
or \begin{bmatrix} 1-6 & -2 \\ -5 & 4-6 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = 0, or \begin{bmatrix} -5 & -2 \\ -5 & -2 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = 0.
We get only one independent equation −5x1 − 2x2 = 0
⇒ x1/2 = x2/(−5) = k1 (say)
⇒ x11 = 2k1, x12 = −5k1.
∴ The eigen vectors are X1 = k1\begin{bmatrix} 2 \\ -5 \end{bmatrix}.
Corresponding to λ = −1, the eigen vectors are given by (A + I)X2 = 0
⇒ \begin{bmatrix} 2 & -2 \\ -5 & 5 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}
⇒ x1 − x2 = 0 ⇒ x1/1 = x2/1 = k2 (say)
⇒ x21 = k2, x22 = k2.
∴ The eigen vectors are X2 = k2\begin{bmatrix} 1 \\ 1 \end{bmatrix}.
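The same computation can be reproduced with NumPy (a sketch; np.linalg.eig returns unit eigen vectors, so each column is simply a scalar multiple of the hand-computed X1 or X2):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [-5.0, 4.0]])
values, vectors = np.linalg.eig(A)     # columns of `vectors` are eigen vectors

for lam, v in zip(values, vectors.T):
    # Scaling by the first component shows the direction: (2, -5) for lambda = 6
    # and (1, 1) for lambda = -1, up to the arbitrary constant k.
    print(lam, v / v[0])
```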
7. Find the eigen values and eigen vectors of the matrix
A = \begin{bmatrix} -2 & 2 & -3 \\ 2 & 1 & -6 \\ -1 & -2 & 0 \end{bmatrix}
Solution:- The characteristic equation of the given matrix is |A − λI| = 0,
i.e., \begin{vmatrix} -2-λ & 2 & -3 \\ 2 & 1-λ & -6 \\ -1 & -2 & -λ \end{vmatrix} = 0.
On expansion, λ³ + λ² − 21λ − 45 = 0.
By trial, λ = −3 satisfies it.
∴ (λ + 3)(λ² − 2λ − 15) = 0 ⇒ (λ + 3)(λ + 3)(λ − 5) = 0 ⇒ λ = −3, −3, 5.
Thus, the eigen values of A are −3, −3, 5.
Corresponding to λ = −3, the eigen vectors are given by (A + 3I)X = 0,
or \begin{bmatrix} 1 & 2 & -3 \\ 2 & 4 & -6 \\ -1 & -2 & 3 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = 0.
We get only one independent equation x1 + 2x2 − 3x3 = 0.
Let x2 = k1 and x3 = k2; then x1 = 3k2 − 2k1.
∴ The eigen vectors are given by X1 = k1\begin{bmatrix} -2 \\ 1 \\ 0 \end{bmatrix} + k2\begin{bmatrix} 3 \\ 0 \\ 1 \end{bmatrix}.
Corresponding to λ = 5, the eigen vectors are given by (A − 5I)X2 = 0
⇒ −7x1 + 2x2 − 3x3 = 0, 2x1 − 4x2 − 6x3 = 0, −x1 − 2x2 − 5x3 = 0.
From the first two equations, x1/(−24) = x2/(−48) = x3/24, i.e., x1/1 = x2/2 = x3/(−1) = k3 (say).
Hence the eigen vectors are given by X2 = k3\begin{bmatrix} 1 \\ 2 \\ -1 \end{bmatrix}.
8. Find the eigen values and eigen vectors of the matrix
A = \begin{bmatrix} 8 & -6 & 2 \\ -6 & 7 & -4 \\ 2 & -4 & 3 \end{bmatrix}
Solution:- The characteristic equation is |A − λI| = 0
⇒ \begin{vmatrix} 8-λ & -6 & 2 \\ -6 & 7-λ & -4 \\ 2 & -4 & 3-λ \end{vmatrix} = 0
⇒ λ³ − 18λ² + 45λ = 0 ⇒ λ = 0, 3, 15.
Hence the eigen values are 0, 3, 15.
The eigen vector X1 corresponding to λ1 = 0 is given by (A − λ1I)X1 = 0, where X1 = [x1, x2, x3]'
⇒ 8x1 − 6x2 + 2x3 = 0 … (1)
−6x1 + 7x2 − 4x3 = 0 … (2)
2x1 − 4x2 + 3x3 = 0 … (3)
Solving equations (1) and (2) by cross-multiplication, we get
x1/10 = x2/20 = x3/20, i.e., x1/1 = x2/2 = x3/2 = k1 (say), where k1 ≠ 0,
so x1 = k1, x2 = 2k1, x3 = 2k1, which satisfy equation (3) also.
∴ The required eigen vector corresponding to λ1 = 0 is X1 = k1\begin{bmatrix} 1 \\ 2 \\ 2 \end{bmatrix}.
The eigen vector X2 corresponding to λ2 = 3 is given by (A − λ2I)X2 = 0, where X2 = [x1, x2, x3]'
⇒ 5x1 − 6x2 + 2x3 = 0 … (4)
−6x1 + 4x2 − 4x3 = 0 … (5)
2x1 − 4x2 = 0 … (6)
Solving equations (4) and (5) by cross-multiplication, we get
x1/16 = x2/8 = x3/(−16), i.e., x1/2 = x2/1 = x3/(−2) = k2 (say), where k2 ≠ 0,
so x1 = 2k2, x2 = k2, x3 = −2k2, which satisfy equation (6) also.
∴ The required eigen vector corresponding to λ2 = 3 is X2 = k2\begin{bmatrix} 2 \\ 1 \\ -2 \end{bmatrix}.
The eigen vector X3 corresponding to λ3 = 15 is given by (A − λ3I)X3 = 0, where X3 = [x1, x2, x3]'
⇒ −7x1 − 6x2 + 2x3 = 0 … (7)
−6x1 − 8x2 − 4x3 = 0 … (8)
2x1 − 4x2 − 12x3 = 0 … (9)
Solving equations (7) and (8) by cross-multiplication, we get
x1/40 = x2/(−40) = x3/20, i.e., x1/2 = x2/(−2) = x3/1 = k3 (say), where k3 ≠ 0,
so x1 = 2k3, x2 = −2k3, x3 = k3, which satisfy equation (9) also.
∴ The required eigen vector corresponding to λ3 = 15 is X3 = k3\begin{bmatrix} 2 \\ -2 \\ 1 \end{bmatrix}.
9. Find the eigen values and eigen vectors of the matrix
A = \begin{bmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{bmatrix}
Solution:- The characteristic equation is |A − λI| = 0
⇒ \begin{vmatrix} 6-λ & -2 & 2 \\ -2 & 3-λ & -1 \\ 2 & -1 & 3-λ \end{vmatrix} = 0
⇒ λ³ − 12λ² + 36λ − 32 = 0 ⇒ λ = 2, 2, 8.
Hence the eigen values are 2, 2, 8.
The eigen vector corresponding to λ1 = λ2 = 2 is given by (A − 2I)X1 = 0, where X1 = [x1, x2, x3]'
⇒ 4x1 − 2x2 + 2x3 = 0, −2x1 + x2 − x3 = 0, 2x1 − x2 + x3 = 0.
These equations are equivalent to the single equation 2x1 − x2 + x3 = 0 … (1)
Let x3 = 2k3 and x2 = 2k2; then from (1), 2x1 − 2k2 + 2k3 = 0 ⇒ x1 = k2 − k3.
∴ The required eigen vector corresponding to λ1 = λ2 = 2 is
X1 = \begin{bmatrix} k_2 - k_3 \\ 2k_2 \\ 2k_3 \end{bmatrix} = k2\begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix} + k3\begin{bmatrix} -1 \\ 0 \\ 2 \end{bmatrix}.
Similarly, the eigen vector corresponding to λ3 = 8 is given by (A − 8I)X3 = 0
⇒ −2x1 − 2x2 + 2x3 = 0 … (2)
−2x1 − 5x2 − x3 = 0 … (3)
2x1 − x2 − 5x3 = 0 … (4)
Solving equations (2) and (3) by cross-multiplication, we get
x1/12 = x2/(−6) = x3/6, i.e., x1/2 = x2/(−1) = x3/1 = k1 (say),
so x1 = 2k1, x2 = −k1, x3 = k1, which satisfy equation (4) also.
∴ The required eigen vector corresponding to λ3 = 8 is X3 = k1\begin{bmatrix} 2 \\ -1 \\ 1 \end{bmatrix}.
Example 1:-
Show that if λ1, λ2, …, λn are the latent roots of the matrix A, then A³ has the latent roots λ1³, λ2³, …, λn³.
Solution:- Let λ be a latent root of the matrix A. Then there exists a non-zero vector X such that
AX = λX … (1)
⇒ A²(AX) = A²(λX)
⇒ A³X = λ(A²X) [using (1)]
But A²X = A(AX) = A(λX) = λ(AX) = λ(λX) = λ²X.
Therefore, A³X = λ(λ²X) = λ³X
⇒ λ³ is a latent root of A³.
Therefore, if λ1, λ2, …, λn are the latent roots of the matrix A, then λ1³, λ2³, …, λn³ are the latent roots of A³.
Example 2:-
If λ1, λ2, …, λn are eigen values of A, then find the eigen values of the matrix (A − λI)².
Solution:- (A − λI)² = A² − 2λAI + λ²I² = A² − 2λA + λ²I.
Eigen values of A² are λ1², λ2², …, λn².
Eigen values of 2λA are 2λλ1, 2λλ2, …, 2λλn.
Eigen values of λ²I are λ².
∴ Eigen values of (A − λI)² are λ1² − 2λλ1 + λ², λ2² − 2λλ2 + λ², …, λn² − 2λλn + λ²,
or (λ1 − λ)², (λ2 − λ)², …, (λn − λ)².
Example 3:-
Find the eigen values and eigen vectors of the matrix
A = \begin{bmatrix} 3 & 1 & 4 \\ 0 & 2 & 6 \\ 0 & 0 & 5 \end{bmatrix}
Solution:- The characteristic equation is |A − λI| = 0
⇒ \begin{vmatrix} 3-λ & 1 & 4 \\ 0 & 2-λ & 6 \\ 0 & 0 & 5-λ \end{vmatrix} = 0
⇒ (3 − λ)(2 − λ)(5 − λ) = 0 ⇒ λ = 3, 2, 5.
Hence the eigen values are 3, 2, 5.
The eigen vector X1 corresponding to λ1 = 3 is given by (A − 3I)X1 = 0, where X1 = [x1, x2, x3]'
⇒ \begin{bmatrix} 0 & 1 & 4 \\ 0 & -1 & 6 \\ 0 & 0 & 2 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
⇒ x2 + 4x3 = 0, −x2 + 6x3 = 0, 2x3 = 0 ⇒ x3 = 0, x2 = 0.
∴ The characteristic vector corresponding to λ1 = 3 is X1 = \begin{bmatrix} k_1 \\ 0 \\ 0 \end{bmatrix} = k1\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, where k1 ≠ 0.
When λ2 = 2, let X2 be the eigen vector; then (A − 2I)X2 = 0, where X2 = [x1, x2, x3]'
⇒ \begin{bmatrix} 1 & 1 & 4 \\ 0 & 0 & 6 \\ 0 & 0 & 3 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
⇒ x1 + x2 + 4x3 = 0, 6x3 = 0, 3x3 = 0 ⇒ x3 = 0 and x1 + x2 = 0, i.e., x1 = −x2.
∴ x1/1 = x2/(−1) = k2 (say), where k2 ≠ 0, so x1 = k2, x2 = −k2, x3 = 0.
∴ The required eigen vector is X2 = k2\begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix}.
When λ3 = 5, let X3 be the eigen vector; then (A − 5I)X3 = 0, where X3 = [x1, x2, x3]'
⇒ \begin{bmatrix} -2 & 1 & 4 \\ 0 & -3 & 6 \\ 0 & 0 & 0 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
⇒ −2x1 + x2 + 4x3 = 0, −3x2 + 6x3 = 0.
Solving the above equations by cross-multiplication, we get
x1/18 = x2/12 = x3/6, i.e., x1/3 = x2/2 = x3/1 = k3 (say), where k3 ≠ 0
⇒ x1 = 3k3, x2 = 2k3, x3 = k3.
∴ The required eigen vector is X3 = k3\begin{bmatrix} 3 \\ 2 \\ 1 \end{bmatrix}.
1.6 CAYLEY HAMILTON THEOREM
Every square matrix satisfies its own characteristic equation.
Let A = [a_{ij}]_{n×n} be a square matrix; then
A = \begin{bmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ \dots & \dots & \dots & \dots \\ a_{n1} & a_{n2} & \dots & a_{nn} \end{bmatrix}
Let the characteristic polynomial of A be φ(λ). Then
φ(λ) = |A − λI| = \begin{vmatrix} a_{11}-λ & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22}-λ & \dots & a_{2n} \\ \dots & \dots & \dots & \dots \\ a_{n1} & a_{n2} & \dots & a_{nn}-λ \end{vmatrix}
The characteristic equation is |A − λI| = 0, say
p_0 λ^n + p_1 λ^{n-1} + p_2 λ^{n-2} + … + p_n = 0.
We are to prove that
p_0 A^n + p_1 A^{n-1} + p_2 A^{n-2} + … + p_n I = 0 … (1)
Note 1:- Pre-multiplying equation (1) by A^{-1}, we have
0 = p_0 A^{n-1} + p_1 A^{n-2} + p_2 A^{n-3} + … + p_{n-1} I + p_n A^{-1}
⇒ A^{-1} = −(1/p_n)[p_0 A^{n-1} + p_1 A^{n-2} + p_2 A^{n-3} + … + p_{n-1} I]
This result gives the inverse of A in terms of the (n−1) powers of A and is considered a practical method for the computation of the inverse of large matrices.
Note 2:- If m is a positive integer such that m > n, then any positive integral power A^m of A is linearly expressible in terms of those of lower degree.
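Note 1 translates directly into a short routine (a sketch, assuming NumPy; the helper name inverse_from_characteristic is ours, not from the notes):

```python
import numpy as np

def inverse_from_characteristic(A):
    # Note 1: A^-1 = -(1/p_n)(p_0 A^{n-1} + p_1 A^{n-2} + ... + p_{n-1} I)
    p = np.poly(A)                       # characteristic coefficients, p[0] = 1
    n = A.shape[0]
    acc = np.zeros((n, n))
    for coeff in p[:-1]:                 # Horner scheme: builds p_0 A^{n-1} + ... + p_{n-1} I
        acc = acc @ A + coeff * np.eye(n)
    return -acc / p[-1]

A = np.array([[2.0, -1.0, 1.0],
              [-1.0, 2.0, -1.0],
              [1.0, -1.0, 2.0]])
print(np.allclose(inverse_from_characteristic(A), np.linalg.inv(A)))   # True
```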
Example 1:-
Verify Cayley–Hamilton theorem for the matrix
A = \begin{bmatrix} 2 & -1 & 1 \\ -1 & 2 & -1 \\ 1 & -1 & 2 \end{bmatrix}. Hence compute A^{-1}.
Solution:- The characteristic equation of A is |A − λI| = 0,
i.e., \begin{vmatrix} 2-λ & -1 & 1 \\ -1 & 2-λ & -1 \\ 1 & -1 & 2-λ \end{vmatrix} = 0,
or λ³ − 6λ² + 9λ − 4 = 0 (on simplification).
To verify Cayley–Hamilton theorem, we have to show that A³ − 6A² + 9A − 4I = 0 … (1)
Now,
A² = A × A = \begin{bmatrix} 2 & -1 & 1 \\ -1 & 2 & -1 \\ 1 & -1 & 2 \end{bmatrix}\begin{bmatrix} 2 & -1 & 1 \\ -1 & 2 & -1 \\ 1 & -1 & 2 \end{bmatrix} = \begin{bmatrix} 6 & -5 & 5 \\ -5 & 6 & -5 \\ 5 & -5 & 6 \end{bmatrix},
A³ = A² × A = \begin{bmatrix} 22 & -21 & 21 \\ -21 & 22 & -21 \\ 21 & -21 & 22 \end{bmatrix}.
∴ A³ − 6A² + 9A − 4I
= \begin{bmatrix} 22 & -21 & 21 \\ -21 & 22 & -21 \\ 21 & -21 & 22 \end{bmatrix} − 6\begin{bmatrix} 6 & -5 & 5 \\ -5 & 6 & -5 \\ 5 & -5 & 6 \end{bmatrix} + 9\begin{bmatrix} 2 & -1 & 1 \\ -1 & 2 & -1 \\ 1 & -1 & 2 \end{bmatrix} − 4\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}
= \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} = 0.
This verifies Cayley–Hamilton theorem.
Now, pre-multiplying both sides of (1) by A^{-1}, we have
A² − 6A + 9I − 4A^{-1} = 0
⇒ 4A^{-1} = A² − 6A + 9I
= \begin{bmatrix} 6 & -5 & 5 \\ -5 & 6 & -5 \\ 5 & -5 & 6 \end{bmatrix} − 6\begin{bmatrix} 2 & -1 & 1 \\ -1 & 2 & -1 \\ 1 & -1 & 2 \end{bmatrix} + 9\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 3 & 1 & -1 \\ 1 & 3 & 1 \\ -1 & 1 & 3 \end{bmatrix}
∴ A^{-1} = \frac{1}{4}\begin{bmatrix} 3 & 1 & -1 \\ 1 & 3 & 1 \\ -1 & 1 & 3 \end{bmatrix}.
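A short numerical confirmation of this example (a sketch, assuming NumPy):

```python
import numpy as np

A = np.array([[2.0, -1.0, 1.0],
              [-1.0, 2.0, -1.0],
              [1.0, -1.0, 2.0]])
I = np.eye(3)
A2, A3 = A @ A, A @ A @ A

print(np.allclose(A3 - 6 * A2 + 9 * A - 4 * I, 0))               # Cayley-Hamilton holds
print(np.allclose((A2 - 6 * A + 9 * I) / 4, np.linalg.inv(A)))   # the inverse found above
```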
Example 2:-
Given A = \begin{bmatrix} 1 & 2 & -1 \\ 0 & 1 & -1 \\ 3 & -1 & 1 \end{bmatrix}, find Adj A by using Cayley–Hamilton theorem.
Solution:- The characteristic equation of the given matrix A is |A − λI| = 0,
i.e., \begin{vmatrix} 1-λ & 2 & -1 \\ 0 & 1-λ & -1 \\ 3 & -1 & 1-λ \end{vmatrix} = 0,
or λ³ − 3λ² + 5λ + 3 = 0 (on simplification).
By Cayley–Hamilton theorem, A should satisfy
A³ − 3A² + 5A + 3I = 0.
Pre-multiplying by A^{-1}, we get A² − 3A + 5I + 3A^{-1} = 0
⇒ A^{-1} = −(1/3)(A² − 3A + 5I) … (1)
Now, A² = A·A = \begin{bmatrix} 1 & 2 & -1 \\ 0 & 1 & -1 \\ 3 & -1 & 1 \end{bmatrix}\begin{bmatrix} 1 & 2 & -1 \\ 0 & 1 & -1 \\ 3 & -1 & 1 \end{bmatrix} = \begin{bmatrix} -2 & 5 & -4 \\ -3 & 2 & -2 \\ 6 & 4 & -1 \end{bmatrix}, and 3A = \begin{bmatrix} 3 & 6 & -3 \\ 0 & 3 & -3 \\ 9 & -3 & 3 \end{bmatrix}.
From (1), A^{-1} = −\frac{1}{3}\left(\begin{bmatrix} -2 & 5 & -4 \\ -3 & 2 & -2 \\ 6 & 4 & -1 \end{bmatrix} − \begin{bmatrix} 3 & 6 & -3 \\ 0 & 3 & -3 \\ 9 & -3 & 3 \end{bmatrix} + \begin{bmatrix} 5 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 5 \end{bmatrix}\right) = −\frac{1}{3}\begin{bmatrix} 0 & -1 & -1 \\ -3 & 4 & 1 \\ -3 & 7 & 1 \end{bmatrix}.
We know that A^{-1} = \frac{Adj\,A}{|A|}, so Adj A = |A|·A^{-1}.
Now, |A| = \begin{vmatrix} 1 & 2 & -1 \\ 0 & 1 & -1 \\ 3 & -1 & 1 \end{vmatrix} = −3.
∴ Adj A = (−3)·A^{-1} = \begin{bmatrix} 0 & -1 & -1 \\ -3 & 4 & 1 \\ -3 & 7 & 1 \end{bmatrix}.
1.7 DIAGONALISATION OF A MATRIX
Diagonalisation of a matrix A is the process of reducing A to a diagonal form.
If A is related to D by a similarity transformation such that D = M^{-1}AM, then A is reduced to the diagonal matrix D through the modal matrix M. D is also called the spectral matrix of A.
1.8 REDUCTION OF A MATRIX TO DIAGONAL FORM
If a square matrix A of order n has n linearly independent eigen vectors, then a matrix B can be found such that B^{-1}AB is a diagonal matrix.
Note:- The matrix B which diagonalises A is called the modal matrix of A and is obtained by grouping the eigen vectors of A into a square matrix.
Similarity of matrices:-
A square matrix B of order n is said to be similar to a square matrix A of order n if B = M^{-1}AM for some non-singular matrix M.
This transformation of a matrix A by a non-singular matrix M to B is called a similarity transformation.
Note:- If the matrix B is similar to the matrix A, then B has the same eigen values as A.
Example:-
Reduce the matrix A = \begin{bmatrix} 1 & -1 & 2 \\ 0 & 2 & -1 \\ 0 & 0 & 3 \end{bmatrix} to diagonal form by a similarity transformation. Hence find A³.
Solution:- The characteristic equation is
\begin{vmatrix} 1-λ & -1 & 2 \\ 0 & 2-λ & -1 \\ 0 & 0 & 3-λ \end{vmatrix} = 0 ⇒ λ = 1, 2, 3.
Hence the eigen values of A are 1, 2, 3.
Corresponding to λ = 1, let X1 = [x1, x2, x3]' be the eigen vector; then
(A − I)X1 = 0 ⇒ \begin{bmatrix} 0 & -1 & 2 \\ 0 & 1 & -1 \\ 0 & 0 & 2 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
⇒ −x2 + 2x3 = 0, x2 − x3 = 0, 2x3 = 0
∴ x1 = k1, x2 = 0, x3 = 0, so X1 = k1\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.
Corresponding to λ = 2, let X2 = [x1, x2, x3]' be the eigen vector; then
(A − 2I)X2 = 0 ⇒ \begin{bmatrix} -1 & -1 & 2 \\ 0 & 0 & -1 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
⇒ −x1 − x2 + 2x3 = 0, −x3 = 0, x3 = 0
∴ x1 = k2, x2 = −k2, x3 = 0, so X2 = k2\begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix}.
Corresponding to λ = 3, let X3 = [x1, x2, x3]' be the eigen vector; then
(A − 3I)X3 = 0 ⇒ \begin{bmatrix} -2 & -1 & 2 \\ 0 & -1 & -1 \\ 0 & 0 & 0 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
⇒ −2x1 − x2 + 2x3 = 0, −x2 − x3 = 0.
Taking x3 = 2k3: x2 = −2k3, x1 = 3k3, so X3 = k3\begin{bmatrix} 3 \\ -2 \\ 2 \end{bmatrix}.
Hence the modal matrix is M = \begin{bmatrix} 1 & 1 & 3 \\ 0 & -1 & -2 \\ 0 & 0 & 2 \end{bmatrix},
|M| = −2, Adj M = \begin{bmatrix} -2 & -2 & 1 \\ 0 & 2 & 2 \\ 0 & 0 & -1 \end{bmatrix},
∴ M^{-1} = \frac{Adj\,M}{|M|} = \begin{bmatrix} 1 & 1 & -1/2 \\ 0 & -1 & -1 \\ 0 & 0 & 1/2 \end{bmatrix}.
M^{-1}AM = \begin{bmatrix} 1 & 1 & -1/2 \\ 0 & -1 & -1 \\ 0 & 0 & 1/2 \end{bmatrix}\begin{bmatrix} 1 & -1 & 2 \\ 0 & 2 & -1 \\ 0 & 0 & 3 \end{bmatrix}\begin{bmatrix} 1 & 1 & 3 \\ 0 & -1 & -2 \\ 0 & 0 & 2 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{bmatrix} = D.
Now, since D = M^{-1}AM ⇒ A = MDM^{-1},
A² = (MDM^{-1})(MDM^{-1}) = MD²M^{-1} [since M^{-1}M = I].
Similarly, A³ = MD³M^{-1} = \begin{bmatrix} 1 & 1 & 3 \\ 0 & -1 & -2 \\ 0 & 0 & 2 \end{bmatrix}\begin{bmatrix} 1 & 0 & 0 \\ 0 & 8 & 0 \\ 0 & 0 & 27 \end{bmatrix}\begin{bmatrix} 1 & 1 & -1/2 \\ 0 & -1 & -1 \\ 0 & 0 & 1/2 \end{bmatrix}
∴ A³ = \begin{bmatrix} 1 & -7 & 32 \\ 0 & 8 & -19 \\ 0 & 0 & 27 \end{bmatrix}.
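The same diagonalisation can be reproduced numerically (a sketch, assuming NumPy; np.linalg.eig returns a modal matrix whose columns are normalised eigen vectors, which differs from M above only by column scaling):

```python
import numpy as np

A = np.array([[1.0, -1.0, 2.0],
              [0.0, 2.0, -1.0],
              [0.0, 0.0, 3.0]])
values, M = np.linalg.eig(A)
M_inv = np.linalg.inv(M)

print(np.round(M_inv @ A @ M, 10))                      # diagonal matrix of eigen values 1, 2, 3
print(np.round(M @ np.diag(values ** 3) @ M_inv, 10))   # M D^3 M^-1
print(np.linalg.matrix_power(A, 3))                     # equals A^3 computed directly
```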
1.9 ORTHOGONAL TRANSFORMATION OF A SYMMETRIC MATRIX TO DIAGONAL FORM
A square matrix A with real elements is said to be orthogonal if AA' = I = A'A.
Since AA^{-1} = I = A^{-1}A, it follows that A is orthogonal if A' = A^{-1}.
Diagonalisation by orthogonal transformation is possible only for a real symmetric matrix.
If A is a real symmetric matrix, then the eigen vectors of A will be not only linearly independent but also pairwise orthogonal.
If we normalise each eigen vector and use them to form the normalised modal matrix N, then it can be proved that N is an orthogonal matrix.
The similarity transformation M^{-1}AM = D takes the form N'AN = D, since N^{-1} = N' by a property of orthogonal matrices.
Transforming A into D by means of the transformation N'AN = D is called orthogonal reduction or orthogonal transformation.
Note:- To normalise an eigen vector Xr, divide each element of Xr by the square root of the sum of the squares of all the elements of Xr.
Example:-
Diagonalise the matrix A = \begin{bmatrix} 2 & 0 & 4 \\ 0 & 6 & 0 \\ 4 & 0 & 2 \end{bmatrix} by means of an orthogonal transformation.
Solution:- The characteristic equation of A is
\begin{vmatrix} 2-λ & 0 & 4 \\ 0 & 6-λ & 0 \\ 4 & 0 & 2-λ \end{vmatrix} = 0 ⇒ (2−λ)(6−λ)(2−λ) − 16(6−λ) = 0 ⇒ λ = −2, 6, 6.
When λ = −2, let X1 = [x1, x2, x3]' be the eigen vector; then (A + 2I)X1 = 0
⇒ \begin{bmatrix} 4 & 0 & 4 \\ 0 & 8 & 0 \\ 4 & 0 & 4 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
⇒ 4x1 + 4x3 = 0 …(1), 8x2 = 0 …(2), 4x1 + 4x3 = 0 …(3)
∴ x1 = k1, x2 = 0, x3 = −k1, i.e., X1 = \begin{bmatrix} 1 \\ 0 \\ -1 \end{bmatrix}.
When λ = 6, let X2 = [x1, x2, x3]' be the eigen vector; then (A − 6I)X2 = 0
⇒ \begin{bmatrix} -4 & 0 & 4 \\ 0 & 0 & 0 \\ 4 & 0 & -4 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
⇒ −4x1 + 4x3 = 0, 4x1 − 4x3 = 0 ⇒ x1 = x3 and x2 is arbitrary.
X2 must be so chosen that X2 and X3 are orthogonal among themselves and also each is orthogonal with X1.
∴ Let X2 = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} and let X3 = \begin{bmatrix} α \\ β \\ γ \end{bmatrix}.
Since X3 is orthogonal to X1: α − γ = 0 …(4); X3 is orthogonal to X2: α + γ = 0 …(5).
Solving (4) and (5), we get α = γ = 0 and β is arbitrary.
Taking β = 1, X3 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}.
∴ The modal matrix is M = \begin{bmatrix} 1 & 1 & 0 \\ 0 & 0 & 1 \\ -1 & 1 & 0 \end{bmatrix}.
The normalised modal matrix is
N = \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} & 0 \\ 0 & 0 & 1 \\ -1/\sqrt{2} & 1/\sqrt{2} & 0 \end{bmatrix}
D = N'AN = \begin{bmatrix} 1/\sqrt{2} & 0 & -1/\sqrt{2} \\ 1/\sqrt{2} & 0 & 1/\sqrt{2} \\ 0 & 1 & 0 \end{bmatrix}\begin{bmatrix} 2 & 0 & 4 \\ 0 & 6 & 0 \\ 4 & 0 & 2 \end{bmatrix}\begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} & 0 \\ 0 & 0 & 1 \\ -1/\sqrt{2} & 1/\sqrt{2} & 0 \end{bmatrix}
∴ D = \begin{bmatrix} -2 & 0 & 0 \\ 0 & 6 & 0 \\ 0 & 0 & 6 \end{bmatrix}, which is the required diagonal matrix.
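For a real symmetric matrix, np.linalg.eigh produces an orthogonal normalised modal matrix directly, so the reduction above can be checked in a few lines (a sketch, assuming NumPy):

```python
import numpy as np

A = np.array([[2.0, 0.0, 4.0],
              [0.0, 6.0, 0.0],
              [4.0, 0.0, 2.0]])
values, N = np.linalg.eigh(A)        # eigen values in ascending order: -2, 6, 6

print(np.allclose(N.T @ N, np.eye(3)))    # N is orthogonal: N'N = I
print(np.round(N.T @ A @ N, 10))          # N'AN = diag(-2, 6, 6)
```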
1.10 QUADRATIC FORMS
DEFINITION:-
A homogeneous polynomial of second degree in any number of variables is called a quadratic form.
For example,
ax² + 2hxy + by²,
ax² + by² + cz² + 2hxy + 2gyz + 2fzx, and
ax² + by² + cz² + dw² + 2hxy + 2gyz + 2fzx + 2lxw + 2myw + 2nzw
are quadratic forms in two, three and four variables respectively.
In n variables x1, x2, …, xn, the general quadratic form is
\sum_{j=1}^{n}\sum_{i=1}^{n} b_{ij} x_i x_j, where b_{ij} ≠ b_{ji} in general.
In the expansion, the co-efficient of x_i x_j is (b_{ij} + b_{ji}).
Suppose 2a_{ij} = b_{ij} + b_{ji}, where a_{ij} = a_{ji} and a_{ii} = b_{ii}; then
\sum_{j=1}^{n}\sum_{i=1}^{n} b_{ij} x_i x_j = \sum_{j=1}^{n}\sum_{i=1}^{n} a_{ij} x_i x_j, where a_{ij} = \frac{1}{2}(b_{ij} + b_{ji}).
Hence every quadratic form can be written as
\sum_{j=1}^{n}\sum_{i=1}^{n} a_{ij} x_i x_j = X'AX,
so that the matrix A is always symmetric, where A = [a_{ij}] and X = [x1, x2, …, xn]'.
Now, writing the above said examples of quadratic forms in matrix form, we get
(i) ax² + 2hxy + by² = [x\ y]\begin{bmatrix} a & h \\ h & b \end{bmatrix}\begin{bmatrix} x \\ y \end{bmatrix}
(ii) ax² + by² + cz² + 2hxy + 2gyz + 2fzx = [x\ y\ z]\begin{bmatrix} a & h & f \\ h & b & g \\ f & g & c \end{bmatrix}\begin{bmatrix} x \\ y \\ z \end{bmatrix}
(iii) ax² + by² + cz² + dw² + 2hxy + 2gyz + 2fzx + 2lxw + 2myw + 2nzw = [x\ y\ z\ w]\begin{bmatrix} a & h & f & l \\ h & b & g & m \\ f & g & c & n \\ l & m & n & d \end{bmatrix}\begin{bmatrix} x \\ y \\ z \\ w \end{bmatrix}
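As a small sketch (assuming NumPy; the coefficient names follow the notes), the symmetric matrix of a three-variable quadratic form can be assembled and checked against the scalar expression X'AX:

```python
import numpy as np

# x^2 + 5y^2 + z^2 + 2xy + 2yz + 6zx, so a=1, b=5, c=1, h=1, g=1, f=3.
a, b, c, h, g, f = 1.0, 5.0, 1.0, 1.0, 1.0, 3.0
A = np.array([[a, h, f],
              [h, b, g],
              [f, g, c]])

X = np.array([2.0, -1.0, 3.0])   # an arbitrary test vector
form = (a*X[0]**2 + b*X[1]**2 + c*X[2]**2
        + 2*h*X[0]*X[1] + 2*g*X[1]*X[2] + 2*f*X[2]*X[0])
print(np.isclose(X @ A @ X, form))   # True
```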
1.11 NATURE OF A QUADRATIC FORM
A real quadratic form X'AX in n variables is said to be
i. Positive definite if all the eigen values of A > 0.
ii. Negative definite if all the eigen values of A < 0.
iii. Positive semidefinite if all the eigen values of A ≥ 0 and at least one eigen value = 0.
iv. Negative semidefinite if all the eigen values of A ≤ 0 and at least one eigen value = 0.
v. Indefinite if some of the eigen values of A are positive and others negative.
Example:-
Find the nature of the following quadratic forms:
i. x² + 5y² + z² + 2xy + 2yz + 6zx
ii. 3x² + 5y² + 3z² − 2yz + 2zx − 2xy
Solution:-
i. The matrix of the quadratic form is
A = \begin{bmatrix} 1 & 1 & 3 \\ 1 & 5 & 1 \\ 3 & 1 & 1 \end{bmatrix}
The eigen values of A are −2, 3, 6. Two of these eigen values being positive and one being negative, the given quadratic form is indefinite.
ii. The matrix of the quadratic form is
A = \begin{bmatrix} 3 & -1 & 1 \\ -1 & 5 & -1 \\ 1 & -1 & 3 \end{bmatrix}
The eigen values of A are 2, 3, 6. All these eigen values being positive, the given quadratic form is positive definite.
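A small classifier based on the eigen-value signs reproduces both conclusions (a sketch, assuming NumPy; the function nature is ours, not from the notes):

```python
import numpy as np

def nature(A, tol=1e-10):
    lam = np.linalg.eigvalsh(A)          # A is real symmetric
    if np.all(lam > tol):
        return "positive definite"
    if np.all(lam < -tol):
        return "negative definite"
    if np.all(lam >= -tol):
        return "positive semidefinite"
    if np.all(lam <= tol):
        return "negative semidefinite"
    return "indefinite"

print(nature(np.array([[1.0, 1, 3], [1, 5, 1], [3, 1, 1]])))        # indefinite
print(nature(np.array([[3.0, -1, 1], [-1, 5, -1], [1, -1, 3]])))    # positive definite
```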
1.12 REDUCTION OF QUADRATIC FORM TO CANONICAL FORM
A homogeneous expression of the second degree in any number of variables is called a quadratic form.
For instance, if
A = \begin{bmatrix} a & h & g \\ h & b & f \\ g & f & c \end{bmatrix}, X = \begin{bmatrix} x \\ y \\ z \end{bmatrix} and X' = [x\ y\ z],
then X'AX = ax² + by² + cz² + 2fyz + 2gzx + 2hxy … (i),
which is a quadratic form.
Let λ1, λ2, λ3 be the eigen values of the matrix A and
X1 = \begin{bmatrix} x_1 \\ y_1 \\ z_1 \end{bmatrix}, X2 = \begin{bmatrix} x_2 \\ y_2 \\ z_2 \end{bmatrix}, X3 = \begin{bmatrix} x_3 \\ y_3 \\ z_3 \end{bmatrix}
be its corresponding eigen vectors in the normalised form (i.e., each element is divided by the square root of the sum of the squares of all the three elements in the eigen vector).
Then B^{-1}AB = D, a diagonal matrix, where B = [X1 X2 X3].
Hence the quadratic form (i) is reduced to a sum of squares (i.e., canonical form)
λ1x² + λ2y² + λ3z²,
and B is the matrix of transformation, which is an orthogonal matrix.
Note:-
1. Here some of the λi may be positive or negative or zero.
2. If ρ(A) = r, then the quadratic form X'AX will contain only r terms.
1.13 INDEX AND SIGNATURE OF THE QUADRATIC FORM
The number p of positive terms in the canonical form is called the index of the quadratic form.
(The number of positive terms) − (the number of negative terms), i.e., p − (r − p) = 2p − r, is called the signature of the quadratic form, where ρ(A) = r.
1.14 LINEAR TRANSFORMATION OF A QUADRATIC FORM
Let X'AX be a quadratic form in n variables and let X = PY … (1), where P is a non-singular matrix, be the non-singular transformation.
From (1), X' = (PY)' = Y'P', and hence
X'AX = Y'P'APY = Y'(P'AP)Y = Y'BY … (2), where B = P'AP.
Therefore, Y'BY is also a quadratic form in n variables. Hence it is a linear transformation of the quadratic form X'AX under the linear transformation X = PY and B = P'AP.
Note. (i) Here B' = (P'AP)' = P'AP = B, so B is symmetric.
(ii) ρ(B) = ρ(A).
Therefore, A and B are congruent matrices.
Example:-
Reduce 3x² + 3z² + 4xy + 8xz + 8yz into canonical form.
Or
Diagonalise the quadratic form 3x² + 3z² + 4xy + 8xz + 8yz by linear transformation and write the linear transformation.
Or
Reduce the quadratic form 3x² + 3z² + 4xy + 8xz + 8yz into the sum of squares.
Solution:- The given quadratic form can be written as X'AX, where X = [x, y, z]' and the symmetric matrix
A = \begin{bmatrix} 3 & 2 & 4 \\ 2 & 0 & 4 \\ 4 & 4 & 3 \end{bmatrix}
Let us reduce A into a diagonal matrix. We know that A = I3 A I3, i.e.,
\begin{bmatrix} 3 & 2 & 4 \\ 2 & 0 & 4 \\ 4 & 4 & 3 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} A \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}
Operating R21(−2/3), R31(−4/3) (for A on the L.H.S. and the pre-factor on the R.H.S.), we get
\begin{bmatrix} 3 & 2 & 4 \\ 0 & -4/3 & 4/3 \\ 0 & 4/3 & -7/3 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ -2/3 & 1 & 0 \\ -4/3 & 0 & 1 \end{bmatrix} A \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}
Operating C21(−2/3), C31(−4/3) (for A on the L.H.S. and the post-factor on the R.H.S.), we get
\begin{bmatrix} 3 & 0 & 0 \\ 0 & -4/3 & 4/3 \\ 0 & 4/3 & -7/3 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ -2/3 & 1 & 0 \\ -4/3 & 0 & 1 \end{bmatrix} A \begin{bmatrix} 1 & -2/3 & -4/3 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}
Operating R32(1), we get
\begin{bmatrix} 3 & 0 & 0 \\ 0 & -4/3 & 4/3 \\ 0 & 0 & -1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ -2/3 & 1 & 0 \\ -2 & 1 & 1 \end{bmatrix} A \begin{bmatrix} 1 & -2/3 & -4/3 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}
Operating C32(1), we get
\begin{bmatrix} 3 & 0 & 0 \\ 0 & -4/3 & 0 \\ 0 & 0 & -1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ -2/3 & 1 & 0 \\ -2 & 1 & 1 \end{bmatrix} A \begin{bmatrix} 1 & -2/3 & -2 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{bmatrix}
or Diag(3, −4/3, −1) = P'AP.
The canonical form of the given quadratic form is
Y'(P'AP)Y = [y_1\ y_2\ y_3]\begin{bmatrix} 3 & 0 & 0 \\ 0 & -4/3 & 0 \\ 0 & 0 & -1 \end{bmatrix}\begin{bmatrix} y_1 \\ y_2 \\ y_3 \end{bmatrix} = 3y1² − (4/3)y2² − y3².
Here ρ(A) = 3, index = 1, signature = 1 − 2 = −1.
Note:- In this problem, the non-singular transformation which reduces the given quadratic form into the canonical form is X = PY,
i.e., \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 1 & -2/3 & -2 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} y_1 \\ y_2 \\ y_3 \end{bmatrix}.
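A one-line numerical check of this reduction (a sketch, assuming NumPy):

```python
import numpy as np

A = np.array([[3.0, 2.0, 4.0],
              [2.0, 0.0, 4.0],
              [4.0, 4.0, 3.0]])
P = np.array([[1.0, -2.0 / 3.0, -2.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

print(np.round(P.T @ A @ P, 10))   # Diag(3, -4/3, -1)
```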
1.15 REDUCTION OF QUADRATIC FORM TO CANONICAL FORM BY ORTHOGONAL TRANSFORMATION
Let X'AX be a given quadratic form. The modal matrix B of A is the matrix whose columns are characteristic vectors of A. If B represents the orthogonal matrix of A, then X = BY will reduce X'AX to Y' diag(λ1, λ2, …, λn) Y, where λ1, λ2, …, λn are the characteristic values of A.
Note. This method works successfully if the characteristic vectors of A are linearly independent and pairwise orthogonal.
Example 1:-
Reduce 8x² + 7y² + 3z² − 12xy + 4xz − 8yz into canonical form by orthogonal reduction.
Solution:- The matrix of the quadratic form is
A = \begin{bmatrix} 8 & -6 & 2 \\ -6 & 7 & -4 \\ 2 & -4 & 3 \end{bmatrix}
The characteristic roots of A are given by |A − λI| = 0,
i.e., \begin{vmatrix} 8-λ & -6 & 2 \\ -6 & 7-λ & -4 \\ 2 & -4 & 3-λ \end{vmatrix} = 0, or λ(λ − 3)(λ − 15) = 0, ∴ λ = 0, 3, 15.
The characteristic vector for λ = 0 is given by [A − (0)I]X1 = 0,
i.e., 8x1 − 6x2 + 2x3 = 0, −6x1 + 7x2 − 4x3 = 0, 2x1 − 4x2 + 3x3 = 0.
Solving the first two, we get x1/1 = x2/2 = x3/2, giving the eigen vector X1 = k1(1, 2, 2)'.
When λ = 3, the corresponding characteristic vector is given by [A − 3I]X2 = 0,
i.e., 5x1 − 6x2 + 2x3 = 0, −6x1 + 4x2 − 4x3 = 0, 2x1 − 4x2 = 0.
Solving any two equations, we get X2 = k2(2, 1, −2)'.
Similarly, the characteristic vector corresponding to λ = 15 is X3 = k3(2, −2, 1)'.
Now, X1, X2, X3 are pairwise orthogonal, i.e., X1·X2 = X2·X3 = X3·X1 = 0.
∴ The normalised modal matrix is
B = \begin{bmatrix} 2/3 & 1/3 & 2/3 \\ 1/3 & 2/3 & -2/3 \\ -2/3 & 2/3 & 1/3 \end{bmatrix}
Now B is an orthogonal matrix and |B| = 1,
i.e., B^T = B^{-1} and B^{-1}AB = D = diag{3, 0, 15},
i.e., \begin{bmatrix} 2/3 & 1/3 & -2/3 \\ 1/3 & 2/3 & 2/3 \\ 2/3 & -2/3 & 1/3 \end{bmatrix} A \begin{bmatrix} 2/3 & 1/3 & 2/3 \\ 1/3 & 2/3 & -2/3 \\ -2/3 & 2/3 & 1/3 \end{bmatrix} = \begin{bmatrix} 3 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 15 \end{bmatrix}
X'AX = Y'(B^{-1}AB)Y = Y'DY = [y_1\ y_2\ y_3]\begin{bmatrix} 3 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 15 \end{bmatrix}\begin{bmatrix} y_1 \\ y_2 \\ y_3 \end{bmatrix} = 3y1² + 0·y2² + 15y3²,
which is the required canonical form.
Note. Here the orthogonal transformation is X = BY; rank of the quadratic form = 2, index = 2, signature = 2. Since one eigen value is zero and the remaining eigen values are positive, the form is positive semidefinite.
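The orthogonal reduction, together with the rank, index and signature, can be reproduced numerically (a sketch, assuming NumPy):

```python
import numpy as np

A = np.array([[8.0, -6.0, 2.0],
              [-6.0, 7.0, -4.0],
              [2.0, -4.0, 3.0]])
lam, B = np.linalg.eigh(A)              # eigen values 0, 3, 15; B is orthogonal

print(np.round(B.T @ A @ B, 10))        # diag(0, 3, 15): the canonical coefficients

rank = int(np.sum(np.abs(lam) > 1e-10))     # 2
index = int(np.sum(lam > 1e-10))            # 2
signature = 2 * index - rank                # 2
print(rank, index, signature)
```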
TEST YOUR KNOWLEDGE
1. If A = \begin{bmatrix} -1 & 2 & 3 \\ 0 & 3 & 2 \\ 0 & 0 & -2 \end{bmatrix}, then find the eigen values of (A − 3I)².
2. Write the matrix of the quadratic form 2x1x2 + 2x2x3 − 2x3x1.
3. Obtain the characteristic equation of the matrix whose eigen values are 1, −2 and 3.
4. If A = \begin{bmatrix} 1 & 6 & 0 \\ 0 & 2 & 3 \\ 0 & 0 & -2 \end{bmatrix}, then find the eigen values of 3A³ + 5A² − 6A + 2I.
5. Write the quadratic form corresponding to the following symmetric matrix: \begin{bmatrix} 0 & 1 & 2 \\ 1 & 1 & -4 \\ 2 & -4 & 3 \end{bmatrix}.
6. Find the sum of the eigen values of the inverse of \begin{bmatrix} 1 & 0 & 0 \\ 2 & -3 & 0 \\ 0 & 5 & 2 \end{bmatrix}.
7. Obtain the latent roots of A⁴, where A = \begin{bmatrix} 5 & 4 \\ 1 & 2 \end{bmatrix}.
8. If A is an idempotent matrix, then A² = A. What will be the eigen values of A?
9. If λ1, λ2 and λ3 are the eigen values of the matrix A, whose characteristic equation is λ³ + λ² − 21λ − 45 = 0, obtain λ1λ2 + λ2λ3 + λ3λ1 using the property.
10. (i) Using Cayley–Hamilton theorem, find the inverse of the matrix A = \begin{bmatrix} 1 & 0 & 3 \\ 2 & 1 & -1 \\ 1 & -1 & 1 \end{bmatrix}.
(ii) Find the characteristic roots and characteristic vectors of the matrix A = \begin{bmatrix} 1 & 1 & 3 \\ 1 & 5 & 1 \\ 3 & 1 & 1 \end{bmatrix}.
11. Reduce the quadratic form x² + 5y² + z² + 2xy + 2yz + 6xz to the canonical form by orthogonal transformation. Also specify the matrix of transformation. Obtain its index, signature and the nature of the quadratic form.
12. (i) Find the eigen values and eigen vectors of the matrix \begin{bmatrix} 11 & -4 & -7 \\ 7 & -2 & -5 \\ 10 & -4 & -6 \end{bmatrix}.
(ii) Using Cayley–Hamilton theorem, find the inverse of \begin{bmatrix} 1 & 0 & 3 \\ 2 & 1 & -1 \\ 1 & -1 & 1 \end{bmatrix}.
13. Discuss the nature, index and signature of the quadratic form 10x² + 2y² + 5z² + 6yz − 10zx − 4xy.
14. Diagonalise the matrix A = \begin{bmatrix} 8 & -6 & 2 \\ -6 & 7 & -4 \\ 2 & -4 & 3 \end{bmatrix} by orthogonal reduction and provide the normalised modal matrix.
15. Reduce the quadratic form 2x1x2 + 2x1x3 − 2x2x3 to the canonical form by an orthogonal transformation.
16. (i) Find the eigen values and eigen vectors of the matrix \begin{bmatrix} 2 & 1 & 0 \\ 0 & 2 & 1 \\ 0 & 0 & 2 \end{bmatrix}.
(ii) For A = \begin{bmatrix} 1 & 0 & 3 \\ 2 & 1 & -1 \\ 1 & -1 & 1 \end{bmatrix}, compute the value of A⁶ − 5A⁵ + 8A⁴ − 2A³ − 9A² + 31A − 36I using Cayley–Hamilton theorem.
17. Reduce the quadratic form 3x1² + 5x2² + 3x3² − 2x2x3 + 2x3x1 − 2x1x2 to a canonical form by orthogonal reduction.
THANK YOU

Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024
 
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
 
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure serviceWhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
 

Maths

  • 18. (7) Diagonal Matrix:- A square matrix in which all non-diagonal elements are zero is called a diagonal matrix, i.e., A = [aij] n×n is a diagonal matrix if aij = 0 for i ≠ j.
Example:- [ 2 0 0 ; 0 -1 0 ; 0 0 0 ] is a diagonal matrix.
  • 19. (8) Scalar Matrix:- A diagonal matrix in which all the diagonal elements are equal to a scalar, say k, is called a scalar matrix, i.e., A = [aij] n×n is a scalar matrix if aij = 0 when i ≠ j and aij = k when i = j.
Example:- [ 2 0 0 ; 0 2 0 ; 0 0 2 ] is a scalar matrix.
  • 20. (9) Unit Matrix or Identity Matrix:- A scalar matrix in which each diagonal element is 1 is called a unit or identity matrix. It is denoted by In, i.e., A = [aij] n×n is a unit matrix if aij = 0 when i ≠ j and aij = 1 when i = j.
Example:- [ 1 0 ; 0 1 ] is a unit matrix.
  • 21. (10) Upper Triangular Matrix:- A square matrix in which all the elements below the principal diagonal are zero is called an upper triangular matrix, i.e., A = [aij] n×n is an upper triangular matrix if aij = 0 for i > j.
Example:- [ 2 3 4 ; 0 1 5 ; 0 0 -3 ] is an upper triangular matrix.
  • 22. (11) Lower Triangular Matrix:- A square matrix in which all the elements above the principal diagonal are zero is called a lower triangular matrix, i.e., A = [aij] n×n is a lower triangular matrix if aij = 0 for i < j.
Example:- [ 1 0 0 ; 5 6 0 ; -3 2 0 ] is a lower triangular matrix.
  • 23. (12) Triangular Matrix:- A triangular matrix is either upper triangular or lower triangular.
(13) Single Element Matrix:- A matrix having only one element is called a single element matrix, e.g., [3] is a single element matrix.
  • 24. (14) Equal Matrices:- Two matrices A and B are said to be equal iff they have the same order and their corresponding elements are equal, i.e., if A = [aij] m×n and B = [bij] p×q, then A = B iff (a) m = p and n = q, and (b) aij = bij for all i and j.
  • 25. (15) Singular and Non-Singular Matrices:- A square matrix A is said to be singular if |A| = 0 and non-singular if |A| ≠ 0.
Example:- A = [ 2 3 -4 ; 2 3 -4 ; 0 1 -1 ] is a singular matrix since |A| = 0.
  • 26. 1.3 CHARACTERISTIC EQUATION
If A is any square matrix of order n, we can form the matrix A − λI, where I is the nth order unit matrix. The determinant of this matrix equated to zero, i.e.,
|A − λI| = | a11−λ  a12  ...  a1n ; a21  a22−λ  ...  a2n ; ... ; an1  an2  ...  ann−λ | = 0
  • 27. is called the characteristic equation of A. On expanding the determinant, we get
(−1)ⁿ λⁿ + k1 λ^(n−1) + k2 λ^(n−2) + ... + kn = 0,
where the k's are expressible in terms of the elements aij. The roots of this equation are called characteristic roots or latent roots or eigen values of the matrix A.
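The slides do not use software, but a short numerical check can make the definition concrete. The sketch below (an assumption, not part of the original deck) uses NumPy's np.poly to obtain the coefficients of det(λI − A) for the matrix that reappears in Problem 1 of Section 1.5, and np.roots to recover its characteristic roots.

```python
# A minimal sketch (assumes NumPy is available) of the characteristic equation.
import numpy as np

A = np.array([[-2.,  2., -3.],
              [ 2.,  1., -6.],
              [-1., -2.,  0.]])

coeffs = np.poly(A)         # coefficients of det(lambda*I - A): [1, 1, -21, -45]
roots = np.roots(coeffs)    # characteristic roots (eigen values): 5, -3, -3
print(coeffs, roots)
```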
  • 28. 1.4 EIGEN VECTORS
Consider the linear transformation Y = AX ...(1), which transforms the column vector X into the column vector Y. We are often required to find those vectors X which transform into scalar multiples of themselves. Let X be such a vector which transforms into λX by the transformation (1).
  • 29. Then Y = λX ...(2). From (1) and (2), AX = λX ⇒ AX − λIX = 0 ⇒ (A − λI)X = 0 ...(3). This matrix equation gives n homogeneous linear equations
(a11 − λ)x1 + a12 x2 + ... + a1n xn = 0
a21 x1 + (a22 − λ)x2 + ... + a2n xn = 0
......
an1 x1 + an2 x2 + ... + (ann − λ)xn = 0    ...(4)
  • 30. These equations will have a non-trivial solution only if the coefficient matrix A − λI is singular, i.e., if |A − λI| = 0 ...(5). Corresponding to each root λ of (5), the homogeneous system (3) has a non-zero solution X = [x1, x2, ..., xn]', which is called an eigen vector or latent vector.
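As a small illustration of the defining relation AX = λX, the following sketch (assumed code, not from the slides) computes the eigen values and eigen vectors of the 2×2 matrix used in Problem 6 below and checks the relation numerically.

```python
# A sketch assuming NumPy: eigen values / eigen vectors of a 2x2 matrix.
import numpy as np

A = np.array([[ 1., -2.],
              [-5.,  4.]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigen vectors
print(eigvals)                        # approximately 6 and -1

# Check A X = lambda X for each (eigen value, eigen vector) pair.
for lam, x in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ x, lam * x))   # True, True
```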
  • 31. Properties of Eigen Values:-
1. The sum of the eigen values of a matrix is the sum of the elements of the principal diagonal.
2. The product of the eigen values of a matrix A is equal to its determinant.
3. If λ is an eigen value of a matrix A, then 1/λ is an eigen value of A⁻¹.
4. If λ is an eigen value of an orthogonal matrix, then 1/λ is also its eigen value.
  • 32. PROPERTY 1:- If λ1, λ2, ..., λn are the eigen values of A, then
(i) kλ1, kλ2, ..., kλn are the eigen values of the matrix kA, where k is a non-zero scalar;
(ii) 1/λ1, 1/λ2, ..., 1/λn are the eigen values of the inverse matrix A⁻¹;
(iii) λ1^p, λ2^p, ..., λn^p are the eigen values of A^p, where p is any positive integer.
  • 33. Proof:- (i) Let λr be an eigen value of A and Xr the corresponding eigen vector. Then, by definition, AXr = λrXr. Multiplying both sides by k, (kA)Xr = (kλr)Xr. Hence kλr is an eigen value of kA and the corresponding eigen vector is the same as that of λr, namely Xr.
  • 34. (ii) Pre-multiplying both sides of AXr = λrXr by A⁻¹: A⁻¹(AXr) = A⁻¹(λrXr) ⇒ Xr = λr(A⁻¹Xr) ⇒ A⁻¹Xr = (1/λr)Xr. Hence 1/λr is an eigen value of A⁻¹ and the corresponding eigen vector is the same as that of λr, namely Xr.
  • 35. (iii) Pre-multiplying both sides of AXr = λrXr by A: A(AXr) = A(λrXr) ⇒ A²Xr = λr(AXr) = λr(λrXr) = λr²Xr. Similarly, A³Xr = λr³Xr, ..., A^p Xr = λr^p Xr, where p is any positive integer. Hence λr^p is an eigen value of A^p and the corresponding eigen vector is the same as that of λr, namely Xr.
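A rough numerical verification of Property 1 can be run with NumPy (this sketch and the triangular test matrix, taken from Problem 5 below, are an illustration rather than part of the slides).

```python
# Checking Property 1 numerically (assumes NumPy).
import numpy as np

A = np.array([[2., 7., 5.],
              [0., -1., 2.],
              [0., 0., 4.]])
lam = np.linalg.eigvals(A)                                  # 2, -1, 4

print(np.sort(np.linalg.eigvals(3 * A)))                    # k*lam  -> -3, 6, 12
print(np.sort(np.linalg.eigvals(np.linalg.inv(A))))         # 1/lam  -> -1, 0.25, 0.5
print(np.sort(np.linalg.eigvals(np.linalg.matrix_power(A, 3))))  # lam**3 -> -1, 8, 64
```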
  • 36. THEOREM:- A matrix A is singular if and only if 0 is an eigen value of A.
1.5 PROBLEMS
1. Find the sum and product of the eigen values of the matrix A = [ -2 2 -3 ; 2 1 -6 ; -1 -2 0 ] without finding the eigen values.
  • 37. Solution:- Sum of the eigen values of A = sum of its diagonal elements = -2 + 1 + 0 = -1. Product of the eigen values of A = |A| = 45.
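A quick check of Problem 1 (a sketch assuming NumPy): the sum of the eigen values equals the trace and their product equals the determinant.

```python
import numpy as np

A = np.array([[-2., 2., -3.],
              [2., 1., -6.],
              [-1., -2., 0.]])
lam = np.linalg.eigvals(A)
print(np.trace(A), lam.sum())            # -1.0  -1.0
print(np.linalg.det(A), np.prod(lam))    # 45.0  45.0
```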
  • 38. 2. Two eigen values of the matrix A = [ 2 2 1 ; 1 3 1 ; 1 2 2 ] are equal to 1 each. Find the third eigen value.
Solution:- Let a be the third eigen value of A. Since the sum of the eigen values = the sum of the diagonal elements, 1 + 1 + a = 2 + 3 + 2 ⇒ a = 5. Therefore, the third eigen value of A is 5.
  • 39. 3. The product of two eigen values of the matrix A = [ 6 -2 2 ; -2 3 -1 ; 2 -1 3 ] is 16. Find the third eigen value.
Solution:- Let a be the third eigen value of A. Since the product of the eigen values = |A|, 16a = |A| = 32. Therefore, a = 2.
  • 40. 4. Find the sum of the eigen values of the inverse of A = [ 1 0 0 ; 2 -3 0 ; 0 5 2 ].
Solution:- The eigen values of the lower triangular matrix A are 1, -3, 2. Then the eigen values of A⁻¹ are 1, -1/3, 1/2. Sum of the eigen values of A⁻¹ = 1 − 1/3 + 1/2 = 7/6.
  • 41. 5. If A = [ 2 7 5 ; 0 -1 2 ; 0 0 4 ], find the eigen values of 3A, A⁻¹ and -2A⁻¹.
Solution:- The eigen values of A are 2, -1, 4. The eigen values of 3A are 3×2, 3×(-1), 3×4, i.e., 6, -3, 12.
  • 42. The eigen values of A⁻¹ are 1/2, -1, 1/4. The eigen values of -2A⁻¹ are (-2)(1/2), (-2)(-1), (-2)(1/4), i.e., -1, 2, -1/2.
  • 43. 6. Find the eigen values and eigen vectors of the matrix A = [ 1 -2 ; -5 4 ].
Solution:- The characteristic equation of the given matrix is |A − λI| = 0,
  • 44. i.e., | 1−λ -2 ; -5 4−λ | = 0 ⇒ λ² − 5λ − 6 = 0 ⇒ λ = 6, -1. Thus the eigen values of A are 6, -1. Corresponding to λ = 6, the eigen vectors are given by (A − 6I)X1 = 0, i.e., [ -5 -2 ; -5 -2 ] [ x1 ; x2 ] = 0.
  • 45. We get only one independent equation -5x1 − 2x2 = 0 ⇒ x1/2 = x2/(-5) = k1 (say) ⇒ x1 = 2k1, x2 = -5k1. The eigen vectors are X1 = k1[ 2 ; -5 ].
  • 46. Corresponding to λ = -1, the eigen vectors are given by (A + I)X2 = 0 ⇒ [ 2 -2 ; -5 5 ] [ x1 ; x2 ] = 0 ⇒ x1 − x2 = 0 ⇒ x1/1 = x2/1 = k2 (say) ⇒ x1 = k2, x2 = k2. The eigen vectors are X2 = k2[ 1 ; 1 ].
  • 47. 7. Find the eigen values and eigen vectors of the matrix A = [ -2 2 -3 ; 2 1 -6 ; -1 -2 0 ].
Solution:- The characteristic equation of the given matrix is |A − λI| = 0, i.e., | -2−λ 2 -3 ; 2 1−λ -6 ; -1 -2 −λ | = 0. Expanding, the eigen values of A are λ = 5, -3, -3.
  • 49. Corresponding to λ = -3, the eigen vectors are given by (A + 3I)X1 = 0, i.e., [ 1 2 -3 ; 2 4 -6 ; -1 -2 3 ] [ x1 ; x2 ; x3 ] = 0. We get only one independent equation x1 + 2x2 − 3x3 = 0. Let x2 = k1 and x3 = k2; then x1 = 3k2 − 2k1. The eigen vectors are given by X1 = k1[ -2 ; 1 ; 0 ] + k2[ 3 ; 0 ; 1 ].
  • 50. Corresponding to λ = 5, the eigen vectors are given by (A − 5I)X2 = 0, i.e., [ -7 2 -3 ; 2 -4 -6 ; -1 -2 -5 ] [ x1 ; x2 ; x3 ] = 0 ⇒ -7x1 + 2x2 − 3x3 = 0, 2x1 − 4x2 − 6x3 = 0, -x1 − 2x2 − 5x3 = 0, giving X2 = k(1, 2, -1)'.
  • 52. 8. Find the eigen values and eigen vectors of the matrix A = [ 8 -6 2 ; -6 7 -4 ; 2 -4 3 ].
Solution:- The characteristic equation is |A − λI| = 0 ⇒ | 8−λ -6 2 ; -6 7−λ -4 ; 2 -4 3−λ | = 0 ⇒ λ³ − 18λ² + 45λ = 0 ⇒ λ = 0, 3, 15. Hence the eigen values are 0, 3, 15.
  • 53. The eigen vector X1 corresponding to λ1 = 0 is given by (A − λ1 I)X1 = 0, where X1 = [ x1 ; x2 ; x3 ]: [ 8 -6 2 ; -6 7 -4 ; 2 -4 3 ] [ x1 ; x2 ; x3 ] = 0 ⇒ 8x1 − 6x2 + 2x3 = 0 ...(1), -6x1 + 7x2 − 4x3 = 0 ...(2), 2x1 − 4x2 + 3x3 = 0 ...(3).
  • 54. Solving equations (1) and (2) by cross-multiplication, we get x1/(24 − 14) = x2/(-12 + 32) = x3/(56 − 36), i.e., x1/1 = x2/2 = x3/2 = k1 (say), k1 ≠ 0 ⇒ x1 = k1, x2 = 2k1, x3 = 2k1, which satisfy equation (3) also. The required eigen vector corresponding to λ1 = 0 is X1 = k1[ 1 ; 2 ; 2 ].
  • 55. The eigen vector X2 corresponding to λ2 = 3 is given by (A − 3I)X2 = 0, where X2 = [ x1 ; x2 ; x3 ]: [ 5 -6 2 ; -6 4 -4 ; 2 -4 0 ] [ x1 ; x2 ; x3 ] = 0 ⇒ 5x1 − 6x2 + 2x3 = 0 ...(4), -6x1 + 4x2 − 4x3 = 0 ...(5), 2x1 − 4x2 = 0 ...(6).
  • 56. Solving equations (4) and (5) by cross-multiplication, we get x1/(24 − 8) = x2/(-12 + 20) = x3/(20 − 36), i.e., x1/2 = x2/1 = x3/(-2) = k2 (say), k2 ≠ 0 ⇒ x1 = 2k2, x2 = k2, x3 = -2k2, which satisfy equation (6) also. The required eigen vector corresponding to λ2 = 3 is X2 = k2[ 2 ; 1 ; -2 ].
  • 57. The eigen vector X3 corresponding to λ3 = 15 is given by (A − 15I)X3 = 0, where X3 = [ x1 ; x2 ; x3 ]: [ -7 -6 2 ; -6 -8 -4 ; 2 -4 -12 ] [ x1 ; x2 ; x3 ] = 0 ⇒ -7x1 − 6x2 + 2x3 = 0 ...(7), -6x1 − 8x2 − 4x3 = 0 ...(8), 2x1 − 4x2 − 12x3 = 0 ...(9).
  • 58. Solving equations (7) and (8) by cross-multiplication, we get x1/(24 + 16) = x2/(-12 − 28) = x3/(56 − 36), i.e., x1/2 = x2/(-2) = x3/1 = k3 (say), k3 ≠ 0 ⇒ x1 = 2k3, x2 = -2k3, x3 = k3, which satisfy equation (9) also. The required eigen vector corresponding to λ3 = 15 is X3 = k3[ 2 ; -2 ; 1 ].
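Since the matrix in Problem 8 is real symmetric, its eigen vectors are pairwise orthogonal; the following sketch (assumes NumPy) checks both the eigen values 0, 3, 15 and the orthogonality numerically.

```python
import numpy as np

A = np.array([[8., -6., 2.],
              [-6., 7., -4.],
              [2., -4., 3.]])

lam, V = np.linalg.eigh(A)     # eigh handles real symmetric matrices
print(np.round(lam, 6))        # approximately [0, 3, 15]

# Columns of V are orthonormal eigen vectors, so V' V = I and V' A V is diagonal.
print(np.allclose(V.T @ V, np.eye(3)))          # True
print(np.allclose(V.T @ A @ V, np.diag(lam)))   # True
```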
  • 59. 9. Find the eigen values and eigen vectors of the matrix A = [ 6 -2 2 ; -2 3 -1 ; 2 -1 3 ].
Solution:- The characteristic equation is |A − λI| = 0 ⇒ | 6−λ -2 2 ; -2 3−λ -1 ; 2 -1 3−λ | = 0 ⇒ λ³ − 12λ² + 36λ − 32 = 0 ⇒ λ = 2, 2, 8. Hence the eigen values are 2, 2, 8.
  • 60. The eigen vector corresponding to λ1 = λ2 = 2 is given by (A − 2I)X1 = 0, where X1 = [ x1 ; x2 ; x3 ]: [ 4 -2 2 ; -2 1 -1 ; 2 -1 1 ] [ x1 ; x2 ; x3 ] = 0 ⇒ 4x1 − 2x2 + 2x3 = 0, -2x1 + x2 − x3 = 0, 2x1 − x2 + x3 = 0.
  • 61. These equations are equivalent to the single equation 2x1 − x2 + x3 = 0 ...(1). Let x3 = 2k3 and x2 = 2k2; then from (1), 2x1 − 2k2 + 2k3 = 0 ⇒ x1 = k2 − k3. The required eigen vector corresponding to λ1 = λ2 = 2 is X1 = [ k2 − k3 ; 2k2 ; 2k3 ] = k2[ 1 ; 2 ; 0 ] + k3[ -1 ; 0 ; 2 ].
  • 62. Similarly, the eigen vector corresponding to λ3 = 8 is given by (A − 8I)X3 = 0 ⇒ [ -2 -2 2 ; -2 -5 -1 ; 2 -1 -5 ] [ x1 ; x2 ; x3 ] = 0 ⇒ -2x1 − 2x2 + 2x3 = 0 ...(2), -2x1 − 5x2 − x3 = 0 ...(3), 2x1 − x2 − 5x3 = 0 ...(4).
  • 63. Solving equations (2) and (3) by cross-multiplication, we get x1/12 = x2/(-6) = x3/6, i.e., x1/2 = x2/(-1) = x3/1 = k1 (say), which satisfies equation (4) also. The required eigen vector corresponding to λ3 = 8 is X3 = k1[ 2 ; -1 ; 1 ].
  • 64. Example 1:- Show that if λ1, λ2, ..., λn are the latent roots of the matrix A, then A³ has the latent roots λ1³, λ2³, ..., λn³.
Solution:- Let λ be a latent root of the matrix A. Then there exists a non-zero vector X such that
  • 65. AX = λX ...(1) ⇒ A²(AX) = A²(λX) ⇒ A³X = λ(A²X) [using (1)]. But A²X = A(AX) = A(λX) = λ(AX) = λ(λX) = λ²X. Therefore A³X = λ(λ²X) = λ³X ⇒ λ³ is a latent root of A³.
  • 66. Therefore, if λ1, λ2, ..., λn are the latent roots of the matrix A, then λ1³, λ2³, ..., λn³ are the latent roots of A³.
Example 2:- If λ1, λ2, ..., λn are the eigen values of A, then find the eigen values of the matrix (A − λI)².
Solution:- (A − λI)² = A² − 2λAI + λ²I² = A² − 2λA + λ²I.
  • 67. The eigen values of A² are λ1², λ2², ..., λn². The eigen values of 2λA are 2λλ1, 2λλ2, ..., 2λλn. The eigen values of λ²I are λ². Therefore the eigen values of (A − λI)² are λ1² − 2λλ1 + λ², ..., λn² − 2λλn + λ², i.e., (λ1 − λ)², (λ2 − λ)², ..., (λn − λ)².
  • 68. Example 3:- Find the eigen values and eigen vectors of the matrix A = [ 3 1 4 ; 0 2 6 ; 0 0 5 ].
Solution:- The characteristic equation is |A − λI| = 0 ⇒ | 3−λ 1 4 ; 0 2−λ 6 ; 0 0 5−λ | = 0 ⇒ (3−λ)(2−λ)(5−λ) = 0 ⇒ λ = 3, 2, 5. Hence the eigen values are 3, 2, 5.
  • 69. The eigen vector X1 corresponding to λ1 = 3 is given by (A − 3I)X1 = 0, where X1 = [ x1 ; x2 ; x3 ]: [ 0 1 4 ; 0 -1 6 ; 0 0 2 ] [ x1 ; x2 ; x3 ] = 0 ⇒ x2 + 4x3 = 0, -x2 + 6x3 = 0, 2x3 = 0 ⇒ x3 = 0, x2 = 0.
  • 70. The characteristic vector corresponding to the eigen value λ1 = 3 is X1 = [ k1 ; 0 ; 0 ] = k1[ 1 ; 0 ; 0 ], where k1 ≠ 0. When λ2 = 2, let X2 be the eigen vector; then (A − 2I)X2 = 0, where X2 = [x1 x2 x3]', i.e., [ 1 1 4 ; 0 0 6 ; 0 0 3 ] [ x1 ; x2 ; x3 ] = 0 ⇒ x3 = 0 and x1 + x2 = 0, giving X2 = k2[ 1 ; -1 ; 0 ].
  • 72. When λ3 = 5, let X3 be the eigen vector; then (A − 5I)X3 = 0, where X3 = [x1 x2 x3]', i.e., [ -2 1 4 ; 0 -3 6 ; 0 0 0 ] [ x1 ; x2 ; x3 ] = 0 ⇒ -2x1 + x2 + 4x3 = 0, -3x2 + 6x3 = 0. Solving the above equations by cross-multiplication, we get
  • 73. x1/(6 + 12) = x2/12 = x3/6, i.e., x1/3 = x2/2 = x3/1 = k3 (say), k3 ≠ 0 ⇒ x1 = 3k3, x2 = 2k3, x3 = k3. The required eigen vector is X3 = k3[ 3 ; 2 ; 1 ].
  • 74. 1.6 CAYLEY HAMILTON THEOREM
Every square matrix satisfies its own characteristic equation. Let A = [aij] n×n be a square matrix, A = [ a11 a12 ... a1n ; a21 a22 ... a2n ; ... ; an1 an2 ... ann ].
  • 75. Let the characteristic polynomial of A be φ(λ). Then φ(λ) = |A − λI| = | a11−λ a12 ... a1n ; a21 a22−λ ... a2n ; ... ; an1 an2 ... ann−λ |, and the characteristic equation is |A − λI| = 0.
  • 76. We are to prove that if the characteristic equation is p0λⁿ + p1λ^(n−1) + p2λ^(n−2) + ... + pn = 0, then p0Aⁿ + p1A^(n−1) + p2A^(n−2) + ... + pnI = 0 ...(1).
Note 1:- Pre-multiplying equation (1) by A⁻¹, we have 0 = p0A^(n−1) + p1A^(n−2) + p2A^(n−3) + ... + p(n−1)I + pnA⁻¹, so that A⁻¹ = −(1/pn)[p0A^(n−1) + p1A^(n−2) + p2A^(n−3) + ... + p(n−1)I].
  • 77. This result gives the inverse of A in terms of (n−1) powers of A and is considered a practical method for the computation of the inverse of large matrices.
Note 2:- If m is a positive integer such that m > n, then any positive integral power A^m of A is linearly expressible in terms of powers of A of lower degree.
  • 78. Example 1:- Verify Cayley-Hamilton theorem for the matrix A = [ 2 -1 1 ; -1 2 -1 ; 1 -1 2 ]. Hence compute A⁻¹.
Solution:- The characteristic equation of A is |A − λI| = 0, i.e., | 2−λ -1 1 ; -1 2−λ -1 ; 1 -1 2−λ | = 0, or λ³ − 6λ² + 9λ − 4 = 0 (on simplification).
  • 79. To verify Cayley-Hamilton theorem, we have to show that A³ − 6A² + 9A − 4I = 0 ...(1). Now,
A² = A × A = [ 6 -5 5 ; -5 6 -5 ; 5 -5 6 ] and A³ = A² × A = [ 22 -21 21 ; -21 22 -21 ; 21 -21 22 ].
  • 80. Therefore A³ − 6A² + 9A − 4I = [ 22 -21 21 ; -21 22 -21 ; 21 -21 22 ] − 6[ 6 -5 5 ; -5 6 -5 ; 5 -5 6 ] + 9[ 2 -1 1 ; -1 2 -1 ; 1 -1 2 ] − 4[ 1 0 0 ; 0 1 0 ; 0 0 1 ] = [ 0 0 0 ; 0 0 0 ; 0 0 0 ]. This verifies Cayley-Hamilton theorem.
  • 81. Now, pre-multiplying both sides of (1) by A⁻¹, we have A² − 6A + 9I − 4A⁻¹ = 0 ⇒ 4A⁻¹ = A² − 6A + 9I = [ 6 -5 5 ; -5 6 -5 ; 5 -5 6 ] − 6[ 2 -1 1 ; -1 2 -1 ; 1 -1 2 ] + 9[ 1 0 0 ; 0 1 0 ; 0 0 1 ] = [ 3 1 -1 ; 1 3 1 ; -1 1 3 ]. Therefore A⁻¹ = (1/4)[ 3 1 -1 ; 1 3 1 ; -1 1 3 ].
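The verification in Example 1 is easy to repeat numerically; the sketch below (assumes NumPy) checks A³ − 6A² + 9A − 4I = 0 and recovers A⁻¹ from the Cayley-Hamilton relation.

```python
import numpy as np

A = np.array([[2., -1., 1.],
              [-1., 2., -1.],
              [1., -1., 2.]])
I = np.eye(3)
A2 = A @ A
A3 = A2 @ A

print(np.allclose(A3 - 6*A2 + 9*A - 4*I, np.zeros((3, 3))))   # True
A_inv = (A2 - 6*A + 9*I) / 4        # from 4*A^{-1} = A^2 - 6A + 9I
print(np.allclose(A @ A_inv, I))    # True
print(4 * A_inv)                    # [[3 1 -1], [1 3 1], [-1 1 3]]
```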
  • 82. Example 2:- Given A = [ 1 -2 -1 ; 0 1 1 ; 3 1 1 ], find adj A by using Cayley-Hamilton theorem.
Solution:- The characteristic equation of the given matrix A is |A − λI| = 0, i.e., | 1−λ -2 -1 ; 0 1−λ 1 ; 3 1 1−λ | = 0, or λ³ − 3λ² + 5λ + 3 = 0 (on simplification).
  • 83. By Cayley-Hamilton theorem, A should satisfy A³ − 3A² + 5A + 3I = 0. Pre-multiplying by A⁻¹, we get A² − 3A + 5I + 3A⁻¹ = 0 ⇒ A⁻¹ = −(1/3)(A² − 3A + 5I) ...(1). Now, A² = A·A = [ -2 -5 -4 ; 3 2 2 ; 6 -4 -1 ] and 3A = [ 3 -6 -3 ; 0 3 3 ; 9 3 3 ], so A² − 3A + 5I = [ 0 1 -1 ; 3 4 -1 ; -3 -7 1 ]. Since |A| = -3, adj A = |A| A⁻¹ = (-3)(−1/3)(A² − 3A + 5I) = A² − 3A + 5I = [ 0 1 -1 ; 3 4 -1 ; -3 -7 1 ].
  • 86. 1.7 DIAGONALISATION OF A MATRIX
Diagonalisation of a matrix A is the process of reducing A to a diagonal form. If A is related to D by a similarity transformation D = M⁻¹AM, then A is reduced to the diagonal matrix D through the modal matrix M. D is also called the spectral matrix of A.
  • 87. 1.8 REDUCTION OF A MATRIX TO DIAGONAL FORM
If a square matrix A of order n has n linearly independent eigen vectors, then a matrix B can be found such that B⁻¹AB is a diagonal matrix.
Note:- The matrix B which diagonalises A is called the modal matrix of A and is obtained by grouping the eigen vectors of A into a square matrix.
  • 88. Similarity of matrices:- A square matrix B of order n is said to be similar to a square matrix A of order n if B = M⁻¹AM for some non-singular matrix M. This transformation of a matrix A by a non-singular matrix M to B is called a similarity transformation.
Note:- If the matrix B is similar to the matrix A, then B has the same eigen values as A.
  • 89. Example:- Reduce the matrix A = [ 1 -1 2 ; 0 2 -1 ; 0 0 3 ] to diagonal form by a similarity transformation. Hence find A³.
Solution:- The characteristic equation is | 1−λ -1 2 ; 0 2−λ -1 ; 0 0 3−λ | = 0 ⇒ λ = 1, 2, 3. Hence the eigen values of A are 1, 2, 3.
  • 90. Corresponding to λ = 1, let X1 = [ x1 ; x2 ; x3 ] be the eigen vector; then (A − I)X1 = 0 ⇒ [ 0 -1 2 ; 0 1 -1 ; 0 0 2 ] [ x1 ; x2 ; x3 ] = 0 ⇒ -x2 + 2x3 = 0, x2 − x3 = 0, 2x3 = 0 ⇒ x3 = 0, x2 = 0, x1 = k1. Therefore X1 = k1[ 1 ; 0 ; 0 ].
  • 91. Corresponding to λ = 2, let X2 = [ x1 ; x2 ; x3 ] be the eigen vector; then (A − 2I)X2 = 0 ⇒ [ -1 -1 2 ; 0 0 -1 ; 0 0 1 ] [ x1 ; x2 ; x3 ] = 0 ⇒ -x1 − x2 + 2x3 = 0, x3 = 0 ⇒ x1 = k2, x2 = -k2. Therefore X2 = k2[ 1 ; -1 ; 0 ].
  • 92. Corresponding to λ = 3, let X3 = [ x1 ; x2 ; x3 ] be the eigen vector; then (A − 3I)X3 = 0 ⇒ [ -2 -1 2 ; 0 -1 -1 ; 0 0 0 ] [ x1 ; x2 ; x3 ] = 0 ⇒ -2x1 − x2 + 2x3 = 0, -x2 − x3 = 0 ⇒ x2 = -x3 and x1 = (3/2)x3. Taking x3 = 2k3, X3 = k3[ 3 ; -2 ; 2 ].
  • 93. Hence the modal matrix is M = [ 1 1 3 ; 0 -1 -2 ; 0 0 2 ], |M| = -2, adj M = [ -2 -2 1 ; 0 2 2 ; 0 0 -1 ], and M⁻¹ = adj M / |M| = [ 1 1 -1/2 ; 0 -1 -1 ; 0 0 1/2 ].
  • 94. Now, M⁻¹AM = [ 1 1 -1/2 ; 0 -1 -1 ; 0 0 1/2 ] [ 1 -1 2 ; 0 2 -1 ; 0 0 3 ] [ 1 1 3 ; 0 -1 -2 ; 0 0 2 ] = [ 1 0 0 ; 0 2 0 ; 0 0 3 ] = D. Since D = M⁻¹AM ⇒ A = MDM⁻¹ and A² = (MDM⁻¹)(MDM⁻¹) = MD²M⁻¹ [since M⁻¹M = I], and in general A³ = MD³M⁻¹.
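The point of the similarity reduction is that powers of A only require powers of the diagonal entries; the sketch below (assumes NumPy, with A and M as reconstructed above) computes A³ = MD³M⁻¹ and compares it with direct matrix multiplication.

```python
import numpy as np

A = np.array([[1., -1., 2.],
              [0., 2., -1.],
              [0., 0., 3.]])
M = np.array([[1., 1., 3.],
              [0., -1., -2.],
              [0., 0., 2.]])      # columns are the eigen vectors X1, X2, X3
M_inv = np.linalg.inv(M)

D = M_inv @ A @ M
print(np.round(D, 10))            # diag(1, 2, 3)

A3 = M @ np.diag(np.diag(D) ** 3) @ M_inv
print(np.allclose(A3, np.linalg.matrix_power(A, 3)))   # True
```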
  • 96. 1.9 ORTHOGONAL TRANSFORMATION OF A SYMMETRIC MATRIX TO DIAGONAL FORM
A square matrix A with real elements is said to be orthogonal if AA' = I = A'A. Since AA⁻¹ = I = A⁻¹A, it follows that A is orthogonal if A' = A⁻¹. Diagonalisation by orthogonal transformation is possible only for a real symmetric matrix.
  • 97. If A is a real symmetric matrix, then the eigen vectors of A will be not only linearly independent but also pairwise orthogonal. If we normalise each eigen vector and use them to form the normalised modal matrix N, then it can be proved that N is an orthogonal matrix.
  • 98. The similarity transformation M⁻¹AM = D takes the form N'AN = D, since N⁻¹ = N' by a property of orthogonal matrices. Transforming A into D by means of the transformation N'AN = D is called orthogonal reduction or orthogonal transformation.
Note:- To normalise an eigen vector Xr, divide each element of Xr by the square root of the sum of the squares of all the elements of Xr.
  • 99. Example:- Diagonalise the matrix A = [ 2 0 4 ; 0 6 0 ; 4 0 2 ] by means of an orthogonal transformation.
Solution:- The characteristic equation of A is | 2−λ 0 4 ; 0 6−λ 0 ; 4 0 2−λ | = 0 ⇒ (2−λ)(6−λ)(2−λ) − 16(6−λ) = 0 ⇒ λ = -2, 6, 6.
  • 100. When λ = -2, let X1 = [ x1 ; x2 ; x3 ] be the eigen vector; then (A + 2I)X1 = 0 ⇒ [ 4 0 4 ; 0 8 0 ; 4 0 4 ] [ x1 ; x2 ; x3 ] = 0 ⇒ 4x1 + 4x3 = 0 ...(1), 8x2 = 0 ...(2), 4x1 + 4x3 = 0 ...(3) ⇒ x1 = k1, x2 = 0, x3 = -k1. Therefore X1 = k1[ 1 ; 0 ; -1 ].
  • 101. When λ = 6, let X2 = [ x1 ; x2 ; x3 ] be the eigen vector; then (A − 6I)X2 = 0 ⇒ [ -4 0 4 ; 0 0 0 ; 4 0 -4 ] [ x1 ; x2 ; x3 ] = 0 ⇒ -4x1 + 4x3 = 0, 4x1 − 4x3 = 0 ⇒ x1 = x3 and x2 is arbitrary. X2 and X3 must be so chosen that they are orthogonal among themselves and also each is orthogonal with X1.
  • 102. Let X2 = [ 1 ; 0 ; 1 ] and let X3 = [ α ; β ; γ ]. Since X3 is orthogonal to X1, α − γ = 0 ...(4); since X3 is orthogonal to X2, α + γ = 0 ...(5). Solving (4) and (5), we get α = γ = 0 and β is arbitrary. Taking β = 1, X3 = [ 0 ; 1 ; 0 ]. Hence the modal matrix is M = [ 1 1 0 ; 0 0 1 ; -1 1 0 ].
  • 103. The normalised modal matrix is N = [ 1/√2 1/√2 0 ; 0 0 1 ; -1/√2 1/√2 0 ] and D = N'AN = [ -2 0 0 ; 0 6 0 ; 0 0 6 ], which is the required diagonal matrix.
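A numerical check of the orthogonal reduction above (a sketch assuming NumPy): build the normalised modal matrix N and confirm that N'N = I and N'AN = diag(-2, 6, 6).

```python
import numpy as np

A = np.array([[2., 0., 4.],
              [0., 6., 0.],
              [4., 0., 2.]])
r = 1 / np.sqrt(2)
N = np.array([[r, r, 0.],
              [0., 0., 1.],
              [-r, r, 0.]])       # columns: normalised X1, X2, X3

print(np.allclose(N.T @ N, np.eye(3)))    # True, so N is orthogonal
print(np.round(N.T @ A @ N, 10))          # diag(-2, 6, 6)
```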
  • 104. 1.10 QUADRATIC FORMS
DEFINITION:- A homogeneous polynomial of second degree in any number of variables is called a quadratic form. For example, ax² + 2hxy + by²; ax² + by² + cz² + 2hxy + 2gyz + 2fzx; and ax² + by² + cz² + dw² + 2hxy + 2gyz + 2fzx + 2lxw + 2myw + 2nzw are quadratic forms in two, three and four variables.
  • 105. In n variables x1, x2, ..., xn, the general quadratic form is Σi Σj bij xi xj, where the coefficients bij need not be symmetric. In the expansion, the coefficient of xi xj is (bij + bji). Suppose 2aij = bij + bji, where aij = aji and aii = bii; then Σi Σj bij xi xj = Σi Σj aij xi xj, where aij = (1/2)(bij + bji).
  • 106. Hence every quadratic form can be written as Σi Σj aij xi xj = X'AX, so that the matrix A is always symmetric, where A = [aij] and X = [x1, x2, ..., xn]'. Now writing the above examples of quadratic forms in matrix form, we get
(i) ax² + 2hxy + by² = [x y] [ a h ; h b ] [ x ; y ]
  • 108. 1.11 NATURE OF A QUADRATIC FORM
A real quadratic form X'AX in n variables is said to be
(i) positive definite if all the eigen values of A are > 0;
(ii) negative definite if all the eigen values of A are < 0;
(iii) positive semidefinite if all the eigen values of A are ≥ 0 and at least one eigen value = 0;
(iv) negative semidefinite if all the eigen values of A are ≤ 0 and at least one eigen value = 0;
(v) indefinite if some of the eigen values of A are positive and others negative.
  • 109. Example:- Find the nature of the following quadratic forms:
(i) x² + 5y² + z² + 2xy + 2yz + 6zx
(ii) 3x² + 5y² + 3z² − 2yz + 2zx − 2xy
Solution:- (i) The matrix of the quadratic form is A = [ 1 1 3 ; 1 5 1 ; 3 1 1 ].
  • 110. The eigen values of A are -2, 3, 6. Two of these eigen values being positive and one being negative, the given quadratic form is indefinite.
(ii) The matrix of the quadratic form is A = [ 3 -1 1 ; -1 5 -1 ; 1 -1 3 ]. The eigen values of A are 2, 3, 6. All these eigen values being positive, the given quadratic form is positive definite.
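The eigen-value test above is easy to automate; the following small helper (an assumption, not from the slides) classifies a quadratic form X'AX from the signs of the eigen values of its symmetric matrix A and reproduces the two answers of this example.

```python
import numpy as np

def nature(A, tol=1e-9):
    """Classify the quadratic form X'AX from the eigen values of symmetric A."""
    lam = np.linalg.eigvalsh(A)
    if np.all(lam > tol):   return "positive definite"
    if np.all(lam < -tol):  return "negative definite"
    if np.all(lam > -tol):  return "positive semidefinite"
    if np.all(lam < tol):   return "negative semidefinite"
    return "indefinite"

A1 = np.array([[1., 1., 3.], [1., 5., 1.], [3., 1., 1.]])
A2 = np.array([[3., -1., 1.], [-1., 5., -1.], [1., -1., 3.]])
print(nature(A1), nature(A2))    # indefinite, positive definite
```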
  • 111. 1.12 REDUCTION OF QUADRATIC FORM TO CANONICAL FORM
A homogeneous expression of the second degree in any number of variables is called a quadratic form. For instance, if A = [ a h g ; h b f ; g f c ], X = [ x ; y ; z ] and X' = [x y z], then X'AX = ax² + by² + cz² + 2fyz + 2gzx + 2hxy ...(i), which is a quadratic form.
  • 112. Let λ1, λ2, λ3 be the eigen values of the matrix A and X1 = [ x1 ; y1 ; z1 ], X2 = [ x2 ; y2 ; z2 ], X3 = [ x3 ; y3 ; z3 ] be its corresponding eigen vectors in the normalised form (i.e., each element is divided by the square root of the sum of the squares of all three elements in the eigen vector).
  • 113. Then B⁻¹AB = D, a diagonal matrix, where B is the matrix whose columns are the normalised eigen vectors X1, X2, X3. Hence the quadratic form (i) is reduced to a sum of squares (i.e., the canonical form) λ1x² + λ2y² + λ3z², and B is the matrix of transformation, which is an orthogonal matrix.
Note:- 1. Here some of the λi may be positive or negative or zero. 2. If ρ(A) = r, then the canonical form of X'AX will contain only r terms.
  • 114. 1.13 INDEX AND SIGNATURE OF THE QUADRATIC FORM
The number p of positive terms in the canonical form is called the index of the quadratic form. (The number of positive terms) − (the number of negative terms), i.e., p − (r − p) = 2p − r, is called the signature of the quadratic form, where ρ(A) = r.
  • 115. 1.14 LINEAR TRANSFORMATION OF A QUADRATIC FORM
Let X'AX be a quadratic form in n variables and let X = PY ...(1), where P is a non-singular matrix, be a non-singular transformation. From (1), X' = (PY)' = Y'P', and hence X'AX = Y'P'APY = Y'(P'AP)Y = Y'BY ...(2), where B = P'AP.
  • 116. Therefore Y'BY is also a quadratic form in n variables; it is the linear transformation of the quadratic form X'AX under the linear transformation X = PY, with B = P'AP.
Note. (i) Here B' = (P'AP)' = P'A'P = P'AP = B, so B is also symmetric. (ii) ρ(B) = ρ(A). Therefore A and B are congruent matrices.
  • 117. Example:- Reduce 3x² + 3z² + 4xy + 8xz + 8yz into canonical form. (Or: Diagonalise the quadratic form 3x² + 3z² + 4xy + 8xz + 8yz by a linear transformation and write down the transformation. Or: Reduce the quadratic form 3x² + 3z² + 4xy + 8xz + 8yz into a sum of squares.)
  • 118. Solution:- The given quadratic form can be written as X'AX, where X = [x, y, z]' and the symmetric matrix A = [ 3 2 4 ; 2 0 4 ; 4 4 3 ]. Let us reduce A into a diagonal matrix. We know that A = I3 A I3, i.e.,
[ 3 2 4 ; 2 0 4 ; 4 4 3 ] = [ 1 0 0 ; 0 1 0 ; 0 0 1 ] A [ 1 0 0 ; 0 1 0 ; 0 0 1 ]
  • 119. Operating R21(-2/3), R31(-4/3) (on A on the L.H.S. and on the pre-factor on the R.H.S.), we get
[ 3 2 4 ; 0 -4/3 4/3 ; 0 4/3 -7/3 ] = [ 1 0 0 ; -2/3 1 0 ; -4/3 0 1 ] A [ 1 0 0 ; 0 1 0 ; 0 0 1 ]
Operating C21(-2/3), C31(-4/3) (on A on the L.H.S. and on the post-factor on the R.H.S.), we get
[ 3 0 0 ; 0 -4/3 4/3 ; 0 4/3 -7/3 ] = [ 1 0 0 ; -2/3 1 0 ; -4/3 0 1 ] A [ 1 -2/3 -4/3 ; 0 1 0 ; 0 0 1 ]
Operating R32(1) and C32(1) in the same way reduces the L.H.S. to diag(3, -4/3, -1) = P'AP, where the accumulated post-factor is P = [ 1 -2/3 -2 ; 0 1 1 ; 0 0 1 ].
  • 121. The canonical form of the given quadratic form is
Y'(P'AP)Y = [ y1 y2 y3 ] [ 3 0 0 ; 0 -4/3 0 ; 0 0 -1 ] [ y1 ; y2 ; y3 ] = 3y1² − (4/3)y2² − y3².
Here ρ(A) = 3, index = 1, signature = 1 − 2 = -1.
Note:- In this problem the non-singular transformation which reduces the given quadratic form into the canonical form is X = PY, i.e., [ x ; y ; z ] = [ 1 -2/3 -2 ; 0 1 1 ; 0 0 1 ] [ y1 ; y2 ; y3 ].
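A quick numerical confirmation of the congruent reduction above (a sketch assuming NumPy, with P as reconstructed from the elementary operations): P'AP should come out as diag(3, -4/3, -1).

```python
import numpy as np

A = np.array([[3., 2., 4.],
              [2., 0., 4.],
              [4., 4., 3.]])
P = np.array([[1., -2/3, -2.],
              [0., 1., 1.],
              [0., 0., 1.]])

print(np.round(P.T @ A @ P, 10))   # diag(3, -1.333..., -1)
```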
  • 122. 1.15 REDUCTION OF QUADRATIC FORM TO CANONICAL FORM BY ORTHOGONAL TRANSFORMATION
Let X'AX be a given quadratic form. The modal matrix B of A is the matrix whose columns are the characteristic vectors of A. If B represents the orthogonal (normalised modal) matrix of A,
  • 123. then X = BY will reduce X'AX to Y' diag(λ1, λ2, ..., λn) Y, where λ1, λ2, ..., λn are the characteristic values of A.
Note. This method works successfully if the characteristic vectors of A are linearly independent and pairwise orthogonal.
  • 124. Example 1:- Reduce 8x² + 7y² + 3z² − 12xy + 4xz − 8yz into canonical form by orthogonal reduction.
Solution:- The matrix of the quadratic form is A = [ 8 -6 2 ; -6 7 -4 ; 2 -4 3 ].
  • 125. The characteristic roots of A are given by |A − λI| = 0, i.e., | 8−λ -6 2 ; -6 7−λ -4 ; 2 -4 3−λ | = 0, or λ(λ − 3)(λ − 15) = 0, i.e., λ = 0, 3, 15.
  • 126. The characteristic vector for λ = 0 is given by [A − (0)I]X1 = 0, i.e., 8x1 − 6x2 + 2x3 = 0, -6x1 + 7x2 − 4x3 = 0, 2x1 − 4x2 + 3x3 = 0. Solving the first two, we get x1/1 = x2/2 = x3/2, giving the eigen vector X1 = k1(1, 2, 2)'.
  • 127. When λ = 3, the corresponding characteristic vector is given by [A − 3I]X2 = 0, i.e., 5x1 − 6x2 + 2x3 = 0, -6x1 + 4x2 − 4x3 = 0, 2x1 − 4x2 = 0. Solving any two equations, we get X2 = k2(2, 1, -2)'. Similarly, the characteristic vector corresponding to λ = 15 is X3 = k3(2, -2, 1)'.
  • 128. Now X1, X2, X3 are pairwise orthogonal, i.e., X1·X2 = X2·X3 = X3·X1 = 0. The normalised modal matrix (with columns in the order X2, X1, X3) is
B = [ 2/3 1/3 2/3 ; 1/3 2/3 -2/3 ; -2/3 2/3 1/3 ]
  • 129. Now B is an orthogonal matrix, i.e., B⁻¹ = B', and B⁻¹AB = B'AB = D = diag{3, 0, 15}, i.e.,
B'AB = [ 3 0 0 ; 0 0 0 ; 0 0 15 ]
  • 130. X'AX = Y'(B'AB)Y = Y'DY = [ y1 y2 y3 ] [ 3 0 0 ; 0 0 0 ; 0 0 15 ] [ y1 ; y2 ; y3 ] = 3y1² + 0·y2² + 15y3², which is the required canonical form.
Note. Here the orthogonal transformation is X = BY; rank of the quadratic form = 2; index = 2; signature = 2. It is positive semidefinite.
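The orthogonal reduction of Example 1 can be checked numerically; this sketch (assumes NumPy, with B as reconstructed above) verifies B'AB = diag(3, 0, 15) and that the original quadratic form and its canonical form agree at a sample point under X = BY.

```python
import numpy as np

A = np.array([[8., -6., 2.],
              [-6., 7., -4.],
              [2., -4., 3.]])
B = np.array([[2., 1., 2.],
              [1., 2., -2.],
              [-2., 2., 1.]]) / 3.0    # columns: eigen vectors for 3, 0, 15

print(np.round(B.T @ A @ B, 10))       # diag(3, 0, 15)

# 8x^2 + 7y^2 + 3z^2 - 12xy + 4xz - 8yz becomes 3*y1^2 + 15*y3^2 under X = BY.
y = np.array([0.3, -1.2, 0.7])
x = B @ y
print(np.isclose(x @ A @ x, 3*y[0]**2 + 15*y[2]**2))   # True
```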
  • 131. TEST YOUR KNOWLEDGE
1. If A = [ 1 -2 3 ; 0 3 2 ; 0 0 -2 ], find the eigen values of (3A − I)².
2. Write the matrix of the quadratic form 2x1x2 + 2x1x3 − 2x2x3.
3. Obtain the characteristic equation of the matrix whose eigen values are 1, -2 and 3.
4. If A = [ 1 6 0 ; 0 2 3 ; 0 0 -2 ], find the eigen values of 3A³ + 5A² − 6A + 2I.
  • 132. 5. Write the quadratic form corresponding to the symmetric matrix [ 0 1 2 ; 1 1 -4 ; 2 -4 3 ].
6. Find the sum of the eigen values of the inverse of [ 1 0 0 ; 2 -3 0 ; 0 5 2 ].
7. Obtain the latent roots of A⁴ where A = [ 5 4 ; 1 2 ].
8. If A is an idempotent matrix, then A² = A. What will be the eigen values of A?
  • 133. 9. If λ1, λ2, λ3 are the eigen values of the matrix A whose characteristic equation is λ³ + λ² − 21λ − 45 = 0, obtain λ1λ2 + λ2λ3 + λ3λ1 using the properties of eigen values.
10. (i) Using Cayley-Hamilton theorem, find the inverse of the matrix A = [ 1 0 3 ; 2 1 -1 ; 1 -1 1 ].
(ii) Find the characteristic roots and characteristic vectors of the matrix A = [ 1 1 3 ; 1 5 1 ; 3 1 1 ].
  • 134. 11. Reduce the quadratic form x² + 5y² + z² + 2xy + 2yz + 6xz to canonical form by an orthogonal transformation. Also specify the matrix of transformation and obtain its index, signature and the nature of the quadratic form.
12. (i) Find the eigen values and eigen vectors of the matrix [ 11 -4 -7 ; 7 -2 -5 ; 10 -4 -6 ].
(ii) Using Cayley-Hamilton theorem, find the inverse of [ 1 0 3 ; 2 1 -1 ; 1 -1 1 ].
  • 135. 13. Discuss the nature, index and signature of the quadratic form 10x² + 2y² + 5z² + 6yz − 10zx − 4xy.
14. Diagonalise the matrix A = [ 8 -6 2 ; -6 7 -4 ; 2 -4 3 ] by orthogonal reduction and provide the normalised modal matrix.
15. Reduce the quadratic form 2x1x2 + 2x1x3 − 2x2x3 to canonical form by an orthogonal transformation.
  • 136. 16. (i) Find the eigen values and eigen vectors of the matrix [ 2 1 0 ; 0 2 1 ; 0 0 2 ].
(ii) For A = [ 1 0 3 ; 2 1 -1 ; 1 -1 1 ], compute the value of A⁶ − 5A⁵ + 8A⁴ − 2A³ − 9A² + 31A − 36I using Cayley-Hamilton theorem.
17. Reduce the quadratic form 3x1² + 5x2² + 3x3² − 2x2x3 + 2x3x1 − 2x1x2 to a canonical form by orthogonal reduction.