Chapter 6: Orthogonality
6.1 INNER PRODUCT, LENGTH, AND ORTHOGONALITY
INNER PRODUCT
• If 𝐮 and 𝐯 are vectors in Rⁿ, we regard 𝐮 and 𝐯 as n×1 matrices.
• The number 𝐮ᵀ𝐯 is called the inner product of 𝐮 and 𝐯, and it is written 𝐮 ∙ 𝐯.
• It is also known as the dot product.
• The result is a scalar.
INNER PRODUCT
If 𝐮 = (u₁, u₂, …, uₙ) and 𝐯 = (v₁, v₂, …, vₙ), then the inner product of 𝐮 and 𝐯 is
𝐮 ∙ 𝐯 = 𝐮ᵀ𝐯 = u₁v₁ + u₂v₂ + ⋯ + uₙvₙ
Inner Product - Example
• Find 𝐮 ∙ 𝐯 and 𝐯 ∙ 𝐮 for:
𝐮 = (2, −5, −1), 𝐯 = (3, 2, 3)
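A quick numerical check of this example (a minimal sketch using NumPy; not part of the original slides):

```python
import numpy as np

u = np.array([2, -5, -1])
v = np.array([3, 2, 3])

# u . v = 2*3 + (-5)*2 + (-1)*3 = -7
print(np.dot(u, v))   # -7
print(np.dot(v, u))   # -7, since the inner product is symmetric (Theorem 1a)
```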
INNER PRODUCT - Properties
Theorem 1: Let 𝐮, 𝐯, and 𝐰 be vectors in Rⁿ, and let c be a scalar. Then
a. 𝐮 ∙ 𝐯 = 𝐯 ∙ 𝐮
b. (𝐮 + 𝐯) ∙ 𝐰 = 𝐮 ∙ 𝐰 + 𝐯 ∙ 𝐰
c. (c𝐮) ∙ 𝐯 = c(𝐮 ∙ 𝐯) = 𝐮 ∙ (c𝐯)
d. 𝐮 ∙ 𝐮 ≥ 0, and 𝐮 ∙ 𝐮 = 0 if and only if 𝐮 = 𝟎
• Properties (b) and (c) can be combined repeatedly to produce the following useful rule:
(c₁𝐮₁ + c₂𝐮₂ + ⋯ + cₚ𝐮ₚ) ∙ 𝐯 = c₁(𝐮₁ ∙ 𝐯) + ⋯ + cₚ(𝐮ₚ ∙ 𝐯)
THE LENGTH OF A VECTOR
Definition: The length (or norm) of 𝐯 is
‖𝐯‖ = √(𝐯 ∙ 𝐯) = √(v₁² + v₂² + ⋯ + vₙ²)
• ‖c𝐯‖ = |c| ‖𝐯‖
• A unit vector is a vector of length 1.
• Dividing any nonzero vector by its length gives a unit vector in the same direction; this is called normalizing.
Normalization – Example 1
Example 1: Let v = (1, -2, 2, 0). Find a unit vector
u in the same direction as v.
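A numerical version of Example 1 (a sketch assuming NumPy; the unit vector is v/‖v‖):

```python
import numpy as np

v = np.array([1., -2., 2., 0.])
u = v / np.linalg.norm(v)   # normalize: divide v by its length, here ||v|| = 3

print(u)                    # approximately (1/3, -2/3, 2/3, 0)
print(np.linalg.norm(u))    # 1.0 -> u is a unit vector in the direction of v
```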
Normalization – Example 2
Example 2: Let W be the subspace of R² spanned by 𝐱 = (2/3, 1). Find a unit vector 𝐳 that is a basis for W.
DISTANCE IN Rⁿ
Definition: For 𝐮 and 𝐯 in Rⁿ, the distance between 𝐮 and 𝐯, written dist(𝐮, 𝐯), is the length of the vector 𝐮 − 𝐯; that is,
dist(𝐮, 𝐯) = ‖𝐮 − 𝐯‖
Example: Find the distance between 𝐮 = (7, 1) and 𝐯 = (3, 2).
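A numerical check of the distance example (a sketch assuming NumPy):

```python
import numpy as np

u = np.array([7., 1.])
v = np.array([3., 2.])

# dist(u, v) = ||u - v|| = ||(4, -1)|| = sqrt(17)
print(np.linalg.norm(u - v))   # 4.1231... = sqrt(17)
```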
Orthogonal Vectors
Definition: Two vectors 𝐮 and 𝐯 are orthogonal if 𝐮 ∙ 𝐯 = 0.
Theorem 2 (Pythagorean Theorem): Two vectors 𝐮 and 𝐯 are orthogonal if and only if
‖𝐮 + 𝐯‖² = ‖𝐮‖² + ‖𝐯‖²
Orthogonal Complement
• If a vector 𝐳 is orthogonal to every vector in a subspace W of Rⁿ, then 𝐳 is said to be orthogonal to W.
• The set of all vectors 𝐳 that are orthogonal to W is called the orthogonal complement of W and is denoted by W⊥ (read as “W perpendicular” or simply “W perp”).
ORTHOGONAL COMPLEMENTS
Theorem 3: Let A be an m×n matrix. The orthogonal complement of the row space of A is the null space of A, and the orthogonal complement of the column space of A is the null space of Aᵀ:
(Row A)⊥ = Nul A and (Col A)⊥ = Nul Aᵀ
Proof sketch: Let 𝐚ᵢ be the ith row of A, so Row A = Span{rows of A} and Nul A = {𝐱 : A𝐱 = 0}.
A𝐱 = 0 exactly when 𝐚ᵢ ∙ 𝐱 = 0 for every i,
i.e., exactly when 𝐱 is orthogonal to each row of A, and hence to every vector in Row A.
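A small numerical illustration of Theorem 3 (a sketch assuming NumPy; the matrix A and the null-space vector x below are made up just for this check):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
x = np.array([1., -2., 1.])    # chosen so that A @ x = 0, i.e. x is in Nul A

print(A @ x)                   # [0. 0.]  -> x is in the null space of A
print(A[0] @ x, A[1] @ x)      # 0.0 0.0  -> x is orthogonal to every row of A
```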
ANGLES & Inner Products
𝐮 ∙ 𝐯 = ‖𝐮‖ ‖𝐯‖ cos ϑ, where ϑ is the angle between 𝐮 and 𝐯
ANGLES & Inner Products - Example
Find the angle between (1, 1, 0) and (0, 1, 1).
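A numerical version of this angle computation (a sketch assuming NumPy):

```python
import numpy as np

u = np.array([1., 1., 0.])
v = np.array([0., 1., 1.])

cos_theta = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cos_theta)                         # 0.5
print(np.degrees(np.arccos(cos_theta)))  # 60.0 degrees
```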
6.2 ORTHOGONAL SETS
ORTHOGONAL SETS
• A set of vectors {𝐮₁, …, 𝐮ₚ} in Rⁿ is said to be an orthogonal set if each pair of distinct vectors from the set is orthogonal, i.e., if 𝐮ᵢ ∙ 𝐮ⱼ = 0 whenever i ≠ j.
Orthogonal Sets - Example
Show that S = {𝐮₁, 𝐮₂, 𝐮₃} is an orthogonal set, where
𝐮₁ = (3, 1, 1), 𝐮₂ = (−1, 2, 1), 𝐮₃ = (−1/2, −2, 7/2)
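A numerical check that S is an orthogonal set (a sketch assuming NumPy):

```python
import numpy as np

u1 = np.array([3., 1., 1.])
u2 = np.array([-1., 2., 1.])
u3 = np.array([-0.5, -2., 3.5])

# Every pair of distinct vectors must have inner product 0
print(u1 @ u2, u1 @ u3, u2 @ u3)   # 0.0 0.0 0.0 -> S is an orthogonal set
```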
Theorem 4 – Orthogonal Set as a Basis
Theorem 4: If S = {𝐮₁, …, 𝐮ₚ} is an orthogonal set of nonzero vectors in Rⁿ, then S is linearly independent and hence is a basis for the subspace spanned by S.
ORTHOGONAL BASIS
Definition: An orthogonal basis for a subspace W of Rⁿ is a basis for W that is also an orthogonal set.
Theorem 5: Let {𝐮₁, …, 𝐮ₚ} be an orthogonal basis for a subspace W of Rⁿ. For each 𝐲 in W, the weights in the linear combination
𝐲 = c₁𝐮₁ + ⋯ + cₚ𝐮ₚ
are given by
cⱼ = (𝐲 ∙ 𝐮ⱼ) / (𝐮ⱼ ∙ 𝐮ⱼ),  j = 1, …, p
Orthogonal Basis – Example 2
Same S as in the orthogonal-set example above:
𝐮₁ = (3, 1, 1), 𝐮₂ = (−1, 2, 1), 𝐮₃ = (−1/2, −2, 7/2)
Write 𝐲 = (6, 1, −8) as a linear combination of the vectors in S.
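The weights from Theorem 5, checked numerically (a sketch assuming NumPy):

```python
import numpy as np

u1 = np.array([3., 1., 1.])
u2 = np.array([-1., 2., 1.])
u3 = np.array([-0.5, -2., 3.5])
y  = np.array([6., 1., -8.])

# c_j = (y . u_j) / (u_j . u_j)
c = [(y @ u) / (u @ u) for u in (u1, u2, u3)]

print(c)                            # [1.0, -2.0, -2.0]
print(c[0]*u1 + c[1]*u2 + c[2]*u3)  # [ 6.  1. -8.] -> y = u1 - 2*u2 - 2*u3
```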
AN ORTHOGONAL PROJECTION
Decompose 𝐲 into two vectors:
• one in the direction of 𝐮: ŷ = α𝐮
• one orthogonal to 𝐮: 𝐳 = 𝐲 − ŷ
Orthogonal Projections
• ŷ is the orthogonal projection of 𝐲 onto 𝐮.
• If L is the subspace spanned by 𝐮, then ŷ is the orthogonal projection of 𝐲 onto L:
ŷ = proj_L 𝐲 = (𝐲 ∙ 𝐮 / 𝐮 ∙ 𝐮) 𝐮
ORTHOGONAL PROJECTION - Example
Example 1: 𝐲 = (7, 6), 𝐮 = (4, 2)
a) Find the orthogonal projection of y onto u.
b) Write y as the sum of two orthogonal vectors,
one in Span {u} and one orthogonal to u.
ORTHOGONAL PROJECTION - Example (continued)
ŷ = (𝐲 ∙ 𝐮 / 𝐮 ∙ 𝐮) 𝐮 = (40/20) 𝐮 = (8, 4) and 𝐲 − ŷ = (−1, 2), so
(7, 6) = (8, 4) + (−1, 2)
with ŷ in Span{𝐮} and 𝐲 − ŷ orthogonal to 𝐮.
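The same projection computed numerically (a sketch assuming NumPy):

```python
import numpy as np

y = np.array([7., 6.])
u = np.array([4., 2.])

y_hat = (y @ u) / (u @ u) * u   # orthogonal projection of y onto u
z = y - y_hat                   # component of y orthogonal to u

print(y_hat)      # [8. 4.]
print(z)          # [-1.  2.]
print(y_hat @ z)  # 0.0 -> the two pieces are orthogonal
```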
AN ORTHOGONAL PROJECTION
• Check: is ŷ ∙ 𝐳 = 0?
• What is the distance from 𝐲 to L?
• What is the distance from 𝐲 to 𝐮?
ORTHONORMAL SETS
• A set {𝐮₁, …, 𝐮ₚ} is an orthonormal set if it is an orthogonal set of unit vectors (each ‖𝐮ᵢ‖ = 1).
• If W is the subspace spanned by such a set, then {𝐮₁, …, 𝐮ₚ} is an orthonormal basis for W, since the set is automatically linearly independent, by Theorem 4.
• The simplest example of an orthonormal set is the standard basis {e₁, …, eₙ} for Rⁿ.
ORTHONORMAL SETS - Example
Example 2: Show that {𝐯₁, 𝐯₂, 𝐯₃} is an orthonormal basis of R³, where
𝐯₁ = (3/√11, 1/√11, 1/√11), 𝐯₂ = (−1/√6, 2/√6, 1/√6), 𝐯₃ = (−1/√66, −4/√66, 7/√66)
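A numerical check that these three vectors are orthonormal (a sketch assuming NumPy); it also previews Theorem 6 below, since orthonormal columns mean VᵀV = I:

```python
import numpy as np

s11, s6, s66 = np.sqrt(11), np.sqrt(6), np.sqrt(66)
V = np.column_stack([
    [3/s11,  1/s11,  1/s11],    # v1
    [-1/s6,  2/s6,   1/s6],     # v2
    [-1/s66, -4/s66, 7/s66],    # v3
])

print(np.round(V.T @ V, 12))   # the 3x3 identity -> the columns are orthonormal
```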
ORTHONORMAL SETS
• When the vectors in an orthogonal set of nonzero vectors are normalized to have unit length, the new vectors are still orthogonal, and hence the new set is an orthonormal set.
ORTHONORMAL SETS
Theorem 6: An m×n matrix U has orthonormal columns if and only if UᵀU = I.
ORTHONORMAL SETS
Theorem 7: Let U be an m×n matrix with orthonormal columns, and let 𝐱 and 𝐲 be in Rⁿ. Then
a. ‖U𝐱‖ = ‖𝐱‖
b. (U𝐱) ∙ (U𝐲) = 𝐱 ∙ 𝐲
c. (U𝐱) ∙ (U𝐲) = 0 if and only if 𝐱 ∙ 𝐲 = 0
• Properties (a) and (c) say that the linear mapping 𝐱 ↦ U𝐱 preserves lengths and orthogonality.
Orthonormal Sets - Example
Let
U = [ 1/√2   2/3
      1/√2  −2/3
      0      1/3 ],   𝐱 = (√2, 3)
Verify that ‖U𝐱‖ = ‖𝐱‖.
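A numerical check of this example against Theorems 6 and 7 (a sketch assuming NumPy, using the matrix as reconstructed above):

```python
import numpy as np

U = np.array([[1/np.sqrt(2),  2/3],
              [1/np.sqrt(2), -2/3],
              [0.,            1/3]])
x = np.array([np.sqrt(2), 3.])

print(np.round(U.T @ U, 12))                     # 2x2 identity -> orthonormal columns
print(np.linalg.norm(U @ x), np.linalg.norm(x))  # both sqrt(11) -> ||Ux|| = ||x||
```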
Orthogonal Matrix
• If U is square and U⁻¹ = Uᵀ, then U is called an "orthogonal matrix".
• A more accurate name would be "orthonormal matrix":
  • its columns are orthonormal,
  • and its rows are orthonormal as well.
6.3 ORTHOGONAL PROJECTIONS
ORTHOGONAL PROJECTIONS
• Extend the ideas of the previous section from R² to Rⁿ.
• Given a vector 𝐲 and a subspace W of Rⁿ, there is a vector ŷ in W such that:
  • ŷ is the unique vector in W for which 𝐲 − ŷ is orthogonal to W,
  • ŷ is the unique vector in W closest to 𝐲.
THE ORTHOGONAL DECOMPOSITION THEOREM
Theorem 8: Let W be a subspace of Rⁿ. Then each 𝐲 in Rⁿ can be written uniquely in the form
𝐲 = ŷ + 𝐳
where ŷ is in W and 𝐳 is in W⊥.
If {𝐮₁, …, 𝐮ₚ} is any orthogonal basis of W, then
ŷ = (𝐲 ∙ 𝐮₁ / 𝐮₁ ∙ 𝐮₁) 𝐮₁ + ⋯ + (𝐲 ∙ 𝐮ₚ / 𝐮ₚ ∙ 𝐮ₚ) 𝐮ₚ
and
𝐳 = 𝐲 − ŷ
THE ORTHOGONAL DECOMPOSITION THEOREM
• The vector ŷ is called the orthogonal projection of 𝐲 onto W and is often written proj_W 𝐲.
THE ORTHOGONAL DECOMPOSITION THEOREM - Example
Example 1: Write 𝐲 as the sum of a vector in W = Span{𝐮₁, 𝐮₂} and a vector orthogonal to W, where
𝐮₁ = (2, 5, −1), 𝐮₂ = (−2, 1, 1), 𝐲 = (1, 2, 3)
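The decomposition from Theorem 8, computed numerically (a sketch assuming NumPy; u1 and u2 form an orthogonal basis of W):

```python
import numpy as np

u1 = np.array([2., 5., -1.])
u2 = np.array([-2., 1., 1.])
y  = np.array([1., 2., 3.])

# Orthogonal projection of y onto W = Span{u1, u2}
y_hat = (y @ u1)/(u1 @ u1) * u1 + (y @ u2)/(u2 @ u2) * u2
z = y - y_hat

print(y_hat)            # [-0.4  2.   0.2]
print(z)                # [ 1.4  0.   2.8]
print(z @ u1, z @ u2)   # 0.0 0.0 -> z is orthogonal to W
```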
PROPERTIES OF ORTHOGONAL PROJECTIONS
• If {𝐮₁, …, 𝐮ₚ} is an orthogonal basis for W and if 𝐲 happens to be in W, then the formula for proj_W 𝐲 is exactly the same as the representation of 𝐲 given in Theorem 5 in Section 6.2.
• If 𝐲 is in W = Span{𝐮₁, …, 𝐮ₚ}, then proj_W 𝐲 = 𝐲.
THE BEST APPROXIMATION THEOREM
Theorem 9: Let W be a subspace of Rⁿ, let 𝐲 be any vector in Rⁿ, and let ŷ be the orthogonal projection of 𝐲 onto W. Then ŷ is the closest point in W to 𝐲, in the sense that
‖𝐲 − ŷ‖ < ‖𝐲 − 𝐯‖
for all 𝐯 in W distinct from ŷ.
The Best Approximation Theorem
• The vector ŷ in Theorem 9 is called the best approximation to 𝐲 by elements of W.
• The distance from 𝐲 to 𝐯, given by ‖𝐲 − 𝐯‖, can be regarded as the "error" of using 𝐯 in place of 𝐲.
• Theorem 9 says that this error is minimized when 𝐯 = ŷ.
• Note: ŷ does not depend on which orthogonal basis of W is used to compute it.
Best Approximation - Example
Example 2: Find the distance from 𝐲 to W = Span{𝐮₁, 𝐮₂}, where
𝐮₁ = (5, −2, 1), 𝐮₂ = (1, 2, −1), 𝐲 = (−1, −5, 10)
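A numerical version of this distance computation (a sketch assuming NumPy; the distance from 𝐲 to W is ‖𝐲 − proj_W 𝐲‖):

```python
import numpy as np

u1 = np.array([5., -2., 1.])
u2 = np.array([1., 2., -1.])
y  = np.array([-1., -5., 10.])

y_hat = (y @ u1)/(u1 @ u1) * u1 + (y @ u2)/(u2 @ u2) * u2   # proj_W y
dist = np.linalg.norm(y - y_hat)

print(y_hat)   # [-1. -8.  4.]
print(dist)    # 6.7082... = 3*sqrt(5)
```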