Review for Midterm I

                          Math 20


                      October 16, 2007



Announcements
   Midterm I 10/18, Hall A 7–8:30pm
   ML Office Hours Wednesday 1–3 (SC 323)
   SS review session Wednesday 8–9pm (location TBA)
   Old exams and solutions on website
Outline

   Vector and Matrix Algebra
      Vectors
      Algebra of vectors
      Scalar Product
      Matrices
      Addition and scalar multiplication of matrices
      Matrix Multiplication
      The Transpose
   Geometry
      Lines
      Planes
   Determinants
      Determinants by sudoku patterns
      Determinants by Cofactors
   Systems of Linear Equations
   Inversion
      Determinants and the inverse
      Inverses of 2 × 2 matrices
      Computing inverses with the adjoint matrix
      Computing inverses with row reduction
      Properties of the inverse
Vector and Matrix Algebra
Learning Objectives




          Add, scale, and compute linear combinations of vectors
          Compute the scalar product of two vectors
          Add and scale matrices
          Multiply matrices
    all of these with the caveat of “if possible.”
Vectors



   Definition
   An n-vector (or simply vector) is an ordered list of n numbers.
   We can write them in a row or in a column.

   $$a = \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} \qquad b = (1, 0, -1)$$

   In linear algebra we mostly work with column vectors.
Algebra of vectors

   Definition
   Addition of vectors is defined componentwise, as is scalar
   multiplication:

   $$\begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix}
   + \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix}
   = \begin{pmatrix} a_1 + b_1 \\ a_2 + b_2 \\ \vdots \\ a_n + b_n \end{pmatrix}
   \qquad
   t\begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix}
   = \begin{pmatrix} ta_1 \\ ta_2 \\ \vdots \\ ta_n \end{pmatrix}$$
Dot product



   Definition
   Given two vectors of the same dimension (size), their scalar
   product (or “dot product”) is the sum of the products of the
   corresponding components of the vectors:

   $$\begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix}
   \cdot \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix}
   = a_1 b_1 + a_2 b_2 + \cdots + a_n b_n = \sum_{i=1}^{n} a_i b_i$$
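These componentwise operations translate directly into code. A minimal sketch in plain Python (the function names are my own, not from the course):

```python
def vec_add(a, b):
    """Componentwise sum of two vectors of the same dimension."""
    assert len(a) == len(b), "vectors must have the same dimension"
    return [x + y for x, y in zip(a, b)]

def vec_scale(t, a):
    """Scalar multiple t*a, componentwise."""
    return [t * x for x in a]

def dot(a, b):
    """Scalar (dot) product: sum of products of corresponding components."""
    assert len(a) == len(b), "vectors must have the same dimension"
    return sum(x * y for x, y in zip(a, b))

print(vec_add([1, 2, 3], [4, 5, 6]))   # [5, 7, 9]
print(vec_scale(2, [1, 0, -1]))        # [2, 0, -2]
print(dot([1, 2, 3], [4, 5, 6]))       # 32
```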
Matrices


   Definition
   An m × n matrix is a rectangular array of mn numbers arranged in
   m horizontal rows and n vertical columns.

   $$A = \begin{pmatrix}
   a_{11} & a_{12} & \cdots & a_{1j} & \cdots & a_{1n} \\
   a_{21} & a_{22} & \cdots & a_{2j} & \cdots & a_{2n} \\
   \vdots & \vdots & \ddots & \vdots &        & \vdots \\
   a_{i1} & a_{i2} & \cdots & a_{ij} & \cdots & a_{in} \\
   \vdots & \vdots &        & \vdots & \ddots & \vdots \\
   a_{m1} & a_{m2} & \cdots & a_{mj} & \cdots & a_{mn}
   \end{pmatrix}$$
Addition and scalar multiplication of matrices


   Matrices can be added and scaled just like vectors:
   Example

   $$\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}
   + \begin{pmatrix} 1 & -1 \\ 0 & 2 \end{pmatrix}
   = \begin{pmatrix} 2 & 1 \\ 3 & 6 \end{pmatrix}$$

   Example

   $$4 \begin{pmatrix} 1 & 1 \\ -1 & 2 \end{pmatrix}
   = \begin{pmatrix} 4 & 4 \\ -4 & 8 \end{pmatrix}$$
The matrix-vector product

   Definition
   Let $A = (a_{ij})$ be an m × n matrix and
   $v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix}$ an n-vector
   (column vector). The matrix-vector product of A and v is the
   vector $Av = \begin{pmatrix} w_1 \\ w_2 \\ \vdots \\ w_m \end{pmatrix}$, where

   $$w_k = a_{k1} v_1 + a_{k2} v_2 + \cdots + a_{kn} v_n = \sum_{j=1}^{n} a_{kj} v_j,$$

   the dot product of the kth row of A with v.
Example
Let
$$A = \begin{pmatrix} 2 & 3 \\ -1 & 4 \\ 0 & 3 \end{pmatrix}
\qquad v = \begin{pmatrix} 2 \\ -1 \end{pmatrix}$$

Find Av.

Solution
$$Av = \begin{pmatrix} 2 \cdot 2 + 3 \cdot (-1) \\ (-1) \cdot 2 + 4 \cdot (-1) \\ 0 \cdot 2 + 3 \cdot (-1) \end{pmatrix}
= \begin{pmatrix} 4 - 3 \\ -2 - 4 \\ 0 - 3 \end{pmatrix}
= \begin{pmatrix} 1 \\ -6 \\ -3 \end{pmatrix}.$$
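The computation above can be checked mechanically; each entry of Av is the dot product of a row of A with v. A small sketch (the function name is my own):

```python
def matvec(A, v):
    """Matrix-vector product: w_k = sum_j a_kj * v_j,
    the dot product of each row of A with v."""
    assert all(len(row) == len(v) for row in A), "A must be m x n, v length n"
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 3], [-1, 4], [0, 3]]
v = [2, -1]
print(matvec(A, v))  # [1, -6, -3], matching the example
```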
Matrix product, defined
   Definition
   Let A be an m × n matrix and B an n × p matrix. Then the matrix
   product of A and B is the m × p matrix whose jth column is $Ab_j$.
   In other words, the (i, j)th entry of AB is the dot product of the ith
   row of A and the jth column of B. In symbols,

   $$(AB)_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}.$$

   Example

   $$\begin{pmatrix}
   1.5 & 0.5 & 1 \\
   0 & 0.25 & 0 \\
   1.5 & 0.25 & 0 \\
   2 & 2 & 3 \\
   3 & 2 & 2
   \end{pmatrix}
   \begin{pmatrix}
   70 & 60 & 50 & 40 \\
   20 & 30 & 30 & 30 \\
   10 & 10 & 20 & 30
   \end{pmatrix}
   = \begin{pmatrix}
   125 & 115 & 110 & 105 \\
   5 & 7.5 & 7.5 & 7.5 \\
   110 & 97.5 & 82.5 & 67.5 \\
   210 & 210 & 220 & 230 \\
   270 & 260 & 250 & 240
   \end{pmatrix}$$
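The definition turns directly into nested comprehensions: entry (i, j) is the dot product of row i of A with column j of B. A sketch, not from the slides:

```python
def matmul(A, B):
    """Matrix product: (AB)_ij = sum_k a_ik * b_kj."""
    n = len(B)
    assert all(len(row) == n for row in A), "inner dimensions must agree"
    p = len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(len(A))]

A = [[1.5, 0.5, 1], [0, 0.25, 0], [1.5, 0.25, 0], [2, 2, 3], [3, 2, 2]]
B = [[70, 60, 50, 40], [20, 30, 30, 30], [10, 10, 20, 30]]
print(matmul(A, B))
```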
The Transpose
  There is another operation on matrices, which is just flipping rows
  and columns.
  Definition
  Let $A = (a_{ij})_{m \times n}$ be a matrix. The transpose of A is the matrix
  $A' = (a'_{ij})_{n \times m}$ whose (i, j)th entry is $a_{ji}$.

  Example
  Let $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix}$. Then

  $$A' = \begin{pmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{pmatrix}.$$

  Fact
  Given matrices A and B, of suitable dimensions,

  $$(AB)' = B'A'$$
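Both the transpose and the fact (AB)′ = B′A′ are easy to check numerically. A sketch using Python's `zip` to flip rows and columns (helper names are my own):

```python
def transpose(A):
    """Rows become columns: entry (i, j) of the result is entry (j, i) of A."""
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    """Matrix product via row-by-column dot products."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4], [5, 6]]
print(transpose(A))  # [[1, 3, 5], [2, 4, 6]]

# Check the fact (AB)' = B'A' on a small example.
B = [[1, 0, 2], [0, 1, 1]]
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
```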
Geometry
Learning Objectives




          Given a point and a line, decide if the point is on the line
          Given a point and a direction, or two points, find the equation
          for the line
          Given a point and a plane, decide if the point is on the plane
          Given a point and a normal vector, find the equation for the
          plane
Lines



   Definition
   The line L through (the head of) a and parallel to v is the set of
   all x such that
   $$x = a + tv$$
   for some real number t.
   The line L through (the heads of) a and b is

   $$x = a + t(b - a) = (1 - t)a + tb$$
Example
Find an equation for the line through (−2, 0, 4) parallel to
(2, 4, −2)

Solution
x = (−2, 0, 4) + t(2, 4, −2)
Example
Find an equation for the line through (−3, 2, −3) and (1, −1, 4)
Is (1, 2, 3) on this line?

Answer.
No! The direction is (1, −1, 4) − (−3, 2, −3) = (4, −3, 7), but
(1, 2, 3) − (−3, 2, −3) = (4, 0, 6) is not a scalar multiple of it.
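Such questions can be decided by machine: x lies on the line through a and b exactly when x − a is parallel to the direction b − a. A sketch using exact rational arithmetic (the helper name is hypothetical):

```python
from fractions import Fraction

def on_line(x, a, b):
    """True iff x lies on the line through a and b, i.e. x - a is a
    scalar multiple of the direction b - a (exact arithmetic)."""
    d = [Fraction(q - p) for p, q in zip(a, b)]   # direction vector b - a
    w = [Fraction(q - p) for p, q in zip(a, x)]   # displacement x - a
    n = len(d)
    # parallel iff every 2x2 determinant formed from the pair vanishes
    return all(d[i] * w[j] == d[j] * w[i]
               for i in range(n) for j in range(i + 1, n))

print(on_line((1, 2, 3), (-3, 2, -3), (1, -1, 4)))   # False: not on the line
print(on_line((1, -1, 4), (-3, 2, -3), (1, -1, 4)))  # True: b itself is on it
```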
Planes




   Definition
   A hyperplane through (the head of) a that is orthogonal to a
   vector p is the set of all (heads of) vectors x such that

                            p · (x − a) = 0
Example
Find an equation for the plane through (−3, 0, 7) perpendicular to
(5, 2, −1)

Solution
5x + 2y − z = −22

Question
Is (1, 2, 3) on this plane? Is there a point on the plane with z = 0?
Fact
If the equation for a plane is

                        Ax + By + Cz = D,

then p = (A, B, C ) is normal to the plane.
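The fact above gives a mechanical membership test: take D = p · a, and a point x lies on the plane exactly when Ax + By + Cz = D. A sketch (helper names are my own):

```python
def plane_equation(p, a):
    """From normal p and point a, return (A, B, C, D) with Ax + By + Cz = D,
    where D = p . a (from p . (x - a) = 0)."""
    A, B, C = p
    D = sum(pi * ai for pi, ai in zip(p, a))
    return A, B, C, D

def on_plane(x, coeffs):
    """Check whether the point x satisfies Ax + By + Cz = D."""
    A, B, C, D = coeffs
    return A * x[0] + B * x[1] + C * x[2] == D

coeffs = plane_equation((5, 2, -1), (-3, 0, 7))
print(coeffs)                         # (5, 2, -1, -22)
print(on_plane((1, 2, 3), coeffs))    # False
print(on_plane((0, -11, 0), coeffs))  # True: a point on the plane with z = 0
```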
Determinants
Learning Objectives




          Compute determinants of n × n matrices.
The determinant




   Definition
   The determinant of a 2 × 2 matrix
   $A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}$ is the number

   $$\begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix}
   = a_{11} a_{22} - a_{21} a_{12}$$
Definition
The determinant of a 3 × 3 matrix is

$$\begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix}
= a_{11} a_{22} a_{33} - a_{11} a_{23} a_{32} - a_{21} a_{12} a_{33}
+ a_{21} a_{13} a_{32} + a_{31} a_{12} a_{23} - a_{31} a_{22} a_{13}$$
Sarrus’s Rule

   Copy the first two columns again to the right of the determinant;
   add the products along the three downward diagonals and subtract
   the products along the three upward diagonals:

   $$\begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix}
   \begin{matrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\ a_{31} & a_{32} \end{matrix}
   = \begin{aligned} &a_{11} a_{22} a_{33} + a_{12} a_{23} a_{31} + a_{13} a_{21} a_{32} \\
   &- a_{13} a_{22} a_{31} - a_{11} a_{23} a_{32} - a_{12} a_{21} a_{33} \end{aligned}$$

   This trick does not work for any other determinants!
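Sarrus's rule is easy to transcribe, for the 3 × 3 case only. A sketch, not from the slides:

```python
def det3(M):
    """3x3 determinant by Sarrus's rule: three downward products minus
    three upward products.  Works ONLY for 3x3 matrices."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a*e*i + b*f*g + c*d*h - c*e*g - a*f*h - b*d*i

print(det3([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))   # 1
print(det3([[2, 8, 4], [2, 5, 1], [4, 10, -4]])) # 36
```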
Determinants by sudoku patterns

   Definition
   Let A = (aij )n×n be a matrix. The determinant of A is a sum of
   all products of n elements of the matrix, where each product takes
   exactly one entry from each row and column.
   The sign of each product is given by (−1)σ , where σ is the number
   of upwards lines used when all the entries in a pattern are
   connected.
4 × 4 sudoku patterns

   [Diagram: the sudoku patterns for a 4 × 4 matrix, each marked
   with its sign + or −.]
Determinants by Cofactors




   Definition
   Let A = (aij )n×n be a matrix. The (i, j)-minor of A is the matrix
   obtained from A by deleting the ith row and jth column. This matrix
   has dimensions (n − 1) × (n − 1).
   The (i, j) cofactor of A, written Cij , is the determinant of the
   (i, j) minor times (−1)i+j .

The signs here seem complicated, but they’re not. The number
(−1)i+j is 1 if i and j are both even or both odd, and −1
otherwise. They make a checkerboard pattern:

$$\begin{pmatrix} + & - & + & \cdots \\ - & + & - & \cdots \\ + & - & + & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$
Fact
The determinant of A = (aij )n×n is the sum

$$a_{11} C_{11} + a_{12} C_{12} + \cdots + a_{1n} C_{1n}$$

Fact
The determinant of A = (aij )n×n is the sum

$$a_{i1} C_{i1} + a_{i2} C_{i2} + \cdots + a_{in} C_{in}$$

for any i.

Fact
The determinant of A = (aij )n×n is the sum

$$a_{1j} C_{1j} + a_{2j} C_{2j} + \cdots + a_{nj} C_{nj}$$

for any j.
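Expanding along the first row gives a natural recursive program (fine for small n; the number of terms grows like n!). A sketch, not from the slides:

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # minor: delete row 1 and column j
        minor = [row[:j] + row[j+1:] for row in A[1:]]
        # (-1)**j is the checkerboard sign for the first row (0-indexed)
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                     # -2
print(det([[0, 2, -3], [-2, 1, 2], [2, 0, 1]]))  # 18
```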
Theorem (Rules for Determinants)
Let A be an n × n matrix.
 1. If a row or column of A is full of zeros, then |A| = 0.
 2. |A′ | = |A|
 3. If B is the matrix obtained by multiplying each entry of one
    row or column of A by the same number α, then |B| = α |A|.
 4. If two rows or columns of A are interchanged, then the
    determinant changes its sign but keeps its absolute value.
 5. If a row or column of A is duplicated, then |A| = 0.
Theorem (Rules for Determinants, continued)
Let A be an n × n matrix.
 6. If a row or column of A is proportional to another, then
    |A| = 0.
 7. If a scalar multiple of one row (or column) of A is added to
    another row (or column), then the determinant does not
    change.
 8. The determinant of the product of two matrices is the product
    of the determinants of those matrices:

                            |AB| = |A| |B|

 9. If α is any real number, then |αA| = αn |A|.
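The product and scaling rules can be spot-checked on 2 × 2 matrices. A sketch (helper names are my own):

```python
def det2(M):
    """2x2 determinant a*d - c*b."""
    (a, b), (c, d) = M
    return a * d - c * b

def matmul2(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, -2]]
# |AB| = |A| |B|
assert det2(matmul2(A, B)) == det2(A) * det2(B)
# |aA| = a^n |A| with n = 2
a = 3
assert det2([[a * x for x in row] for row in A]) == a ** 2 * det2(A)
print("rules check out")
```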
Outline


                                       Determinants by sudoku
   Vector and Matrix Algebra
                                       patterns
      Vectors
                                       Determinants by Cofactors
      Algebra of vectors
                                   Systems of Linear Equations
      Scalar Product
      Matrices                     Inversion
      Addition and scalar              Determinants and the
      multiplication of matrices       inverse
                                       Inverses of 2 × 2 matrices
      Matrix Multiplication
      The Transpose                    Computing inverses with the
   Geometry                            adjoint matrix
      Lines                            Computing inverses with row
      Planes                           reduction
   Determinants                        Properties of the inverse
Systems of Linear Equations
Learning Objectives




          Solve a System of Linear Equations using Gaussian Elimination
          Find the (R)REF of a matrix.
Row Operations



  The operations on systems of linear equations are reflected in the
  augmented matrix.
    1. Transposing (switching) rows in an augmented matrix does
       not change the solution.
    2. Scaling any row in an augmented matrix does not change the
       solution.
    3. Adding to any row in an augmented matrix any multiple of
       any other row in the matrix does not change the solution.
Gaussian Elimination
    1. Locate the first nonzero column. This is the pivot column, and
       the top row in this column is called a pivot position.
       Transpose rows to make sure this position has a nonzero entry.
       If you like, scale the row to make this position equal to one.
    2. Use row operations to make all entries below the pivot
       position zero.
    3. Repeat Steps 1 and 2 on the submatrix below the first row
       and to the right of the first column. Finally, you will arrive at
       a matrix in row echelon form. (up to here is called the
       forward pass)
    4. Scale the bottom row to make the leading entry one.
    5. Use row operations to make all entries above this entry zero.
    6. Repeat Steps 4 and 5 on the submatrix formed above and to
       the left of this entry. (These steps are called the backward
       pass)
Example
Solve the system of linear equations

                         2x + 8y + 4z = 2
                         2x + 5y +  z = 5
                         4x + 10y − 4z = 1
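The forward and backward passes can be condensed into one Gauss-Jordan loop. A sketch using exact fractions, assuming the system has a unique solution (the function name is my own):

```python
from fractions import Fraction

def solve(aug):
    """Gauss-Jordan elimination on an augmented matrix [A | b].
    Assumes a unique solution; exact arithmetic via Fraction."""
    M = [[Fraction(x) for x in row] for row in aug]
    n = len(M)
    for i in range(n):
        # step 1: find a row with a nonzero pivot and swap it up
        p = next(r for r in range(i, n) if M[r][i] != 0)
        M[i], M[p] = M[p], M[i]
        M[i] = [x / M[i][i] for x in M[i]]        # scale pivot to 1
        for r in range(n):                        # clear the rest of the column
            if r != i and M[r][i] != 0:
                M[r] = [x - M[r][i] * y for x, y in zip(M[r], M[i])]
    return [row[-1] for row in M]

# The system above: 2x + 8y + 4z = 2, 2x + 5y + z = 5, 4x + 10y - 4z = 1
print(solve([[2, 8, 4, 2], [2, 5, 1, 5], [4, 10, -4, 1]]))
```

Running it on the system above gives x = 8, y = −5/2, z = 3/2.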
Inversion
Learning Objectives




          Use the determinant to determine whether a matrix is
          invertible
          Find the inverse of a matrix
Determinants and the inverse

   If A has an inverse A−1 , what is |A−1 |?
   Answer.

   $$|A|\,|A^{-1}| = |AA^{-1}| = |I| = 1,$$

   so |A−1 | = 1/|A|.

   Fact
   A is invertible if and only if |A| ≠ 0.
Inverses of 2 × 2 matrices

   Let $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$. This is small enough that we can explicitly solve
   for A−1 .
   Fact
   If ad − bc ≠ 0, then

   $$\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1}
   = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}.$$
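The formula translates directly into code. A sketch with exact arithmetic (the function name is my own):

```python
from fractions import Fraction

def inv2(M):
    """Inverse of a 2x2 matrix via the 1/(ad - bc) formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    s = Fraction(1, det)
    return [[s * d, -s * b], [-s * c, s * a]]

print(inv2([[1, 2], [3, 4]]))
```

For example, `inv2([[1, 2], [3, 4]])` returns [[−2, 1], [3/2, −1/2]].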
Cofactors

   Remember how we computed determinants: given A, Cij was
   (−1)i+j times the determinant of Aij , which was A with the ith
   row and jth column deleted.
   Let C = (Cij ). Then

   $$A\,C' = |A|\,I$$

   So

   $$A^{-1} = \frac{1}{|A|}\, C'$$

   We denote by adj A the matrix C′ , the adjoint matrix of A.
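The adjoint formula A−1 = (1/|A|) adj A can be sketched as follows (function names are my own; the determinant is by cofactor expansion):

```python
from fractions import Fraction

def det(A):
    """Determinant by cofactor expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([r[:j] + r[j+1:] for r in A[1:]])
               for j in range(len(A)))

def inverse_adjoint(A):
    """A^{-1} = (1/|A|) adj A, where adj A is the transposed cofactor matrix."""
    n = len(A)
    d = det(A)
    if d == 0:
        raise ValueError("matrix is not invertible")
    # cofactor matrix: C_ij = (-1)^(i+j) * det(minor_ij)
    cof = [[(-1) ** (i + j) *
            det([r[:j] + r[j+1:] for k, r in enumerate(A) if k != i])
            for j in range(n)] for i in range(n)]
    # transpose the cofactor matrix, then divide by the determinant
    return [[Fraction(cof[j][i], d) for j in range(n)] for i in range(n)]

print(inverse_adjoint([[0, 2, -3], [-2, 1, 2], [2, 0, 1]]))
```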
Computing inverses with row reduction




   To find the inverse of A, form the augmented matrix [A | I] and
   row reduce. If the RREF has the form [I | B], then B = A−1 .
   Otherwise, A is not invertible.
Example
Find the inverse of $A = \begin{pmatrix} 0 & 2 & -3 \\ -2 & 1 & 2 \\ 2 & 0 & 1 \end{pmatrix}$ using row reduction.

$$\left(\begin{array}{ccc|ccc} 0 & 2 & -3 & 1 & 0 & 0 \\ -2 & 1 & 2 & 0 & 1 & 0 \\ 2 & 0 & 1 & 0 & 0 & 1 \end{array}\right)
\xrightarrow{R_1 \leftrightarrow R_3}
\left(\begin{array}{ccc|ccc} 2 & 0 & 1 & 0 & 0 & 1 \\ -2 & 1 & 2 & 0 & 1 & 0 \\ 0 & 2 & -3 & 1 & 0 & 0 \end{array}\right)$$

$$\xrightarrow{R_2 \to R_2 + R_1}
\left(\begin{array}{ccc|ccc} 2 & 0 & 1 & 0 & 0 & 1 \\ 0 & 1 & 3 & 0 & 1 & 1 \\ 0 & 2 & -3 & 1 & 0 & 0 \end{array}\right)
\xrightarrow{R_3 \to R_3 - 2R_2}
\left(\begin{array}{ccc|ccc} 2 & 0 & 1 & 0 & 0 & 1 \\ 0 & 1 & 3 & 0 & 1 & 1 \\ 0 & 0 & -9 & 1 & -2 & -2 \end{array}\right)$$

$$\xrightarrow{R_3 \to -\frac{1}{9}R_3}
\left(\begin{array}{ccc|ccc} 2 & 0 & 1 & 0 & 0 & 1 \\ 0 & 1 & 3 & 0 & 1 & 1 \\ 0 & 0 & 1 & -1/9 & 2/9 & 2/9 \end{array}\right)
\xrightarrow[R_1 \to R_1 - R_3]{R_2 \to R_2 - 3R_3}
\left(\begin{array}{ccc|ccc} 2 & 0 & 0 & 1/9 & -2/9 & 7/9 \\ 0 & 1 & 0 & 1/3 & 1/3 & 1/3 \\ 0 & 0 & 1 & -1/9 & 2/9 & 2/9 \end{array}\right)$$

$$\xrightarrow{R_1 \to \frac{1}{2}R_1}
\left(\begin{array}{ccc|ccc} 1 & 0 & 0 & 1/18 & -1/9 & 7/18 \\ 0 & 1 & 0 & 1/3 & 1/3 & 1/3 \\ 0 & 0 & 1 & -1/9 & 2/9 & 2/9 \end{array}\right)$$

So $A^{-1} = \begin{pmatrix} 1/18 & -1/9 & 7/18 \\ 1/3 & 1/3 & 1/3 \\ -1/9 & 2/9 & 2/9 \end{pmatrix}$.

Notice we get the same matrix as with the adjoint method.
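The same [A | I] → [I | B] procedure, mechanized with exact fractions. A sketch (the function name is my own):

```python
from fractions import Fraction

def inverse_rref(A):
    """Row-reduce [A | I] to [I | B]; then B = A^{-1} (exact arithmetic)."""
    n = len(A)
    # augment A with the identity matrix
    M = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
         for i, row in enumerate(A)]
    for i in range(n):
        # find a nonzero pivot in column i and swap it up
        p = next((r for r in range(i, n) if M[r][i] != 0), None)
        if p is None:
            raise ValueError("matrix is not invertible")
        M[i], M[p] = M[p], M[i]
        M[i] = [x / M[i][i] for x in M[i]]        # scale pivot to 1
        for r in range(n):                        # clear the rest of the column
            if r != i and M[r][i] != 0:
                M[r] = [x - M[r][i] * y for x, y in zip(M[r], M[i])]
    return [row[n:] for row in M]                 # the right half is A^{-1}

print(inverse_rref([[0, 2, -3], [-2, 1, 2], [2, 0, 1]]))
```

For the A above it reproduces the inverse found by hand.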
Properties of the inverse




   Theorem (Properties of the Inverse)
   Let A and B be invertible n × n matrices. Then
   (a) (A−1 )−1 = A
   (b) (AB)−1 = B−1 A−1
   (c) (A′ )−1 = (A−1 )′
   (d) If c is any nonzero number, (cA)−1 = c −1 A−1 .

Streamlining assessment, feedback, and archival with auto-multiple-choiceMatthew Leingang
 
Electronic Grading of Paper Assessments
Electronic Grading of Paper AssessmentsElectronic Grading of Paper Assessments
Electronic Grading of Paper AssessmentsMatthew Leingang
 
Lesson 27: Integration by Substitution (slides)
Lesson 27: Integration by Substitution (slides)Lesson 27: Integration by Substitution (slides)
Lesson 27: Integration by Substitution (slides)Matthew Leingang
 
Lesson 26: The Fundamental Theorem of Calculus (slides)
Lesson 26: The Fundamental Theorem of Calculus (slides)Lesson 26: The Fundamental Theorem of Calculus (slides)
Lesson 26: The Fundamental Theorem of Calculus (slides)Matthew Leingang
 
Lesson 26: The Fundamental Theorem of Calculus (slides)
Lesson 26: The Fundamental Theorem of Calculus (slides)Lesson 26: The Fundamental Theorem of Calculus (slides)
Lesson 26: The Fundamental Theorem of Calculus (slides)Matthew Leingang
 
Lesson 27: Integration by Substitution (handout)
Lesson 27: Integration by Substitution (handout)Lesson 27: Integration by Substitution (handout)
Lesson 27: Integration by Substitution (handout)Matthew Leingang
 
Lesson 26: The Fundamental Theorem of Calculus (handout)
Lesson 26: The Fundamental Theorem of Calculus (handout)Lesson 26: The Fundamental Theorem of Calculus (handout)
Lesson 26: The Fundamental Theorem of Calculus (handout)Matthew Leingang
 
Lesson 25: Evaluating Definite Integrals (slides)
Lesson 25: Evaluating Definite Integrals (slides)Lesson 25: Evaluating Definite Integrals (slides)
Lesson 25: Evaluating Definite Integrals (slides)Matthew Leingang
 
Lesson 25: Evaluating Definite Integrals (handout)
Lesson 25: Evaluating Definite Integrals (handout)Lesson 25: Evaluating Definite Integrals (handout)
Lesson 25: Evaluating Definite Integrals (handout)Matthew Leingang
 
Lesson 24: Areas and Distances, The Definite Integral (handout)
Lesson 24: Areas and Distances, The Definite Integral (handout)Lesson 24: Areas and Distances, The Definite Integral (handout)
Lesson 24: Areas and Distances, The Definite Integral (handout)Matthew Leingang
 
Lesson 24: Areas and Distances, The Definite Integral (slides)
Lesson 24: Areas and Distances, The Definite Integral (slides)Lesson 24: Areas and Distances, The Definite Integral (slides)
Lesson 24: Areas and Distances, The Definite Integral (slides)Matthew Leingang
 
Lesson 23: Antiderivatives (slides)
Lesson 23: Antiderivatives (slides)Lesson 23: Antiderivatives (slides)
Lesson 23: Antiderivatives (slides)Matthew Leingang
 
Lesson 23: Antiderivatives (slides)
Lesson 23: Antiderivatives (slides)Lesson 23: Antiderivatives (slides)
Lesson 23: Antiderivatives (slides)Matthew Leingang
 
Lesson 22: Optimization Problems (slides)
Lesson 22: Optimization Problems (slides)Lesson 22: Optimization Problems (slides)
Lesson 22: Optimization Problems (slides)Matthew Leingang
 
Lesson 22: Optimization Problems (handout)
Lesson 22: Optimization Problems (handout)Lesson 22: Optimization Problems (handout)
Lesson 22: Optimization Problems (handout)Matthew Leingang
 
Lesson 21: Curve Sketching (slides)
Lesson 21: Curve Sketching (slides)Lesson 21: Curve Sketching (slides)
Lesson 21: Curve Sketching (slides)Matthew Leingang
 
Lesson 21: Curve Sketching (handout)
Lesson 21: Curve Sketching (handout)Lesson 21: Curve Sketching (handout)
Lesson 21: Curve Sketching (handout)Matthew Leingang
 
Lesson 20: Derivatives and the Shapes of Curves (slides)
Lesson 20: Derivatives and the Shapes of Curves (slides)Lesson 20: Derivatives and the Shapes of Curves (slides)
Lesson 20: Derivatives and the Shapes of Curves (slides)Matthew Leingang
 
Lesson 20: Derivatives and the Shapes of Curves (handout)
Lesson 20: Derivatives and the Shapes of Curves (handout)Lesson 20: Derivatives and the Shapes of Curves (handout)
Lesson 20: Derivatives and the Shapes of Curves (handout)Matthew Leingang
 

Mehr von Matthew Leingang (20)

Making Lesson Plans
Making Lesson PlansMaking Lesson Plans
Making Lesson Plans
 
Streamlining assessment, feedback, and archival with auto-multiple-choice
Streamlining assessment, feedback, and archival with auto-multiple-choiceStreamlining assessment, feedback, and archival with auto-multiple-choice
Streamlining assessment, feedback, and archival with auto-multiple-choice
 
Electronic Grading of Paper Assessments
Electronic Grading of Paper AssessmentsElectronic Grading of Paper Assessments
Electronic Grading of Paper Assessments
 
Lesson 27: Integration by Substitution (slides)
Lesson 27: Integration by Substitution (slides)Lesson 27: Integration by Substitution (slides)
Lesson 27: Integration by Substitution (slides)
 
Lesson 26: The Fundamental Theorem of Calculus (slides)
Lesson 26: The Fundamental Theorem of Calculus (slides)Lesson 26: The Fundamental Theorem of Calculus (slides)
Lesson 26: The Fundamental Theorem of Calculus (slides)
 
Lesson 26: The Fundamental Theorem of Calculus (slides)
Lesson 26: The Fundamental Theorem of Calculus (slides)Lesson 26: The Fundamental Theorem of Calculus (slides)
Lesson 26: The Fundamental Theorem of Calculus (slides)
 
Lesson 27: Integration by Substitution (handout)
Lesson 27: Integration by Substitution (handout)Lesson 27: Integration by Substitution (handout)
Lesson 27: Integration by Substitution (handout)
 
Lesson 26: The Fundamental Theorem of Calculus (handout)
Lesson 26: The Fundamental Theorem of Calculus (handout)Lesson 26: The Fundamental Theorem of Calculus (handout)
Lesson 26: The Fundamental Theorem of Calculus (handout)
 
Lesson 25: Evaluating Definite Integrals (slides)
Lesson 25: Evaluating Definite Integrals (slides)Lesson 25: Evaluating Definite Integrals (slides)
Lesson 25: Evaluating Definite Integrals (slides)
 
Lesson 25: Evaluating Definite Integrals (handout)
Lesson 25: Evaluating Definite Integrals (handout)Lesson 25: Evaluating Definite Integrals (handout)
Lesson 25: Evaluating Definite Integrals (handout)
 
Lesson 24: Areas and Distances, The Definite Integral (handout)
Lesson 24: Areas and Distances, The Definite Integral (handout)Lesson 24: Areas and Distances, The Definite Integral (handout)
Lesson 24: Areas and Distances, The Definite Integral (handout)
 
Lesson 24: Areas and Distances, The Definite Integral (slides)
Lesson 24: Areas and Distances, The Definite Integral (slides)Lesson 24: Areas and Distances, The Definite Integral (slides)
Lesson 24: Areas and Distances, The Definite Integral (slides)
 
Lesson 23: Antiderivatives (slides)
Lesson 23: Antiderivatives (slides)Lesson 23: Antiderivatives (slides)
Lesson 23: Antiderivatives (slides)
 
Lesson 23: Antiderivatives (slides)
Lesson 23: Antiderivatives (slides)Lesson 23: Antiderivatives (slides)
Lesson 23: Antiderivatives (slides)
 
Lesson 22: Optimization Problems (slides)
Lesson 22: Optimization Problems (slides)Lesson 22: Optimization Problems (slides)
Lesson 22: Optimization Problems (slides)
 
Lesson 22: Optimization Problems (handout)
Lesson 22: Optimization Problems (handout)Lesson 22: Optimization Problems (handout)
Lesson 22: Optimization Problems (handout)
 
Lesson 21: Curve Sketching (slides)
Lesson 21: Curve Sketching (slides)Lesson 21: Curve Sketching (slides)
Lesson 21: Curve Sketching (slides)
 
Lesson 21: Curve Sketching (handout)
Lesson 21: Curve Sketching (handout)Lesson 21: Curve Sketching (handout)
Lesson 21: Curve Sketching (handout)
 
Lesson 20: Derivatives and the Shapes of Curves (slides)
Lesson 20: Derivatives and the Shapes of Curves (slides)Lesson 20: Derivatives and the Shapes of Curves (slides)
Lesson 20: Derivatives and the Shapes of Curves (slides)
 
Lesson 20: Derivatives and the Shapes of Curves (handout)
Lesson 20: Derivatives and the Shapes of Curves (handout)Lesson 20: Derivatives and the Shapes of Curves (handout)
Lesson 20: Derivatives and the Shapes of Curves (handout)
 

Kürzlich hochgeladen

Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Scriptwesley chun
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonAnna Loughnan Colquhoun
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024Rafal Los
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEarley Information Science
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking MenDelhi Call girls
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Servicegiselly40
 
Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Paola De la Torre
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Miguel Araújo
 

Kürzlich hochgeladen (20)

Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Script
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt Robison
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Service
 
Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 

Review for Midterm I Math 20

  • 5. Algebra of vectors
    Definition. Addition of vectors is defined componentwise, as is scalar multiplication:
    (a1, a2, . . . , an) + (b1, b2, . . . , bn) = (a1 + b1, a2 + b2, . . . , an + bn)
    t(a1, a2, . . . , an) = (ta1, ta2, . . . , tan)
  • 6. Dot product
    Definition. Given two vectors of the same dimension (size), their scalar product (or “dot product”) is the sum of the products of corresponding components:
    a · b = a1 b1 + a2 b2 + · · · + an bn
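The slides contain no code; here is an illustrative Python sketch of the definition (the name `dot` is my own):

```python
def dot(a, b):
    # Scalar product: sum of products of matching components.
    # Only defined when the vectors have the same dimension.
    if len(a) != len(b):
        raise ValueError("vectors must have the same dimension")
    return sum(x * y for x, y in zip(a, b))

print(dot([1, 0, -1], [2, 4, -2]))  # 1*2 + 0*4 + (-1)*(-2) = 4
```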
  • 7. Matrices
    Definition. An m × n matrix is a rectangular array of mn numbers arranged in m horizontal rows and n vertical columns:
    A = (aij), where aij is the entry in the ith row and jth column, 1 ≤ i ≤ m, 1 ≤ j ≤ n.
  • 8. Addition and scalar multiplication of matrices
    Matrices can be added and scaled just like vectors.
    Example:
    [1 −1]   [1 2]   [2 1]
    [3  4] + [0 2] = [3 6]
    Example:
      [ 1 1]   [ 4 4]
    4 [−1 2] = [−4 8]
  • 9. The matrix-vector product
    Definition. Let A = (aij) be an m × n matrix and v = (v1, v2, . . . , vn) an n-vector (column vector). The matrix-vector product of A and v is the m-vector Av = (w1, w2, . . . , wm), where
    wk = ak1 v1 + ak2 v2 + · · · + akn vn,
    the dot product of the kth row of A with v.
  • 10. Example
    Let A = [2 3; −1 4; 0 3] (a 3 × 2 matrix) and v = (2, −1). Find Av.
    Solution:
    Av = (2 · 2 + 3 · (−1), (−1) · 2 + 4 · (−1), 0 · 2 + 3 · (−1)) = (4 − 1, −2 − 4, 0 − 3) = (1, −6, −3).
  • 14. Matrix product, defined
    Definition. Let A be an m × n matrix and B an n × p matrix. Then the matrix product of A and B is the m × p matrix whose jth column is Abj. In other words, the (i, j)th entry of AB is the dot product of the ith row of A and the jth column of B. In symbols,
    (AB)ij = ai1 b1j + ai2 b2j + · · · + ain bnj.
    (A worked numeric example, a product of two data tables, appears on the original slide; its layout did not survive extraction.)
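Since the slide's numeric example is unrecoverable, here is an illustrative Python sketch of the definition instead (the name `matmul` is my own), applied to the matrix and vector from the slide 10 example:

```python
def matmul(A, B):
    # (AB)_ij is the dot product of row i of A with column j of B;
    # defined only when A is m x n and B is n x p.
    n = len(B)
    if any(len(row) != n for row in A):
        raise ValueError("inner dimensions must agree")
    p = len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(len(A))]

A = [[2, 3], [-1, 4], [0, 3]]
v = [[2], [-1]]              # the column vector from the slide 10 example
print(matmul(A, v))          # [[1], [-6], [-3]]
```

A column vector is just an n × 1 matrix here, so the matrix-vector product is a special case.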
  • 15. The Transpose
    There is another operation on matrices, which is just flipping rows and columns.
    Definition. Let A = (aij)m×n be a matrix. The transpose of A is the matrix A′ = (a′ij)n×m whose (i, j)th entry is aji.
    Example. Let A = [1 2; 3 4; 5 6]. Then A′ = [1 3 5; 2 4 6].
    Fact. Given matrices A and B of suitable dimensions, (AB)′ = B′ A′.
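A quick Python sketch (not from the slides; names are my own) of the transpose, checking the fact (AB)′ = B′A′ on one concrete pair of matrices:

```python
def transpose(A):
    # Flip rows and columns: entry (i, j) of A' is entry (j, i) of A.
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    # Standard matrix product, assuming compatible dimensions.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4], [5, 6]]
B = [[1, 0, 2], [-1, 1, 0]]
print(transpose(A))  # [[1, 3, 5], [2, 4, 6]]
# The fact (AB)' = B'A', verified on this pair:
print(transpose(matmul(A, B)) == matmul(transpose(B), transpose(A)))  # True
```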
  • 20. Geometry
    Learning Objectives
    Given a point and a line, decide if the point is on the line
    Given a point and a direction, or two points, find the equation for the line
    Given a point and a plane, decide if the point is on the plane
    Given a point and a normal vector, find the equation for the plane
  • 21. Lines Definition The line L through (the head of) a parallel to v is the set of all x such that x = a + tv for some real number t. The line L through (the heads of) a and b is x = a + t(b − a) = (1 − t)a + tb
  • 22. Example
    Find an equation for the line through (−2, 0, 4) parallel to (2, 4, −2).
    Solution: x = (−2, 0, 4) + t(2, 4, −2).
  • 24. Example
    Find an equation for the line through (−3, 2, −3) and (1, −1, 4). Is (1, 2, 3) on this line?
    Answer: No! The line is x = (−3, 2, −3) + t(4, −3, 7); the first coordinate forces t = 1, but then the second coordinate is 2 − 3 = −1 ≠ 2.
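The point-on-line test above can be sketched in Python (illustrative only; `on_line` is a made-up helper): each coordinate of p must yield the same parameter t in x = a + t(b − a).

```python
def on_line(p, a, b, tol=1e-9):
    # The line through a and b is x = a + t(b - a); p is on it iff
    # one value of t works for every coordinate (coordinates where
    # the direction is 0 must already match).
    direction = [bi - ai for ai, bi in zip(a, b)]
    t = None
    for pi, ai, di in zip(p, a, direction):
        if abs(di) < tol:
            if abs(pi - ai) > tol:
                return False
        else:
            ti = (pi - ai) / di
            if t is None:
                t = ti
            elif abs(ti - t) > tol:
                return False
    return True

print(on_line((1, 2, 3), (-3, 2, -3), (1, -1, 4)))   # False
print(on_line((1, -1, 4), (-3, 2, -3), (1, -1, 4)))  # True
```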
  • 27. Planes Definition A hyperplane through (the head of) a that is orthogonal to a vector p is the set of all (heads of) vectors x such that p · (x − a) = 0
  • 28. Example
    Find an equation for the plane through (−3, 0, 7) perpendicular to (5, 2, −1).
    Solution: 5(x + 3) + 2y − (z − 7) = 0, that is, 5x + 2y − z = −22.
    Question: Is (1, 2, 3) on this plane? Is there a point on the plane with z = 0?
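Expanding p · (x − a) = 0 gives p · x = p · a, i.e. Ax + By + Cz = D with (A, B, C) = p and constant term D = p · a. A Python sketch (not from the slides; `plane_equation` is a made-up helper):

```python
def plane_equation(a, p):
    # Plane through a with normal p: p . (x - a) = 0 expands to
    # Ax + By + Cz = D with (A, B, C) = p and D = p . a.
    D = sum(pi * ai for pi, ai in zip(p, a))
    return (*p, D)

A_, B_, C_, D_ = plane_equation((-3, 0, 7), (5, 2, -1))
print(A_, B_, C_, D_)            # 5 2 -1 -22  (p . a = -15 + 0 - 7)
print(5 * 1 + 2 * 2 - 3 == D_)   # is (1, 2, 3) on the plane? False
```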
  • 31. Fact If the equation for a plane is Ax + By + Cz = D, then p = (A, B, C ) is normal to the plane.
  • 33. Determinants Learning Objectives Compute determinants of n × n matrices.
  • 34. The determinant
    Definition. The determinant of a 2 × 2 matrix A = [a11 a12; a21 a22] is the number
    |A| = a11 a22 − a21 a12.
  • 35. Definition. The determinant of a 3 × 3 matrix is
    |A| = a11 a22 a33 − a11 a23 a32 − a21 a12 a33 + a21 a13 a32 + a31 a12 a23 − a31 a22 a13.
  • 37. Sarrus’s Rule
    Copy the first two columns of the matrix to its right, then add the three downward (left-to-right) diagonal products and subtract the three upward diagonal products:
    |A| = a11 a22 a33 + a12 a23 a31 + a13 a21 a32 − a13 a22 a31 − a11 a23 a32 − a12 a21 a33.
    This trick does not work for determinants of any other size!
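Sarrus's rule as a Python sketch (illustrative; `det3_sarrus` is my own name), applied to the coefficient matrix of the system solved on slide 56:

```python
def det3_sarrus(m):
    # Three downward diagonal products minus three upward ones.
    # Valid for 3 x 3 matrices only.
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*e*i + b*f*g + c*d*h - c*e*g - a*f*h - b*d*i

print(det3_sarrus([[2, 8, 4], [2, 5, 1], [4, 10, -4]]))  # 36
```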
  • 42. Determinants by sudoku patterns
    Definition. Let A = (aij)n×n be a matrix. The determinant of A is a sum of all products of n entries of the matrix, where each product takes exactly one entry from each row and each column. The sign of each product is given by (−1)^σ, where σ is the number of upward lines used when all the entries in a pattern are connected.
  • 44. 4 × 4 sudoku patterns
    (The slide diagrams the 24 patterns for a 4 × 4 matrix, each marked + or − according to its sign.)
  • 45. Determinants by Cofactors
    Definition. Let A = (aij)n×n be a matrix. The (i, j) minor of A is the matrix obtained from A by deleting the ith row and jth column; it has dimensions (n − 1) × (n − 1). The (i, j) cofactor Cij of A is the determinant of the (i, j) minor times (−1)^(i+j).
  • 46. The signs here seem complicated, but they’re not. The number (−1)^(i+j) is 1 if i and j are both even or both odd, and −1 otherwise. They make a checkerboard pattern:
    + − + · · ·
    − + − · · ·
    + − + · · ·
    .  .  .
  • 47. Fact (cofactor expansion). The determinant of A = (aij)n×n is the sum
    a11 C11 + a12 C12 + · · · + a1n C1n
    (expansion along the first row). More generally, it equals
    ai1 Ci1 + ai2 Ci2 + · · · + ain Cin for any row i, and
    a1j C1j + a2j C2j + · · · + anj Cnj for any column j.
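Cofactor expansion along the first row translates directly into a recursive Python sketch (illustrative; `det` is my own name):

```python
def det(A):
    # Determinant by cofactor expansion along the first row:
    # |A| = sum over j of (-1)^j * a_{0j} * |minor(0, j)|.
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # The (0, j) minor: delete row 0 and column j.
        minor = [row[:j] + row[j+1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[2, 8, 4], [2, 5, 1], [4, 10, -4]]))  # 36
print(det([[1, 2], [3, 4]]))                     # -2
```

This is fine for exam-sized matrices, though the recursion does n! work; row reduction is far faster for large n.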
  • 50. Theorem (Rules for Determinants). Let A be an n × n matrix.
    1. If a row or column of A is full of zeros, then |A| = 0.
    2. |A′| = |A|.
    3. If B is the matrix obtained by multiplying each entry of one row or column of A by the same number α, then |B| = α|A|.
    4. If two rows or columns of A are interchanged, then the determinant changes its sign but keeps its absolute value.
    5. If a row or column of A is duplicated, then |A| = 0.
  • 51. Theorem (Rules for Determinants, continued). Let A be an n × n matrix.
    5. If a row or column of A is proportional to another, then |A| = 0.
    6. If a scalar multiple of one row (or column) of A is added to another row (or column), then the determinant does not change.
    7. The determinant of the product of two matrices is the product of the determinants of those matrices: |AB| = |A| |B|.
    8. If α is any real number, then |αA| = α^n |A|.
  • 53. Systems of Linear Equations Learning Objectives Solve a System of Linear Equations using Gaussian Elimination Find the (R)REF of a matrix.
  • 54. Row Operations
    The operations on systems of linear equations are reflected in the augmented matrix.
    1. Transposing (switching) rows in an augmented matrix does not change the solution.
    2. Scaling any row in an augmented matrix does not change the solution.
    3. Adding to any row in an augmented matrix any multiple of any other row in the matrix does not change the solution.
• 55. Gaussian Elimination
1. Locate the first nonzero column. This is the pivot column, and the top row in this column is called a pivot position. Transpose rows to make sure this position has a nonzero entry. If you like, scale the row to make this position equal to one.
2. Use row operations to make all entries below the pivot position zero.
3. Repeat Steps 1 and 2 on the submatrix below the first row and to the right of the first column. Eventually you will arrive at a matrix in row echelon form. (Up to here is called the forward pass.)
4. Scale the bottom row to make the leading entry one.
5. Use row operations to make all entries above this entry zero.
6. Repeat Steps 4 and 5 on the submatrix above and to the left of this entry. (These steps are called the backward pass.)
• 56. Example Solve the system of linear equations
2x + 8y + 4z = 2
2x + 5y + z = 5
4x + 10y − 4z = 1
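A minimal sketch of the forward and backward passes, applied to this system with exact rational arithmetic (`rref` is a hypothetical helper of mine, not the slides' notation):

```python
from fractions import Fraction

def rref(M):
    """Gaussian elimination to reduced row echelon form, using exact
    rational arithmetic. An illustrative sketch, not from the slides."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pr = 0  # current pivot row
    for col in range(cols):
        # locate a nonzero entry in this column at or below the pivot row
        piv = next((r for r in range(pr, rows) if A[r][col] != 0), None)
        if piv is None:
            continue
        A[pr], A[piv] = A[piv], A[pr]          # transpose rows
        lead = A[pr][col]
        A[pr] = [x / lead for x in A[pr]]      # scale leading entry to 1
        for r in range(rows):                  # clear the rest of the column
            if r != pr and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * p for a, p in zip(A[r], A[pr])]
        pr += 1
        if pr == rows:
            break
    return A

# augmented matrix of the example system
R = rref([[2, 8, 4, 2], [2, 5, 1, 5], [4, 10, -4, 1]])
print([row[-1] for row in R])  # -> [Fraction(8, 1), Fraction(-5, 2), Fraction(3, 2)]
```

So the solution is x = 8, y = −5/2, z = 3/2, which checks in all three equations.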
• 57. Outline
Vector and Matrix Algebra: Vectors; Algebra of vectors; Scalar Product; Matrices; Addition and scalar multiplication of matrices; Matrix Multiplication; The Transpose
Geometry: Lines; Planes
Determinants: Determinants by sudoku patterns; Determinants by Cofactors
Systems of Linear Equations
Inversion: Determinants and the inverse; Inverses of 2 × 2 matrices; Computing inverses with the adjoint matrix; Computing inverses with row reduction; Properties of the inverse
  • 58. Inversion Learning Objectives Use the determinant to determine whether a matrix is invertible Find the inverse of a matrix
• 59. Determinants and the inverse If A has an inverse A−1, what is |A−1|? Answer. |A| |A−1| = |AA−1| = |I| = 1, so |A−1| = 1/|A|. Fact A is invertible if and only if |A| ≠ 0.
• 60. Inverses of 2 × 2 matrices Let
A = [ a b ]
    [ c d ]
This is small enough that we can explicitly solve for A−1.
Fact If ad − bc ≠ 0, then
A−1 = 1/(ad − bc) [ d −b ]
                  [ −c  a ]
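As a quick sketch of the formula (assuming integer entries; `inv2` is a made-up name):

```python
from fractions import Fraction

def inv2(M):
    """Inverse of a 2x2 matrix [[a, b], [c, d]] by the ad - bc formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        raise ValueError("ad - bc = 0: matrix is not invertible")
    s = Fraction(1, det)
    return [[s * d, -s * b], [-s * c, s * a]]

# [[1, 2], [3, 4]] has ad - bc = -2, so its inverse is [[-2, 1], [3/2, -1/2]]
assert inv2([[1, 2], [3, 4]]) == [[-2, 1], [Fraction(3, 2), Fraction(-1, 2)]]
```

Note the pattern: swap the diagonal entries, negate the off-diagonal ones, and divide by the determinant.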
• 61. Cofactors Remember how we computed determinants: given A, Cij was (−1)^(i+j) times the determinant of Aij, which was A with the ith row and jth column deleted. Let C+ = (Cij). Then A(C+)′ = |A| I. So
A−1 = (1/|A|) (C+)′
We denote by adj A the matrix (C+)′, the adjoint matrix of A.
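The adjoint formula translates into a short routine; `det`, `minor`, and `inv_adj` below are illustrative names of mine, assuming integer entries:

```python
from fractions import Fraction

def minor(A, i, j):
    return [row[:j] + row[j + 1:] for k, row in enumerate(A) if k != i]

def det(A):
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(len(A)))

def inv_adj(A):
    """A^{-1} = (1/|A|) adj A, where adj A is the transpose of the
    cofactor matrix (Cij) = ((-1)^(i+j) |Aij|)."""
    n, d = len(A), det(A)
    if d == 0:
        raise ValueError("matrix is not invertible")
    C = [[(-1) ** (i + j) * det(minor(A, i, j)) for j in range(n)]
         for i in range(n)]
    # entry (i, j) of adj A is the cofactor C_ji
    return [[Fraction(C[j][i], d) for j in range(n)] for i in range(n)]
```

For the matrix A = [0 2 −3; −2 1 2; 2 0 1] from the row-reduction example, `inv_adj` returns rows [1/18, −1/9, 7/18], [1/3, 1/3, 1/3], [−1/9, 2/9, 2/9].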
• 62. Computing inverses with row reduction To find the inverse of A, form the augmented matrix [A | I] and row reduce. If the RREF has the form [I | B], then B = A−1. Otherwise, A is not invertible.
• 63. Example Find the inverse of A = [0 2 −3; −2 1 2; 2 0 1] using row reduction. Form [A | I]:
[  0 2 −3 | 1 0 0 ]
[ −2 1  2 | 0 1 0 ]
[  2 0  1 | 0 0 1 ]
Swap R1 and R3, then add R1 to R2:
[ 2 0  1 | 0 0 1 ]
[ 0 1  3 | 0 1 1 ]
[ 0 2 −3 | 1 0 0 ]
Subtract 2R2 from R3 to get [0 0 −9 | 1 −2 −2], then scale R3 by −1/9:
[ 2 0 1 | 0 0 1 ]
[ 0 1 3 | 0 1 1 ]
[ 0 0 1 | −1/9 2/9 2/9 ]
• 64. Subtract 3R3 from R2 and R3 from R1, then scale R1 by 1/2:
[ 1 0 0 | 1/18 −1/9 7/18 ]
[ 0 1 0 | 1/3  1/3  1/3  ]
[ 0 0 1 | −1/9 2/9  2/9  ]
So A−1 = [1/18 −1/9 7/18; 1/3 1/3 1/3; −1/9 2/9 2/9]. Notice we get the same matrix as with the adjoint method.
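The same computation can be scripted: row-reduce [A | I] and read off the right-hand block. A sketch with exact rationals and made-up names, not the slides' code:

```python
from fractions import Fraction

def inv_rowreduce(M):
    """Gauss-Jordan inversion: reduce [A | I]; if the left block becomes I,
    the right block is A^{-1}."""
    n = len(M)
    # build the augmented matrix [A | I]
    A = [[Fraction(x) for x in row] +
         [Fraction(1 if i == j else 0) for j in range(n)]
         for i, row in enumerate(M)]
    for col in range(n):
        piv = next((r for r in range(col, n) if A[r][col] != 0), None)
        if piv is None:
            raise ValueError("matrix is not invertible")
        A[col], A[piv] = A[piv], A[col]        # swap a nonzero pivot into place
        lead = A[col][col]
        A[col] = [x / lead for x in A[col]]    # scale the pivot to 1
        for r in range(n):                     # clear the rest of the column
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * p for a, p in zip(A[r], A[col])]
    return [row[n:] for row in A]

Ainv = inv_rowreduce([[0, 2, -3], [-2, 1, 2], [2, 0, 1]])
print(Ainv[1])  # -> [Fraction(1, 3), Fraction(1, 3), Fraction(1, 3)]
```

This reproduces the matrix computed by hand above, row for row.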
• 65. Properties of the inverse Theorem (Properties of the Inverse) Let A and B be invertible n × n matrices. Then (a) (A−1)−1 = A (b) (AB)−1 = B−1A−1 (c) (A′)−1 = (A−1)′ (d) If c is any nonzero number, (cA)−1 = c−1A−1.
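These identities can be spot-checked on small made-up matrices; the 2 × 2 helpers below are mine, not from the slides:

```python
from fractions import Fraction

def inv2(M):
    """2x2 inverse via the ad - bc formula."""
    (a, b), (c, d) = M
    s = Fraction(1, a * d - b * c)
    return [[s * d, -s * b], [-s * c, s * a]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(M):
    return [list(col) for col in zip(*M)]

A, B = [[1, 2], [3, 4]], [[2, 0], [1, 3]]

assert inv2(inv2(A)) == A                              # (a) (A^{-1})^{-1} = A
assert inv2(matmul(A, B)) == matmul(inv2(B), inv2(A))  # (b) (AB)^{-1} = B^{-1} A^{-1}
assert inv2(transpose(A)) == transpose(inv2(A))        # (c) (A')^{-1} = (A^{-1})'
assert inv2([[5 * x for x in r] for r in A]) == \
    [[x / 5 for x in r] for r in inv2(A)]              # (d) with c = 5
```

Note the order reversal in (b): undoing "first B, then A" means undoing A first, then B.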