SECTION 4.3

LINEAR COMBINATIONS AND
INDEPENDENCE OF VECTORS
In this section we use two types of computational problems as aids in understanding linear
independence and dependence. The first of these problems is that of expressing a vector w as a
linear combination of k given vectors v1, v2, …, vk (if possible). The second is that of
determining whether k given vectors v1, v2, …, vk are linearly independent. For vectors in Rn,
each of these problems reduces to solving a linear system of n equations in k unknowns. Thus an
abstract question of linear independence or dependence becomes a concrete question of whether or
not a given linear system has a nontrivial solution.
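
For readers who want to experiment, here is a minimal sketch (assuming the SymPy library is
available) of how the independence test reduces to such a linear system; it uses the vectors
that appear in Problem 6 below.

```python
# Sketch: the vector equation c1*v1 + c2*v2 + c3*v3 = 0 is the homogeneous system A c = 0,
# where the columns of A are the given vectors (here those of Problem 6).
from sympy import Matrix

v1, v2, v3 = Matrix([1, 0, 0]), Matrix([1, 1, 0]), Matrix([1, 1, 1])
A = Matrix.hstack(v1, v2, v3)

# A nontrivial solution exists exactly when the null space of A is nonzero.
print(A.nullspace())      # []  -> only the trivial solution; the vectors are independent
print(A.rank() == 3)      # True
```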

1.     v2 = (3/2) v1 , so the two vectors v1 and v2 are linearly dependent.

2.     Evidently the two vectors v1 and v2 are not scalar multiples of one another. Hence they
       are linearly independent.

3.     The three vectors v1, v2, and v3 are linearly dependent, as are any 3 vectors in R2. The
       reason is that the vector equation c1v1 + c2v2 + c3v3 = 0 reduces to a homogeneous linear
       system of 2 equations in the 3 unknowns c1 , c2 , and c3 , and any such system has a
       nontrivial solution.

4.     The four vectors v1, v2, v3, and v4 are linearly dependent, as are any 4 vectors in R3. The
       reason is that the vector equation c1v1 + c2v2 + c3v3 + c4v4 = 0 reduces to a homogeneous
       linear system of 3 equations in the 4 unknowns c1 , c2 , c3 , and c4 , and any such system
       has a nontrivial solution.
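
       To see this fact concretely (an illustrative aside, not part of the original solutions),
       any homogeneous system with more unknowns than equations leaves at least one free
       variable, so its null space is nonzero:

```python
# Sketch: a 3 x 4 homogeneous system (arbitrary illustrative coefficients) always
# has a nontrivial solution, matching the argument of Problems 3 and 4.
from sympy import Matrix

A = Matrix([[1, 2, 0,  1],
            [0, 1, 1,  3],
            [2, 0, 1, -1]])    # 3 equations in 4 unknowns

print(A.nullspace())           # a nonempty basis -> nontrivial solutions exist
print(len(A.nullspace()) >= 1) # True
```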

5.     The equation c1 v1 + c2 v 2 + c3 v 3 = 0 yields

                   c1 (1, 0, 0) + c2 (0, −2, 0) + c3 (0, 0,3) = (c1 , −2c2 ,3c3 ) = (0, 0, 0),

       and therefore implies immediately that c1 = c2 = c3 = 0. Hence the given vectors
       v1, v2, and v3 are linearly independent.

6.     The equation c1 v1 + c2 v 2 + c3 v 3 = 0 yields

                   c1 (1, 0, 0) + c2 (1,1, 0) + c3 (1,1,1) = (c1 + c2 + c3 , c2 + c3 , c3 ) = (0, 0, 0).

       Back-substitution shows that the homogeneous system

                                            c1 + c2 + c3 = 0
                                                 c2 + c3 = 0
                                                      c3 = 0

       has only the trivial solution c1 = c2 = c3 = 0. Hence the given vectors
       v1, v2, and v3 are linearly independent.

7.     The equation c1 v1 + c2 v 2 + c3 v 3 = 0 yields

               c1 (2, 1, 0, 0) + c2 (3, 0, 1, 0) + c3 (4, 0, 0, 1) = (2c1 + 3c2 + 4c3 , c1 , c2 , c3 ) = (0, 0, 0, 0).

       The last three components immediately give c1 = c2 = c3 = 0. Hence the given vectors
       v1, v2, and v3 are linearly independent.

8.     Here inspection of the three given vectors reveals that v 3 = v1 + v 2 , so the vectors
       v1, v2, and v3 are linearly dependent.

In Problems 9-16 we first set up the linear system to be solved for the linear combination
coefficients {ci }, and then show the reduction of its augmented coefficient matrix A to reduced
echelon form E.
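
As an optional check (not part of the original solutions), the same reduction can be reproduced
with SymPy; the sketch below uses the data of Problem 9, and the identical pattern applies to
Problems 9-16.

```python
# Sketch: reproducing the reduction of Problem 9 with SymPy's rref().
from sympy import Matrix

v1, v2, w = Matrix([5, 3, 4]), Matrix([3, 2, 5]), Matrix([1, 0, -7])

A = Matrix.hstack(v1, v2, w)   # augmented coefficient matrix [v1 v2 | w]
E, pivots = A.rref()           # reduced echelon form
print(E)                       # Matrix([[1, 0, 2], [0, 1, -3], [0, 0, 0]])
# Reading off the last column: c1 = 2, c2 = -3, so w = 2*v1 - 3*v2.
```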

9.     c1 v1 + c2 v 2 = w

            5 3 1     1 0 2 
        A =  3 2 0  → 0 1 −3 = E
                            
             4 5 −7 
                      0 0 0 
                              
       We see that the system of 3 equations in 2 unknowns has the unique solution
       c1 = 2, c2 = −3, so w = 2 v1 − 3v 2 .

10.    c1 v1 + c2 v 2 = w

             −3 6 3    1 0 7 
             1 −2 −1 → 0 1 4  = E
        A =                  
             −2 3 −2 
                       0 0 0 
                               
       We see that the system of 3 equations in 2 unknowns has the unique solution
       c1 = 7, c2 = 4, so w = 7 v1 + 4 v 2 .
11.   c1 v1 + c2 v 2 = w

          7 3 1       1 0 1 
           −6 −3 0            
      A =           →  0 1 −2  = E
          4 2 0       0 0 0 
                              
           5 3 −1     0 0 0 
      We see that the system of 4 equations in 2 unknowns has the unique solution
      c1 = 1, c2 = −2, so w = v1 − 2 v 2 .

12.   c1 v1 + c2 v 2 = w

           7 −2 4     1            0   2
           3 −2 −4                     5
      A =           → 0            1     = E
           −1 1  3    0            0   0
                                        
           9 −3 3     0            0   0
      We see that the system of 4 equations in 2 unknowns has the unique solution
      c1 = 2, c2 = 5, so w = 2 v1 + 5v 2 .

13.   c1 v1 + c2 v 2 = w

          1 5 5      1 0 0
      A =  5 −3 2  → 0 1 0  = E
                           
           −3 4 −2 
                     0 0 1 
                             
      The last row of E corresponds to the scalar equation 0c1 + 0c2 = 1, so the system of 3
      equations in 2 unknowns is inconsistent. This means that w cannot be expressed as a
      linear combination of v1 and v2.

14.   c1v1 + c2 v 2 + c3 v 3 = w

          1 0 0 2        1             0   0   0
           0 1 − 1 − 3                         0
      A =              → 0             1   0     = E
          0 −2 1 2       0             0   1   0
                                                
           3 0 1 −3      0             0   0   1

      The last row of E corresponds to the scalar equation 0c1 + 0c2 + 0c3 = 1, so the system
      of 4 equations in 3 unknowns is inconsistent. This means that w cannot be expressed as
      a linear combination of v1, v2, and v3.
15.     c1v1 + c2 v 2 + c3 v 3 = w

             2 3 1 4     1 0 0 3 
             −1 0 2 5  → 0 1 0 −2  = E
        A =                       
             4 1 −1 6 
                         0 0 1 4 
                                    
       We see that the system of 3 equations in 3 unknowns has the unique solution
       c1 = 3, c2 = −2, c3 = 4, so w = 3v1 − 2 v 2 + 4 v 3 .

16.     c1v1 + c2 v 2 + c3 v 3 = w

            2       4 1 7     1               0   0 6
            0       1 3 7    0               1   0 −2 
        A =                  →                           = E
            3       3 −1 9    0               0   1 3
                                                       
            1       2 3 11    0               0   0 0
       We see that the system of 4 equations in 3 unknowns has the unique solution
       c1 = 6, c2 = −2, c3 = 3, so w = 6 v1 − 2 v 2 + 3v 3 .

In Problems 17-22, A = [v1  v2  v3] is the coefficient matrix of the homogeneous linear
system corresponding to the vector equation c1v1 + c2 v 2 + c3 v 3 = 0. Inspection of the indicated
reduced echelon form E of A then reveals whether or not a nontrivial solution exists.
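
Again as an optional check, SymPy's null-space routine produces the dependence relation
directly; the sketch below uses the data of Problem 18.

```python
# Sketch: extracting the dependence relation of Problem 18 from the null space of A.
from sympy import Matrix

A = Matrix([[ 2,  4, -2],
            [ 0, -5,  1],
            [-3, -6,  3]])     # columns are v1, v2, v3

basis = A.nullspace()          # basis of the solution space of A c = 0
print(basis)                   # [Matrix([[3/5], [1/5], [1]])]
# Scaling by 5 gives (c1, c2, c3) = (3, 1, 5), i.e. 3*v1 + v2 + 5*v3 = 0.
```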

            1 2 3     1 0 0 
17.         0 −3 5  → 0 1 0  = E
        A =                 
            1 4 2 
                      0 0 1 
                              
       We see that the system of 3 equations in 3 unknowns has the unique solution
       c1 = c2 = c3 = 0, so the vectors v1 , v 2 , v 3 are linearly independent.

             2 4 −2     1 0 −3 / 5 
18.          0 −5 1  → 0 1 −1/ 5  = E
        A =                        
             −3 −6 3 
                       0 0
                               0    
       We see that the system of 3 equations in 3 unknowns has a 1-dimensional solution space.
       If we choose c3 = 5 then c1 = 3 and c2 = 1. Therefore 3v1 + v 2 + 5 v3 = 0.

            2 5 2      1                  0   0
             0 4 −1                           0
19.     A =          → 0                  1     = E
             3 −2 1    0                  0   1
                                               
             0 1 −1    0                  0   0
We see that the system of 4 equations in 3 unknowns has the unique solution
      c1 = c2 = c3 = 0, so the vectors v1 , v 2 , v 3 are linearly independent.

          1       2    3   1         0   0
          1       1     
                        1   0         1   0
20.   A =                 →                 = E
           −1     1    4   0         0   1
                                          
          1       1    1   0         0   0
      We see that the system of 4 equations in 3 unknowns has the unique solution
      c1 = c2 = c3 = 0, so the vectors v1 , v 2 , v 3 are linearly independent.

          3 1          1   1         0 1
           0 −1         
                        2   0         1 −2 
21.   A =                 →                 = E
          1 0          1   0         0 0
                                          
          2 1          0   0         0 0
      We see that the system of 4 equations in 3 unknowns has a 1-dimensional solution space.
      If we choose c3 = −1 then c1 = 1 and c2 = −2. Therefore v1 − 2 v 2 − v 3 = 0.

          3 3          5   1         0 7 / 9
          9 0           
                        7   0         1 5 / 9
22.   A =                 →                   = E
          0 9          5   0         0 0 
                                            
          5 −7         0   0         0 0 
      We see that the system of 4 equations in 3 unknowns has a 1-dimensional solution space.
      If we choose c3 = −9 then c1 = 7 and c2 = 5. Therefore 7 v1 + 5 v 2 − 9 v3 = 0.

23.   Because v1 and v2 are linearly independent, the vector equation

             c1u1 + c2u 2 = c1 ( v1 + v 2 ) + c2 ( v1 − v 2 ) = 0

      yields the homogeneous linear system

                               c1 + c2 = 0
                               c1 − c2 = 0.

      It follows readily that c1 = c2 = 0, and therefore that the vectors u1 and u2 are linearly
      independent.

24.   Because v1 and v2 are linearly independent, the vector equation

             c1u1 + c2u 2 = c1 ( v1 + v 2 ) + c2 (2 v1 + 3v 2 ) = 0
yields the homogeneous linear system

                                 c1 + 2c2 = 0
                                 c1 + 3c2 = 0.

       Subtraction of the first equation from the second one gives c2 = 0, and then it follows
       from the first equation that c1 = 0 also. Therefore the vectors u1 and u2 are linearly
       independent.

25.   Because the vectors v1 , v 2 , v 3 are linearly independent, the vector equation

              c1u1 + c2u 2 + c3u 3 = c1 ( v1 ) + c2 ( v1 + 2 v 2 ) + c3 ( v1 + 2 v 2 + 3v 3 ) = 0

      yields the homogeneous linear system

                                 c1 + c2 + c3 = 0
                                      2c2 + 2c3 = 0
                                             3c3 = 0.

      It follows by back-substitution that c1 = c2 = c3 = 0, and therefore that the vectors
      u1 , u 2 , u3 are linearly independent.

26.   Because the vectors v1 , v 2 , v 3 are linearly independent, the vector equation

              c1u1 + c2u 2 + c3u 3 = c1 ( v 2 + v 3 ) + c2 ( v1 + v 3 ) + c3 ( v1 + v 2 ) = 0

      yields the homogeneous linear system

                                      c2 + c3 = 0
                                 c1        + c3 = 0
                                 c1 + c2       = 0.
      The reduction
                           0 1 1    1 0 0 
                           1 0 1  →  0 1 0  = E
                       A =                 
                           1 1 0 
                                    0 0 1 
                                             

       then shows that c1 = c2 = c3 = 0, and therefore that the vectors u1 , u 2 , u3 are linearly
      independent.
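
       A brief observation (not part of the original solutions): in Problems 23-26 each uj is a
       fixed linear combination of the independent vectors vi, so the uj are independent exactly
       when the square matrix of combination coefficients is nonsingular. For Problem 26 that
       matrix is the A reduced above, so a determinant check also suffices.

```python
# Sketch: checking the coefficient matrix of Problem 26 (column j holds the
# v-coefficients of u_j, i.e. u1 = v2 + v3, u2 = v1 + v3, u3 = v1 + v2).
from sympy import Matrix

M = Matrix([[0, 1, 1],
            [1, 0, 1],
            [1, 1, 0]])

print(M.det())    # 2, nonzero, so M c = 0 forces c = 0 and u1, u2, u3 are independent
```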
27.   If the elements of S are v1, v2, …, vk with v1 = 0, then we can take c1 = 1 and
       c2 = … = ck = 0. This choice gives coefficients c1, c2, …, ck not all zero such that
       c1v1 + c2v2 + … + ckvk = 0. This means that the vectors v1, v2, …, vk are linearly
       dependent.

28.   Because the set S of vectors v1, v2, …, vk is linearly dependent, there exist scalars
       c1, c2, …, ck not all zero such that c1v1 + c2v2 + … + ckvk = 0. If ck+1 = … = cm = 0,
       then c1v1 + c2v2 + … + cmvm = 0 with the coefficients c1, c2, …, cm not all zero. This
       means that the vectors v1, v2, …, vm comprising T are linearly dependent.

29.   If some subset of S were linearly dependent, then Problem 28 would imply immediately
      that S itself is linearly dependent (contrary to hypothesis).

30.   Let W be the subspace of V spanned by the vectors v1, v2, …, vk. Because U is a
       subspace containing each of these vectors, it contains every linear combination of
       v1, v2, …, vk. But W consists solely of such linear combinations, so it follows that U
       contains W.

31.   If S is contained in span(T), then every vector in S is a linear combination of vectors in
      T. Hence every vector in span(S) is a linear combination of linear combinations of
      vectors in T. Therefore every vector in span(S) is a linear combination of vectors in T,
      and therefore is itself in span(T). Thus span(S) is a subset of span(T).

32.   If u is another vector in S, then the k+1 vectors v1, v2, …, vk, u are linearly
       dependent. Hence there exist scalars c1, c2, …, ck, c not all zero such that
       c1v1 + c2v2 + … + ckvk + cu = 0. If c = 0 then we have a contradiction to the
       hypothesis that the vectors v1, v2, …, vk are linearly independent. Therefore c ≠ 0,
       so we can solve for u as a linear combination of the vectors v1, v2, …, vk.

33.   The determinant of the k × k identity matrix is nonzero, so it follows immediately from
       Theorem 3 in this section that the vectors v1, v2, …, vk are linearly independent.

34.   If the vectors v1, v2, …, vn are linearly independent, then by Theorem 2 the matrix
       A = [v1  v2  …  vn] is nonsingular. If B is another nonsingular n × n matrix, then
      the product AB is also nonsingular, and therefore (by Theorem 2) has linearly
      independent column vectors.

35.   Because the vectors v1, v2, …, vk are linearly independent, Theorem 3 implies that some
       k × k submatrix A0 of A has nonzero determinant. Let A0 consist of the rows
       i1, i2, …, ik of the matrix A, and let C0 denote the k × k submatrix consisting of the
       same rows of the product matrix C = AB. Then C0 = A0B, so |C0| = |A0||B| ≠ 0
       because (by hypothesis) the k × k matrix B is also nonsingular. Therefore Theorem 3
       implies that the column vectors of AB are linearly independent.
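
       A small numerical sanity check of Problems 34-35 (the matrices below are arbitrary
       illustrative choices, not taken from the text):

```python
# Sketch: if the columns of A are linearly independent and B is nonsingular,
# then the columns of A*B are linearly independent as well.
from sympy import Matrix

A = Matrix([[1, 0],
            [0, 1],
            [2, 3]])           # 3 x 2 with independent columns (rank 2)
B = Matrix([[1, 1],
            [0, 1]])           # 2 x 2 nonsingular (det = 1)

C = A * B
print(A.rank(), B.det(), C.rank())   # 2  1  2  -> the columns of A*B are independent
```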
