
Mathematics-3

Section (5)

Faculty of Information Technology


Egyptian E-Learning University

Fall 2023-2024
Linear Combination:
A vector u is a linear combination of the vectors v1, v2, …, vm if

u = c1v1 + c2v2 + … + cmvm

where c1, c2, …, cm are scalars.
Ex 1
Let v1 = (1, 2, 3), v2 = (1, 0, 2).
Express u = (−1, 2, −1) as a linear combination of v1 and v2
Solution:
We must find scalars a1 and a2 such that u = a1v1 + a2v2.
Thus
a1 + a2 = −1
2a1 + 0a2 = 2
3a1 + 2a2 = −1
This is 3 equations in the 2 unknowns a1, a2. Solving for a1, a2: the second equation gives a1 = 1, and substituting into the first gives a2 = −2. The third equation is then satisfied: 3(1) + 2(−2) = −1.

This system has a solution, a1 = 1, a2 = −2, so u can be expressed as a linear combination of v1 and v2: u = v1 − 2v2. That is, u does lie in the plane generated by v1 and v2.
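The same computation can be checked numerically. Below is a minimal sketch (an addition, not part of the slides) that solves the overdetermined 3 × 2 system with NumPy's least-squares routine; a residual of zero confirms that u lies in the span of v1 and v2:

```python
import numpy as np

# Columns of A are v1 and v2; we solve A @ [a1, a2] = u.
A = np.array([[1.0, 1.0],
              [2.0, 0.0],
              [3.0, 2.0]])
u = np.array([-1.0, 2.0, -1.0])

# lstsq handles the overdetermined system; a (numerically) zero
# residual means u is an exact linear combination of v1 and v2.
coeffs, residual, rank, _ = np.linalg.lstsq(A, u, rcond=None)
print(coeffs)    # [ 1. -2.]  ->  u = 1*v1 - 2*v2
print(residual)  # ~0
```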
Linear Independence and Dependence of Vectors
Given any set of m vectors v1, …, vm (with the same number of components), a linear combination of these vectors is an expression of the form

c1v1 + c2v2 + … + cmvm

where c1, c2, …, cm are any scalars. Now consider the equation

c1v1 + c2v2 + … + cmvm = 0    (1)

Clearly, this vector equation (1) holds if we choose all cj's zero, because then it becomes 0 = 0. If this is the only m-tuple of scalars for which (1) holds, then our vectors v1, …, vm are said to form a linearly independent set or, more briefly, we call them linearly independent. Otherwise, if (1) also holds with scalars not all zero, we call these vectors linearly dependent.

This means that we can express at least one of the vectors as a linear combination of the other vectors. For instance, if (1) holds with, say, c1 ≠ 0, we can solve (1) for v1:

v1 = k2v2 + … + kmvm,  where kj = −cj/c1.

(Some kj's may be zero. Or even all of them, namely, if v1 = 0.)

1. A set V of vectors is called linearly independent if the only linear combination of vectors in V that equals 0 is the trivial linear combination (c1 = c2 = … = ck = 0).
2. A set V of vectors is called linearly dependent if there is a nontrivial linear combination of vectors in V that equals 0.
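In practice, independence of a concrete set of vectors can be tested by computing a matrix rank (defined in the next section). The following small sketch, assuming NumPy is available, stacks the vectors as rows and compares the rank with the number of vectors; note that matrix_rank uses an SVD internally rather than Gauss-Jordan, but the criterion is the same:

```python
import numpy as np

def is_linearly_independent(vectors):
    """Stack the vectors as rows; they are independent exactly
    when the rank of the stacked matrix equals their count."""
    M = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(M) == len(vectors)

print(is_linearly_independent([[1, 2, 3], [1, 0, 2]]))             # True
print(is_linearly_independent([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # False: third = first + second
```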
The Rank of a Matrix

Let A be any m × n matrix, and denote the rows of A by r1, r2, …, rm.
Define R = {r1, r2, …, rm}.
The rank of A is the number of vectors in the largest linearly independent subset of R.

To find the rank of matrix A, apply the Gauss-Jordan method to A, and let A' be the final result. It can be shown that rank A' = rank A.
The rank of A' = the number of nonzero rows in A'.
Therefore, rank A = rank A' = the number of nonzero rows in A'.
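The procedure above translates directly into code. This is a minimal illustrative implementation (an addition, not from the slides) of Gauss-Jordan elimination that returns the number of nonzero rows in the reduced matrix, i.e. the rank:

```python
import numpy as np

def rank_via_gauss_jordan(A, tol=1e-10):
    """Row-reduce a copy of A and count the nonzero rows,
    following the procedure above. A is any m x n array-like."""
    M = np.array(A, dtype=float)
    m, n = M.shape
    row = 0
    for col in range(n):
        # Find the largest pivot candidate at or below `row`.
        pivot = row + np.argmax(np.abs(M[row:, col]))
        if abs(M[pivot, col]) < tol:
            continue                        # no pivot in this column
        M[[row, pivot]] = M[[pivot, row]]   # swap the pivot row up
        M[row] /= M[row, col]               # scale the pivot to 1
        for r in range(m):                  # eliminate everywhere else
            if r != row:
                M[r] -= M[r, col] * M[row]
        row += 1
        if row == m:
            break
    return row  # number of nonzero rows in A' = rank A

print(rank_via_gauss_jordan([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # 2
```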

Given V = {[1 0 0], [0 1 0], [1 1 0]}, find the rank of the matrix A whose rows are the vectors in V.

Solution:
Form matrix A from the vectors in V:

        [ 1  0  0 ]
    A = [ 0  1  0 ]
        [ 1  1  0 ]

After the Gauss-Jordan method:

         [ 1  0  0 ]
    A' = [ 0  1  0 ]
         [ 0  0  0 ]

rank A = rank A' = 2 (the number of nonzero rows in A').
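The same answer can be checked with SymPy, whose rref() method performs exact Gauss-Jordan elimination (this verification is an addition, not part of the slides):

```python
import sympy as sp

A = sp.Matrix([[1, 0, 0],
               [0, 1, 0],
               [1, 1, 0]])

# rref() performs Gauss-Jordan elimination exactly (no rounding);
# it returns the reduced matrix and the pivot column indices.
A_rref, pivots = A.rref()
print(A_rref)       # Matrix([[1, 0, 0], [0, 1, 0], [0, 0, 0]])
print(len(pivots))  # 2 = number of nonzero rows = rank A
```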
