LIST OF CONCEPTS - CHAPTER 3

Note: Theorems and page numbers refer to the Additional Lecture Notes of the corresponding sections.

Section 3.1: Vector spaces


Definition:
A Vector Space is a nonempty set, V , of objects called vectors, together with rules for adding any
two vectors x and y in V and for multiplying any vector x in V by any scalar α in R.
V must be closed under this vector addition and scalar multiplication, so that x + y and αx are
both in V . Moreover, the axioms A1-A8 on page 1 must be satisfied for all vectors x, y, and z in V
and all scalars α and β in R (you do not need to memorize the axioms).

Examples of Vector Spaces:


• Rn (the set of all vectors with n components) with the usual operations of addition and scalar
multiplication is a vector space for all positive integers n.
• Rm×n (the set of all m × n matrices) with the usual operations of addition and scalar multi-
plication is a vector space for all positive integers m and n.
• We denote by Pn the set of all polynomials of degree strictly less than n. Pn is a vector space
with the usual addition and scalar multiplication of polynomials.
• We denote by C([a, b]) the set of all continuous functions defined on the interval [a, b]. C([a, b])
is a vector space with the usual addition and scalar multiplication of functions.

Section 3.2: Subspaces


1. A subset S of a vector space V is a subspace of V if and only if it is nonempty and satisfies
the two closure properties (i) and (ii) on page 1. A subspace always contains the zero vector.
Examples of subspaces in R3 are planes through the origin (2-dimensional subspaces) and
lines through the origin (1-dimensional subspaces).
2. Let v1 , v2 , . . . , vk be vectors in a vector space V . The set of all linear combinations of these
vectors is a subspace of V called the span of the vectors and denoted by span(v1 , v2 , . . . , vk )
(pg. 7). It is the smallest subspace of V containing all the vectors vi , i = 1, . . . , k . To
determine whether a vector v is in span(v1 , v2 , . . . , vk ), set up the system v = x1 v1 +
x2 v2 + · · · + xk vk and check whether it is consistent or inconsistent. If the system is
consistent, then v is in span(v1 , v2 , . . . , vk ) (see the sketch after this list).
3. The Nullspace, N (A), of an m × n matrix A is the subspace of Rn consisting of all solutions
of the homogeneous system Ax = 0.
• To find a basis for N (A) (see the sketch after this list):
(a) Reduce the matrix A to RREF.
(b) Assign parameters to the free variables.
(c) Write the solution as a linear combination of vectors (with the parameters
from (b) as coefficients).
(d) The vectors in the linear combination form a basis of N (A).
• If you are given a vector v and need to determine whether it is in N (A), all
you have to do is evaluate the product Av. If this product is the zero vector, then v is
in N (A).

4. A set of vectors is a Spanning Set for the vector space V if any vector in V can be
written as a linear combination of the vectors in the set (pg. 8).
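
As a concrete sketch of the span-membership test in item 2 and the nullspace computation in item 3, here is how the checks could be carried out in Python with SymPy. The vectors and matrices below are invented for illustration and do not come from the lecture notes.

    from sympy import Matrix

    # Span membership (item 2): is v in span(v1, v2)?
    v1 = Matrix([1, 0, 2])
    v2 = Matrix([0, 1, 1])
    v = Matrix([2, 3, 7])

    A = Matrix.hstack(v1, v2)         # columns are the spanning vectors
    aug = A.row_join(v)               # augmented matrix [A | v]
    # The system x1*v1 + x2*v2 = v is consistent iff adjoining v to A
    # does not increase the rank.
    print(A.rank() == aug.rank())     # True here, since v = 2*v1 + 3*v2

    # Nullspace basis (item 3): nullspace() carries out steps (a)-(d).
    B = Matrix([[1, 2, 1, 0],
                [2, 4, 0, 2]])
    for w in B.nullspace():           # basis vectors of N(B)
        print(w.T)

The rank comparison is exactly the consistency check of item 2: a pivot appearing in the augmented column would signal an inconsistent system.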

Section 3.3: Linear Independence

1. Linear independence generalizes the idea of a set of vectors being collinear, coplanar,
etc. (see “Geometric Interpretation” on pg. 4). For example, the vectors [1, 1]T and [2, 2]T
are multiples of each other and are linearly dependent (they are also collinear). The vectors
[1, 0, 0]T , [1, 1, 0]T and [0, 1, 0]T are also linearly dependent since they all lie in the xy-plane.
It becomes difficult to visualize collinearity and coplanarity in more than 3 dimensions.
Instead, a simpler analytical definition is given:

Let V be a vector space:


Definition: A set of vectors {v1 , v2 , . . . , vn } in V is linearly dependent if the vector
equation
c1 v1 + c2 v2 + · · · + cn vn = 0 (1)
is satisfied with at least one ci ≠ 0.
The set is linearly independent if (1) is satisfied only when c1 = c2 = · · · = cn = 0.

To check whether a set of vectors is linearly independent or linearly dependent, one can proceed
in different ways. First note that equation (1) can be written as Ac = 0 where A is the matrix
whose columns are the vectors v1 , v2 , . . . , vn , and c = [c1 , c2 , . . . , cn ]T .

(a) One way that always works is to set up a system as in Example 1, Example 2, or
Example 3 on pages 1 and 2, and check whether the zero vector is the only solution.
If it is, then the vectors are linearly independent.
(b) Another way, which works only when we have n vectors in Rn , is to check whether the
determinant of the matrix A is zero. If it is nonzero, then the vectors are linearly
independent (see the first Example on page 3).
(c) Also remember that any set of more than n vectors in a vector space of dimension n is
linearly dependent (page 2). A worked sketch of these checks follows this list.

2. A finite list of nonzero vectors in V forms a linearly independent set if and only if no vector
in the list can be expressed as a linear combination of its predecessors (a single nonzero
vector is linearly independent).
3. The following statements are equivalent for n vectors in Rn :
(a) The vectors are linearly independent.
(b) The vectors span Rn .
(c) A matrix having the vectors as columns is nonsingular.
4. The definition of linear dependence or independence can be used to test whether a set of
polynomials in Pn is linearly independent (see Examples on pages 4 and 5). Elements of
the vector space of matrices Rm×n can also be tested for dependence or independence using
the definition. Alternatively, we can rewrite the polynomials and the matrices as vectors (as
learned in Sections 3.4 and 3.5) and then test these vectors for linear dependence or
independence.
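
The checks above are mechanical; the following minimal Python/SymPy sketch illustrates methods (a) and (b) and the coefficient-vector idea from item 4. All vectors and polynomials here are made up for the example.

    from sympy import Matrix, Poly, symbols

    # Method (b): n vectors in R^n are independent iff det(A) != 0.
    A = Matrix([[1, 1, 0],
                [0, 1, 1],
                [0, 0, 1]])           # columns are the vectors being tested
    print(A.det() != 0)               # True -> linearly independent

    # Method (a), phrased via the rank: Ac = 0 has only the zero solution
    # iff the rank of A equals its number of columns.
    B = Matrix([[1, 2],
                [1, 2],
                [0, 0]])
    print(B.rank() == B.shape[1])     # False -> the columns are dependent

    # Item 4: test polynomials in P3 through their coefficient vectors.
    x = symbols('x')
    polys = [1 + x, 1 - x, 2 + x**2]
    def coeff_vector(p, n=3):
        c = Poly(p, x).all_coeffs()   # coefficients, highest degree first
        return [0] * (n - len(c)) + c # pad to length n (degree < n)
    C = Matrix([coeff_vector(p) for p in polys]).T   # columns = coeff vectors
    print(C.rank() == len(polys))     # True -> independent in P3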

Section 3.4: Basis and Dimension
Let V be a vector space:
1. A set of vectors is a basis of V if it is a linearly independent set of vectors and V is spanned
by the set.
2. If {v1 , v2 , . . . , vn } is a basis for V , then each vector in V can be expressed in the form

v = α1 v1 + α2 v2 + · · · + αn vn

for unique scalars α1 , α2 , . . . , αn


3. A basis of V is a minimal spanning set of V and a maximal set of linearly independent
vectors in V .
4. Any finite spanning set of V can be reduced (if necessary) to a basis of V by deleting the zero
vectors and then casting out those vectors that are linear combinations of their predecessors.
See Section 3.6, item 5 below, for a casting-out technique in Rn that can be executed efficiently.
5. All bases of V have the same number of vectors. This number is called the dimension of V
and denoted dim(V ).
6. If dim(V ) = n, then any set of n linearly independent vectors in V is a basis of V (a sketch in Pn follows this list).
7. See the homework and lecture notes for examples of bases of subspaces in Rm×n and Pn
(Examples 1, 2 on pg. 5, Examples 5, 6, 7, 8 on pg. 6, 7, 8).
8. Read the Remark on page 8 about the relation between the dimension and the degrees of
freedom.
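
As a sketch of items 6 and 7 combined: dim(P3 ) = 3, so by item 6 any three linearly independent polynomials in P3 form a basis. The candidate polynomials below are invented for illustration, and the independence test reuses the coefficient-vector idea from Section 3.3.

    from sympy import Matrix, Poly, symbols

    x = symbols('x')
    candidates = [1, 1 + x, (1 + x)**2]   # three polynomials in P3
    def coeff_vector(p, n=3):
        c = Poly(p, x).all_coeffs()       # coefficients, highest degree first
        return [0] * (n - len(c)) + c     # pad to length n
    M = Matrix([coeff_vector(p) for p in candidates]).T
    # Rank 3 -> linearly independent; since dim(P3) = 3, item 6 says the
    # three polynomials form a basis of P3.
    print(M.rank() == 3)                  # True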

Section 3.5: Change of Basis


Let V be a vector space with the ordered basis V = {v1 , v2 , . . . , vn }.

1. Each vector x in V has a unique representation as a linear combination:

x = α1 v1 + α2 v2 + · · · + αn vn .

The vector [x]V = [α1 , α2 , . . . , αn ]T , for the uniquely determined scalars αi in the previous
equation, is the coordinate vector of x relative to the basis V. The vector x can be written
as
x = V [x]V
where V is the matrix whose columns are the basis vectors. V is called the transition matrix
from the basis V to the standard basis.
2. Let V be the basis given in item 1 and let U = {u1 , u2 , . . . , un } be another ordered basis of Rn .
The important equation that relates these bases and the corresponding coordinate vectors is

U [x]U = V [x]V

where U is the transition matrix from U to the standard basis (i.e. the matrix whose columns
are the vectors ui ) and V is the transition matrix from V to the standard basis (i.e. the matrix
whose columns are the vectors vi ).

Solving for [x]U gives [x]U = U −1 V [x]V , and S = U −1 V is called the transition matrix
from the basis V to the basis U.
Similarly, solving for [x]V gives [x]V = V −1 U [x]U , and T = V −1 U is called the transition
matrix from the basis U to the basis V (a computational sketch follows this list).

3. See the lecture notes and the homework for examples of coordinate vectors in Rm×n and in Pn
(page 4, 5, and 6).
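
A minimal sketch of item 2 in R2 with SymPy; the two bases are invented, and the final check confirms the defining relation U [x]U = V [x]V .

    from sympy import Matrix

    U = Matrix([[1, 1],
                [0, 1]])              # columns are the basis vectors u1, u2
    V = Matrix([[1, 0],
                [1, 1]])              # columns are the basis vectors v1, v2

    S = U.inv() * V                   # transition matrix from basis V to basis U
    x_V = Matrix([2, 3])              # coordinates [x]_V of some x relative to V
    x_U = S * x_V                     # coordinates [x]_U of the same vector x

    # Both coordinate vectors describe the same x: U [x]_U = V [x]_V.
    print(U * x_U == V * x_V)         # True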

Section 3.6: Row Space and Column Space


1. Let A be an m × n matrix.
• The Row Space of A, denoted by R(AT ), is the subspace of Rn spanned by the row
vectors of A.
• The Column Space of A, denoted by R(A), is the subspace of Rm spanned by the
column vectors of A.
2. To determine a basis for the row space of a matrix A, proceed as follows:
(a) Reduce A to echelon form U .
(b) The nonzero rows of U constitute a basis for the row space of A.
(Example page 1)
3. To determine a basis for R(A), the column space of A, proceed as follows:
(a) Reduce A to echelon form U .
(b) The columns of A corresponding to the columns of U containing the pivots form a basis
of R(A).
4. Consistency Theorem: The system Ax = b is consistent if and only if b lies in the column
space of A. This follows from the fact that Ax = b is equivalent to x1 a1 +x2 a2 +· · ·+xn an = b,
where a1 , a2 , . . . , an are the columns of the matrix A (page 2).
5. Given a set of n vectors {v1 , v2 , . . . , vn }, we can determine a basis for span(v1 , v2 , . . . , vn )
(i.e. we can eliminate vectors that are linear combinations of their predecessors) by proceeding
as follows (a sketch follows this list):
(a) Form the matrix A with columns v1 , v2 , . . . , vn .
(b) Reduce A to reduced row echelon form (RREF).
(c) Determine which columns of the RREF contain the pivots. The corresponding columns of A
form a basis of the column space of A and therefore a basis of span(v1 , v2 , . . . , vn ).
(first Example on page 6)
6. The dimension of the row space (which is the same as the dimension of the column space) of
an m × n matrix A is called the rank of A. The dimension of the nullspace of A is called the
nullity of A.
7. Rank - Nullity Theorem: The nullity of A plus the rank of A equals the number of columns
of A.
8. An n × n matrix A is invertible if and only if rank(A) = n.
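
Most of the procedures in this section come down to a single RREF computation. A minimal Python/SymPy sketch over an invented 3 × 4 matrix, tying together items 2, 3, 5, and 7:

    from sympy import Matrix

    A = Matrix([[1, 2, 0, 1],
                [2, 4, 1, 1],
                [3, 6, 1, 2]])

    R, pivots = A.rref()              # RREF of A and the pivot column indices
    r = A.rank()

    # Items 2 and 5: the nonzero rows of the RREF are a basis of the row
    # space; the pivot columns of A itself are a basis of the column space
    # (and hence of the span of A's columns).
    row_basis = [R.row(i) for i in range(r)]
    col_basis = [A.col(j) for j in pivots]

    # Item 7 (Rank-Nullity): rank(A) + nullity(A) = number of columns of A.
    print(r + len(A.nullspace()) == A.cols)   # True: 2 + 2 == 4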
