
Lecture 2: Linear Algebra

Yi, Yung (이융)

Mathematics for Machine Learning


https://yung-web.github.io/home/courses/mathml.html
KAIST EE

April 8, 2021

April 8, 2021 1 / 56
Roadmap

(1) Systems of Linear Equations


(2) Matrices
(3) Solving Systems of Linear Equations
(4) Vector Spaces
(5) Linear Independence
(6) Basis and Rank
(7) Linear Mappings
(8) Affine Spaces

April 8, 2021 2 / 56
Roadmap

(1) Systems of Linear Equations


(2) Matrices
(3) Solving Systems of Linear Equations
(4) Vector Spaces
(5) Linear Independence
(6) Basis and Rank
(7) Linear Mappings
(8) Affine Spaces

L2(1) April 8, 2021 3 / 56


Linear Algebra

• Algebra: a set of objects and a set of rules or operations to manipulate those objects
• Linear algebra
◦ Object: vectors v
◦ Operations: their addition (v + w) and scalar multiplication (kv)
• Examples
◦ Geometric vectors
- High school physics
◦ Polynomials
◦ Audio signals
◦ Elements of Rn

L2(1) April 8, 2021 4 / 56


System of Linear Equations

• For unknown variables (x1 , · · · , xn ) ∈ Rn ,


\[
\begin{aligned}
a_{11}x_1 + \cdots + a_{1n}x_n &= b_1\\
&\ \ \vdots\\
a_{m1}x_1 + \cdots + a_{mn}x_n &= b_m
\end{aligned}
\]
• Three cases of solutions
◦ No solution: x1 + x2 + x3 = 3, x1 − x2 + 2x3 = 2, 2x1 + 3x3 = 1
◦ Unique solution: x1 + x2 + x3 = 3, x1 − x2 + 2x3 = 2, x2 + 3x3 = 1
◦ Infinitely many solutions: x1 + x2 + x3 = 3, x1 − x2 + 2x3 = 2, 2x1 + 3x3 = 5

• Question. Under what conditions does each of the above three cases occur?
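One way to make the question concrete is to compare rk(A), rk(A|b), and the number of unknowns n, the criterion stated formally later in the Basis and Rank part. A minimal sketch (assuming NumPy is available) on the three systems above:

```python
import numpy as np

# Classify each example system via the rank criterion: rk(A) vs rk([A|b]) vs n.
A = np.array([[1., 1., 1.],
              [1., -1., 2.],
              [2., 0., 3.]])
A2 = np.array([[1., 1., 1.],
               [1., -1., 2.],
               [0., 1., 3.]])
systems = [("first system", A, np.array([3., 2., 1.])),
           ("second system", A2, np.array([3., 2., 1.])),
           ("third system", A, np.array([3., 2., 5.]))]

for label, M, b in systems:
    r_A = np.linalg.matrix_rank(M)
    r_Ab = np.linalg.matrix_rank(np.column_stack([M, b]))
    n = M.shape[1]
    if r_A < r_Ab:
        verdict = "no solution"
    elif r_A == n:
        verdict = "unique solution"
    else:
        verdict = "infinitely many solutions"
    print(f"{label}: rk(A)={r_A}, rk(A|b)={r_Ab}, n={n} -> {verdict}")
```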

L2(1) April 8, 2021 5 / 56


Matrix Representation
• A collection of linear equations
\[
\begin{aligned}
a_{11}x_1 + \cdots + a_{1n}x_n &= b_1\\
&\ \ \vdots\\
a_{m1}x_1 + \cdots + a_{mn}x_n &= b_m
\end{aligned}
\]
• Matrix representations:
          
\[
\begin{bmatrix} a_{11}\\ \vdots\\ a_{m1} \end{bmatrix} x_1 + \cdots +
\begin{bmatrix} a_{1n}\\ \vdots\\ a_{mn} \end{bmatrix} x_n =
\begin{bmatrix} b_1\\ \vdots\\ b_m \end{bmatrix}
\iff
\underbrace{\begin{bmatrix} a_{11} & \cdots & a_{1n}\\ \vdots & & \vdots\\ a_{m1} & \cdots & a_{mn} \end{bmatrix}}_{A}
\underbrace{\begin{bmatrix} x_1\\ \vdots\\ x_n \end{bmatrix}}_{x}
=
\underbrace{\begin{bmatrix} b_1\\ \vdots\\ b_m \end{bmatrix}}_{b}
\]
• Understanding A is the key to answering various questions about this linear system
Ax = b.
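A small NumPy sketch (the random A and x are assumptions for illustration) showing that the two representations above agree: Ax is exactly the linear combination of the columns of A weighted by the entries of x.

```python
import numpy as np

# Sketch: A @ x equals the linear combination of A's columns weighted by x.
rng = np.random.default_rng(0)                 # arbitrary example data (assumption)
A = rng.integers(-3, 4, size=(3, 4)).astype(float)
x = rng.integers(-3, 4, size=4).astype(float)

column_combination = sum(x[j] * A[:, j] for j in range(A.shape[1]))
assert np.allclose(A @ x, column_combination)
```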

L2(1) April 8, 2021 6 / 56


Roadmap

(1) Systems of Linear Equations


(2) Matrices
(3) Solving Systems of Linear Equations
(4) Vector Spaces
(5) Linear Independence
(6) Basis and Rank
(7) Linear Mappings
(8) Affine Spaces

L2(2) April 8, 2021 7 / 56


Matrix: Addition and Multiplication
• For two matrices A ∈ Rm×n and B ∈ Rm×n ,
 
\[
A + B := \begin{bmatrix} a_{11}+b_{11} & \cdots & a_{1n}+b_{1n}\\ \vdots & & \vdots\\ a_{m1}+b_{m1} & \cdots & a_{mn}+b_{mn} \end{bmatrix} \in \mathbb{R}^{m \times n}
\]
• For two matrices A ∈ Rm×n and B ∈ Rn×k, the elements c_ij of the product C = AB ∈ Rm×k are given by
\[
c_{ij} = \sum_{l=1}^{n} a_{il} b_{lj}, \qquad i = 1, \dots, m,\; j = 1, \dots, k.
\]
 
• Example. For
\[
A = \begin{bmatrix} 1 & 2 & 3\\ 3 & 2 & 1 \end{bmatrix}, \qquad
B = \begin{bmatrix} 0 & 2\\ 1 & -1\\ 0 & 1 \end{bmatrix},
\]
compute AB and BA.
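A quick NumPy check of this example (a sketch; the values follow from the sum formula above):

```python
import numpy as np

# The slide's example: AB is 2x2 while BA is 3x3, so AB != BA in general.
A = np.array([[1, 2, 3],
              [3, 2, 1]])
B = np.array([[0, 2],
              [1, -1],
              [0, 1]])
print(A @ B)   # [[2 3]
               #  [2 5]]
print(B @ A)   # [[ 6  4  2]
               #  [-2  0  2]
               #  [ 3  2  1]]
```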

L2(2) April 8, 2021 8 / 56


Identity Matrix and Matrix Properties
• A square matrix¹ I_n with I_ii = 1 and I_ij = 0 for i ≠ j, where n is the number of rows
and columns. For example,
 
\[
I_2 = \begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix}, \qquad
I_4 = \begin{bmatrix} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1 \end{bmatrix}
\]

• Associativity: For A ∈ Rm×n , B ∈ Rn×p , C ∈ Rp×q , (AB)C = A(BC )


• Distributivity: For A, B ∈ Rm×n , and C , D ∈ Rn×p ,
(i) (A + B)C = AC + BC and (ii) A(C + D) = AC + AD
• Multiplication with the identity matrix: For A ∈ Rm×n , Im A = AIn = A

¹ # of rows = # of cols
L2(2) April 8, 2021 9 / 56
Inverse and Transpose
• Inverse. For a square matrix A ∈ Rn×n, a matrix B ∈ Rn×n is the inverse of A, denoted by A⁻¹, if AB = I_n = BA.
◦ A is called regular/invertible/nonsingular if the inverse exists.
◦ If the inverse exists, it is unique.
◦ How to compute? For a 2 × 2 matrix,
\[
A^{-1} = \frac{1}{a_{11}a_{22} - a_{12}a_{21}} \begin{bmatrix} a_{22} & -a_{12}\\ -a_{21} & a_{11} \end{bmatrix}
\]
• Transpose. For a matrix A ∈ Rm×n, the matrix B ∈ Rn×m with b_ij = a_ji is the transpose of A, denoted by Aᵀ.
◦ Example. For
\[
A = \begin{bmatrix} 0 & 2\\ 1 & -1\\ 0 & 1 \end{bmatrix}, \qquad
A^{\mathsf T} = \begin{bmatrix} 0 & 1 & 0\\ 2 & -1 & 1 \end{bmatrix}
\]
◦ If A = Aᵀ, A is called symmetric.
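A sketch (the 2 × 2 matrix M is an arbitrary assumption) checking the closed-form inverse against numpy.linalg.inv, plus the transpose of the example A:

```python
import numpy as np

def inv_2x2(M):
    """2x2 inverse via the closed-form formula from this slide."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c          # must be nonzero for the inverse to exist
    return np.array([[d, -b],
                     [-c, a]]) / det

M = np.array([[1., 2.],
              [3., 4.]])         # arbitrary invertible 2x2 matrix (assumption)
assert np.allclose(inv_2x2(M), np.linalg.inv(M))

A = np.array([[0, 2],
              [1, -1],
              [0, 1]])
print(A.T)                       # [[ 0  1  0]
                                 #  [ 2 -1  1]]
```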

L2(2) April 8, 2021 10 / 56


Inverse and Transpose: More Properties
• AA⁻¹ = I = A⁻¹A

• (AB)⁻¹ = B⁻¹A⁻¹

• (A + B)⁻¹ ≠ A⁻¹ + B⁻¹ (in general)

• (Aᵀ)ᵀ = A

• (A + B)ᵀ = Aᵀ + Bᵀ

• (AB)ᵀ = BᵀAᵀ

• If A is invertible, so is Aᵀ.
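A numerical spot-check of two of these identities (the random matrices are an assumption; almost every random square matrix is invertible):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# (AB)^{-1} = B^{-1} A^{-1}  and  (AB)^T = B^T A^T
assert np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A))
assert np.allclose((A @ B).T, B.T @ A.T)
```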

L2(2) April 8, 2021 11 / 56


Scalar Multiplication
• Multiplication by a scalar λ ∈ R to A ∈ Rm×n
   
• Example. For
\[
A = \begin{bmatrix} 0 & 2\\ 1 & -1\\ 0 & 1 \end{bmatrix}, \qquad
3A = \begin{bmatrix} 0 & 6\\ 3 & -3\\ 0 & 3 \end{bmatrix}
\]

• Associativity
◦ (λψ)C = λ(ψC )
◦ λ(BC ) = (λB)C = B(λC ) = (BC )λ
◦ (λC)ᵀ = Cᵀλᵀ = Cᵀλ = λCᵀ
• Distributivity
◦ (λ + ψ)C = λC + ψC
◦ λ(B + C ) = λB + λC

L2(2) April 8, 2021 12 / 56


Roadmap

(1) Systems of Linear Equations


(2) Matrices
(3) Solving Systems of Linear Equations
(4) Vector Spaces
(5) Linear Independence
(6) Basis and Rank
(7) Linear Mappings
(8) Affine Spaces

L2(3) April 8, 2021 13 / 56


Example
• ρᵢ denotes the i-th equation (row).
• System:
\[
\begin{aligned}
-3x \phantom{- 2y} + 2z &= -1\\
x - 2y + 2z &= -5/3\\
-x - 4y + 6z &= -13/3
\end{aligned}
\]
• Express the system as its augmented matrix and row-reduce:
\[
\begin{bmatrix} -3 & 0 & 2 & -1\\ 1 & -2 & 2 & -5/3\\ -1 & -4 & 6 & -13/3 \end{bmatrix}
\;\xrightarrow{\substack{(1/3)\rho_1 + \rho_2\\ -(1/3)\rho_1 + \rho_3}}\;
\begin{bmatrix} -3 & 0 & 2 & -1\\ 0 & -2 & 8/3 & -2\\ 0 & -4 & 16/3 & -4 \end{bmatrix}
\;\xrightarrow{-2\rho_2 + \rho_3}\;
\begin{bmatrix} -3 & 0 & 2 & -1\\ 0 & -2 & 8/3 & -2\\ 0 & 0 & 0 & 0 \end{bmatrix}
\]

The two nonzero rows give −3x + 2z = −1 and −2y + (8/3)z = −2.

¹ Examples on this and the next several slides come from Jim Hefferon's Linear Algebra book.
L2(3) April 8, 2021 14 / 56
• Parametrizing −3x + 2z = −1 and −2y + (8/3)z = −2 gives:
\[
\begin{aligned}
x &= 1/3 + (2/3)z\\
y &= 1 + (4/3)z\\
z &= z
\end{aligned}
\qquad\Longleftrightarrow\qquad
\left\{ \begin{bmatrix} x\\ y\\ z \end{bmatrix} = \begin{bmatrix} 1/3\\ 1\\ 0 \end{bmatrix} + \begin{bmatrix} 2/3\\ 4/3\\ 1 \end{bmatrix} z \;\middle|\; z \in \mathbb{R} \right\}
\]

This helps us understand the set of solutions: each value of z gives a different solution.

  z           0      1      2      −1/2
  x          1/3     1     5/3      0
  y           1     7/3    11/3    1/3
  z           0      1      2      −1/2
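A SymPy sketch (assuming SymPy is available) that reproduces the hand reduction above and identifies z as the free variable:

```python
from sympy import Matrix, Rational

# Augmented matrix of the system; rref() returns the reduced form and the pivot columns.
M = Matrix([[-3,  0, 2, -1],
            [ 1, -2, 2, Rational(-5, 3)],
            [-1, -4, 6, Rational(-13, 3)]])
R, pivots = M.rref()
print(R)        # rows encode x - (2/3) z = 1/3 and y - (4/3) z = 1
print(pivots)   # (0, 1): the x and y columns are pivots, z is free
```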

L2(3) April 8, 2021 15 / 56


Form of solution sets

• The system
\[
\begin{aligned}
x + 2y - z \phantom{{}+ w} &= 2\\
2x - y - 2z + w &= 5
\end{aligned}
\]
reduces in this way:
\[
\begin{bmatrix} 1 & 2 & -1 & 0 & 2\\ 2 & -1 & -2 & 1 & 5 \end{bmatrix}
\;\xrightarrow{-2\rho_1 + \rho_2}\;
\begin{bmatrix} 1 & 2 & -1 & 0 & 2\\ 0 & -5 & 0 & 1 & 1 \end{bmatrix}
\]
• It has solutions of this form:
\[
\begin{bmatrix} x\\ y\\ z\\ w \end{bmatrix}
= \begin{bmatrix} 12/5\\ -1/5\\ 0\\ 0 \end{bmatrix}
+ \begin{bmatrix} 1\\ 0\\ 1\\ 0 \end{bmatrix} z
+ \begin{bmatrix} -2/5\\ 1/5\\ 0\\ 1 \end{bmatrix} w,
\qquad z, w \in \mathbb{R}
\]
• Note that taking z = w = 0 shows that the first vector is a particular solution of
the system.

L2(3) April 8, 2021 16 / 56


General = Particular + Homogeneous

• General approach
1. Find a particular solution to Ax = b.
2. Find all solutions to the homogeneous equation Ax = 0.
◦ 0 is always a (trivial) solution.
3. Combine the solutions from steps 1 and 2 into the general solution.


• Question: is there a formal algorithm that performs the above?
◦ Gauss-Jordan method: convert the system into a “beautiful” form
(formally, reduced row-echelon form)
◦ Elementary transformations: (i) swapping two rows, (ii) multiplying a row by a nonzero
constant, (iii) adding a multiple of one row to another
• Such a form allows an algorithmic way of solving linear equations
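A sketch of "general = particular + homogeneous" for the 2 × 4 system from the "Form of solution sets" slide (assuming NumPy and SciPy; lstsq and null_space stand in for the hand computation):

```python
import numpy as np
from scipy.linalg import null_space

# x + 2y - z = 2 and 2x - y - 2z + w = 5
A = np.array([[1.,  2., -1., 0.],
              [2., -1., -2., 1.]])
b = np.array([2., 5.])

x_particular, *_ = np.linalg.lstsq(A, b, rcond=None)  # one particular solution
N = null_space(A)                                      # basis of {x : Ax = 0}, here 2 vectors

# particular + any homogeneous combination still solves Ax = b
x_general = x_particular + N @ np.array([0.7, -1.3])
assert np.allclose(A @ x_general, b)
```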

L2(3) April 8, 2021 17 / 56


Example: Unique Solution
• Start as usual by getting echelon form:
\[
\begin{array}{r}
x + y - z = 2\\
2x - y \phantom{{}+ 2z} = -1\\
x - 2y + 2z = -1
\end{array}
\;\xrightarrow{\substack{-2\rho_1 + \rho_2\\ -\rho_1 + \rho_3}}\;
\begin{array}{r}
x + y - z = 2\\
-3y + 2z = -5\\
-3y + 3z = -3
\end{array}
\;\xrightarrow{-\rho_2 + \rho_3}\;
\begin{array}{r}
x + y - z = 2\\
-3y + 2z = -5\\
z = 2
\end{array}
\]
• Make all the leading entries one:
\[
\xrightarrow{(-1/3)\rho_2}\;
\begin{array}{r}
x + y - z = 2\\
y - (2/3)z = 5/3\\
z = 2
\end{array}
\]
• Finish by using the leading entries to eliminate upwards, until we can read off the solution:
\[
\xrightarrow{\substack{\rho_3 + \rho_1\\ (2/3)\rho_3 + \rho_2}}\;
\begin{array}{r}
x + y = 4\\
y = 3\\
z = 2
\end{array}
\;\xrightarrow{-\rho_2 + \rho_1}\;
\begin{array}{r}
x = 1\\
y = 3\\
z = 2
\end{array}
\]

L2(3) April 8, 2021 18 / 56


Example: Infinite Number of Solutions
• System:
\[
\begin{aligned}
x - y \phantom{{}+ 3z} - 2w &= 2\\
x + y + 3z + w &= 1\\
- y + z - w &= 0
\end{aligned}
\]
• Start by getting echelon form and turn the leading entries to 1's:
\[
\begin{bmatrix} 1 & -1 & 0 & -2 & 2\\ 1 & 1 & 3 & 1 & 1\\ 0 & -1 & 1 & -1 & 0 \end{bmatrix}
\;\xrightarrow{-\rho_1 + \rho_2}\;
\begin{bmatrix} 1 & -1 & 0 & -2 & 2\\ 0 & 2 & 3 & 3 & -1\\ 0 & -1 & 1 & -1 & 0 \end{bmatrix}
\;\xrightarrow{(1/2)\rho_2 + \rho_3}\;
\begin{bmatrix} 1 & -1 & 0 & -2 & 2\\ 0 & 2 & 3 & 3 & -1\\ 0 & 0 & 5/2 & 1/2 & -1/2 \end{bmatrix}
\;\xrightarrow{\substack{(1/2)\rho_2\\ (2/5)\rho_3}}\;
\begin{bmatrix} 1 & -1 & 0 & -2 & 2\\ 0 & 1 & 3/2 & 3/2 & -1/2\\ 0 & 0 & 1 & 1/5 & -1/5 \end{bmatrix}
\]
• Eliminate upwards:
\[
\xrightarrow{-(3/2)\rho_3 + \rho_2}\;
\begin{bmatrix} 1 & -1 & 0 & -2 & 2\\ 0 & 1 & 0 & 6/5 & -1/5\\ 0 & 0 & 1 & 1/5 & -1/5 \end{bmatrix}
\;\xrightarrow{\rho_2 + \rho_1}\;
\begin{bmatrix} 1 & 0 & 0 & -4/5 & 9/5\\ 0 & 1 & 0 & 6/5 & -1/5\\ 0 & 0 & 1 & 1/5 & -1/5 \end{bmatrix}
\]
• The parameterized solution set is:
\[
\left\{ \begin{bmatrix} 9/5\\ -1/5\\ -1/5\\ 0 \end{bmatrix}
+ \begin{bmatrix} 4/5\\ -6/5\\ -1/5\\ 1 \end{bmatrix} w \;\middle|\; w \in \mathbb{R} \right\}
\]

L2(3) April 8, 2021 19 / 56


Cases of Solution Sets

                                number of solutions of the homogeneous system
                                one                       infinitely many
  particular solution    yes    unique solution           infinitely many solutions
  exists?                no     no solutions              no solutions

L2(3) April 8, 2021 20 / 56


Algorithms for Solving System of Linear Equations
1. Pseudo-inverse (a short sketch follows this list)
\[
Ax = b \iff A^{\mathsf T}Ax = A^{\mathsf T}b \iff x = (A^{\mathsf T}A)^{-1}A^{\mathsf T}b
\]
◦ (AᵀA)⁻¹Aᵀ: Moore-Penrose pseudo-inverse
◦ many computations: matrix product, inverse, etc.
2. Gaussian elimination
◦ intuitive and constructive way
◦ cubic complexity (in terms of # of simultaneous equations)
3. Iterative methods
◦ practical ways to solve indirectly
(a) stationary iterative methods: Richardson method, Jacobi method, Gauss-Seidel method,
successive over-relaxation method
(b) Krylov subspace methods: conjugate gradients, generalized minimal residual,
biconjugate gradients
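A minimal sketch of method 1 for a square, invertible A (the random data is an assumption), compared against a direct solver:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))   # assumed invertible (true almost surely)
b = rng.standard_normal(4)

x_pinv  = np.linalg.inv(A.T @ A) @ A.T @ b   # x = (A^T A)^{-1} A^T b
x_solve = np.linalg.solve(A, b)              # elimination-based direct solve
assert np.allclose(x_pinv, x_solve)
```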

L2(3) April 8, 2021 21 / 56


Roadmap

(1) Systems of Linear Equations


(2) Matrices
(3) Solving Systems of Linear Equations
(4) Vector Spaces
(5) Linear Independence
(6) Basis and Rank
(7) Linear Mappings
(8) Affine Spaces

L2(4) April 8, 2021 22 / 56


Group

• A set G and an operation ⊗ : G × G → G. G := (G, ⊗) is called a group if:


1. Closure. ∀x, y ∈ G, x ⊗ y ∈ G
2. Associativity. ∀x, y , z ∈ G, (x ⊗ y ) ⊗ z = x ⊗ (y ⊗ z)
3. Neutral element. ∃e ∈ G, ∀x ∈ G, x ⊗ e = x and e ⊗ x = x
4. Inverse element. ∀x ∈ G, ∃y ∈ G, x ⊗ y = e and y ⊗ x = e. We often use x −1 = y .

• G = (G, ⊗) is an Abelian group, if the following is additionally met:


◦ Commutativity. ∀x, y ∈ G, x ⊗ y = y ⊗ x

L2(4) April 8, 2021 23 / 56


Examples
• (Z, +) is an Abelian group
• (N ∪ {0}, +) is not a group (because inverses are missing)
• (Z, ·) is not a group
• (R, ·) is not a group (because of no inverse for 0)
• (Rn , +), (Zn , +) are Abelian, if + is defined componentwise
• (Rm×n , +) is Abelian (with componentwise +)
• (Rn×n , ·)
◦ Closure and associativity follow directly
◦ Neutral element: In
◦ The inverse A−1 may exist or not. So, generally, it is not a group. However, the set of
invertible matrices in Rn×n with matrix multiplication is a group, called general linear
group.

L2(4) April 8, 2021 24 / 56


Vector Spaces
Definition. A real-valued vector space V = (V, +, ·) is a set V with two operations
(a) + : V × V → V (vector addition)
(b) · : R × V → V (scalar multiplication),
where
1. (V, +) is an Abelian group
2. Distributivity.
◦ ∀λ ∈ R, x, y ∈ V: λ · (x + y) = λ · x + λ · y
◦ ∀λ, ψ ∈ R, x ∈ V: (λ + ψ) · x = λ · x + ψ · x

3. Associativity. ∀λ, ψ ∈ R, x ∈ V, λ · (ψ · x) = (λψ) · x


4. Neutral element. ∀x ∈ V, 1 · x = x

L2(4) April 8, 2021 25 / 56


Example

• V = Rn with
◦ Vector addition: x + y = (x1 + y1 , . . . , xn + yn )
◦ Scalar multiplication: λx = (λx1 , . . . , λxn )

• V = Rm×n with
◦ Vector addition:
\[
A + B = \begin{bmatrix} a_{11}+b_{11} & \cdots & a_{1n}+b_{1n}\\ \vdots & & \vdots\\ a_{m1}+b_{m1} & \cdots & a_{mn}+b_{mn} \end{bmatrix}
\]
◦ Scalar multiplication:
\[
\lambda A = \begin{bmatrix} \lambda a_{11} & \cdots & \lambda a_{1n}\\ \vdots & & \vdots\\ \lambda a_{m1} & \cdots & \lambda a_{mn} \end{bmatrix}
\]

L2(4) April 8, 2021 26 / 56


Vector Subspaces

Definition. Consider a vector space V = (V, +, ·) and U ⊂ V. Then, U = (U, +, ·) is


called vector subspace (simply linear subspace or subspace) of V if U is a vector space
with two operations ‘+’ and ‘·’ restricted to U × U and R × U.

Examples

• For every vector space V , V and {0} are the trivial subspaces.
• The solution set of the homogeneous system Ax = 0 is a subspace of Rn.
• The solution set of Ax = b (b ≠ 0) is not a subspace of Rn.
• The intersection of arbitrarily many subspaces is a subspace itself.

L2(4) April 8, 2021 27 / 56


Roadmap

(1) Systems of Linear Equations


(2) Matrices
(3) Solving Systems of Linear Equations
(4) Vector Spaces
(5) Linear Independence
(6) Basis and Rank
(7) Linear Mappings
(8) Affine Spaces

L2(5) April 8, 2021 28 / 56


Linear Independence

• Definition. For a vector space V and vectors x1, . . . , xk ∈ V, every v ∈ V of the
form v = λ1 x1 + · · · + λk xk with λ1, . . . , λk ∈ R is a linear combination of the
vectors x1, . . . , xk.
• Definition. If there is a non-trivial linear combination 0 = λ1 x1 + · · · + λk xk with
at least one λi ≠ 0, the vectors x1, . . . , xk are linearly dependent. If only the trivial
solution exists, i.e., λ1 = · · · = λk = 0, then x1, . . . , xk are linearly independent.
• Meaning. A set of linearly independent vectors consists of vectors that have no
redundancy.
• Useful fact. The vectors {x1, . . . , xk} are linearly dependent iff (at least) one of
them is a linear combination of the others.
◦ x − 2y = 2 and 2x − 4y = 4 are linearly dependent.

L2(5) April 8, 2021 29 / 56


Checking Linear Independence
• Gauss elimination to get the row echelon form
• All column vectors are linearly independent iff all columns are pivot columns (why?).
• Example.
     
\[
x_1 = \begin{bmatrix} 1\\ 2\\ -3\\ 4 \end{bmatrix}, \quad
x_2 = \begin{bmatrix} 1\\ 1\\ 0\\ 2 \end{bmatrix}, \quad
x_3 = \begin{bmatrix} -1\\ -2\\ 1\\ 1 \end{bmatrix}
\]
\[
\begin{bmatrix} 1 & 1 & -1\\ 2 & 1 & -2\\ -3 & 0 & 1\\ 4 & 2 & 1 \end{bmatrix}
\;\longrightarrow\; \cdots \;\longrightarrow\;
\begin{bmatrix} 1 & 1 & -1\\ 0 & 1 & 0\\ 0 & 0 & 1\\ 0 & 0 & 0 \end{bmatrix}
\]

• Every column is a pivot column. Thus, x1 , x2 , x3 are linearly independent.
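The same check in NumPy (a sketch): stack x1, x2, x3 as columns and ask whether the rank equals the number of columns.

```python
import numpy as np

X = np.array([[ 1,  1, -1],
              [ 2,  1, -2],
              [-3,  0,  1],
              [ 4,  2,  1]], dtype=float)   # columns are x1, x2, x3
print(np.linalg.matrix_rank(X))             # 3 -> every column is a pivot column,
                                            #      so x1, x2, x3 are linearly independent
```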

L2(5) April 8, 2021 30 / 56


Linear Combinations of Linearly Independent Vectors

• Vector space V with k linearly independent vectors b1 , b2 , . . . , bk


• m linear combinations x1 , x2 , . . . , xm . (Q) Are they linearly independent?

\[
\begin{aligned}
x_1 &= \lambda_{11} b_1 + \lambda_{21} b_2 + \cdots + \lambda_{k1} b_k\\
&\ \ \vdots\\
x_m &= \lambda_{1m} b_1 + \lambda_{2m} b_2 + \cdots + \lambda_{km} b_k
\end{aligned}
\qquad\Longleftrightarrow\qquad
x_j = \underbrace{\begin{bmatrix} b_1 & \cdots & b_k \end{bmatrix}}_{B}
\underbrace{\begin{bmatrix} \lambda_{1j}\\ \vdots\\ \lambda_{kj} \end{bmatrix}}_{\lambda_j}
= B\lambda_j
\]
\[
\sum_{j=1}^{m} \psi_j x_j = \sum_{j=1}^{m} \psi_j B\lambda_j = B \sum_{j=1}^{m} \psi_j \lambda_j
\]

• {x_j} linearly independent ⟺ {λ_j} linearly independent

L2(5) April 8, 2021 31 / 56


Example

x1 = b1 − 2b2 + b3 − b4
x2 = −4b1 − 2b2 + 4b4
x3 = 2b1 + 3b2 − b3 − 3b4
x4 = 17b1 − 10b2 + 11b3 + b4
\[
A = \begin{bmatrix} \lambda_1 & \lambda_2 & \lambda_3 & \lambda_4 \end{bmatrix}
= \begin{bmatrix} 1 & -4 & 2 & 17\\ -2 & -2 & 3 & -10\\ 1 & 0 & -1 & 11\\ -1 & 4 & -3 & 1 \end{bmatrix}
\;\longrightarrow\; \cdots \;\longrightarrow\;
\begin{bmatrix} 1 & 0 & 0 & -7\\ 0 & 1 & 0 & -15\\ 0 & 0 & 1 & -18\\ 0 & 0 & 0 & 0 \end{bmatrix}
\]
• The last column is not a pivot column. Thus, x1, x2, x3, x4 are linearly dependent.

L2(5) April 8, 2021 32 / 56


Roadmap

(1) Systems of Linear Equations


(2) Matrices
(3) Solving Systems of Linear Equations
(4) Vector Spaces
(5) Linear Independence
(6) Basis and Rank
(7) Linear Mappings
(8) Affine Spaces

L2(6) April 8, 2021 33 / 56


Generating Set and Basis

• Definition. A vector space V = (V, +, ·) and a set of vectors A = {x1 , . . . , xk } ⊂ V.


◦ If every v ∈ V can be expressed as a linear combination of x1 , . . . , xk , A is called a
generating set of V .
◦ The set of all linear combinations of A is called the span of A.
◦ If A spans the vector space V , we use V = span[A] or V = span[x1 , . . . , xk ]
• Definition. A minimal generating set B of V is called a basis of V. We call each
element of B a basis vector. The number of basis vectors is called the dimension of V.
• Properties
◦ B is a maximally2 linearly independent set of vectors in V .
◦ Every vector x ∈ V is a linear combination of B, which is unique.

² Adding any other vector to this set will make it linearly dependent.
L2(6) April 8, 2021 34 / 56
Examples

• Different bases of R3:
\[
B_1 = \left\{ \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}, \begin{bmatrix} 0\\ 1\\ 0 \end{bmatrix}, \begin{bmatrix} 0\\ 0\\ 1 \end{bmatrix} \right\}, \quad
B_2 = \left\{ \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}, \begin{bmatrix} 1\\ 1\\ 0 \end{bmatrix}, \begin{bmatrix} 1\\ 1\\ 1 \end{bmatrix} \right\}, \quad
B_3 = \left\{ \begin{bmatrix} 0.5\\ 0.8\\ 0.4 \end{bmatrix}, \begin{bmatrix} 1.8\\ 0.3\\ 0.3 \end{bmatrix}, \begin{bmatrix} -2.2\\ -1.3\\ 3.5 \end{bmatrix} \right\}
\]
• Linearly independent, but not maximal. Thus, not a basis:
\[
\mathcal{A} = \left\{ \begin{bmatrix} 1\\ 2\\ 3\\ 4 \end{bmatrix}, \begin{bmatrix} 2\\ -1\\ 0\\ 2 \end{bmatrix}, \begin{bmatrix} 1\\ 1\\ 0\\ -4 \end{bmatrix} \right\}
\]

L2(6) April 8, 2021 35 / 56


Determining a Basis

• Want to find a basis of a subspace U = span[x1 , x2 , . . . , xm ]



1. Construct a matrix A = [x1 x2 · · · xm] whose columns are the xi.
2. Find the row-echelon form of A.
3. Collect the pivot columns.
• Logic: Collect xi so that we have only trivial solution. Pivot columns tell us which
set of vectors is linearly independent.
• See example 2.17 (pp. 35)
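A SymPy sketch of the recipe (the matrix here is reused from the example on the upcoming Rank slide, as an illustrative assumption):

```python
from sympy import Matrix

A = Matrix([[ 1,  2, 1],
            [-2, -3, 1],
            [ 3,  5, 0]])        # columns x1, x2, x3 spanning U
_, pivots = A.rref()
basis = [A[:, j] for j in pivots]
print(pivots)                    # (0, 1) -> x1 and x2 form a basis of U
```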

L2(6) April 8, 2021 36 / 56


Rank (1)

• Definition. The rank of A ∈ Rm×n, denoted by rk(A), is the number of linearly
independent columns.
◦ This equals the number of linearly independent rows.
◦ Same as the number of linearly independent rows
   
• Example.
\[
A = \begin{bmatrix} 1 & 2 & 1\\ -2 & -3 & 1\\ 3 & 5 & 0 \end{bmatrix}
\;\longrightarrow\; \cdots \;\longrightarrow\;
\begin{bmatrix} 1 & 2 & 1\\ 0 & 1 & 3\\ 0 & 0 & 0 \end{bmatrix}
\]
Thus, rk(A) = 2.

• rk(A) = rk(AT )

L2(6) April 8, 2021 37 / 56


Rank (2)

• The columns (resp. rows) of A span a subspace U (resp. W ) with dim(U) = rk(A)
(resp. dim(W ) = rk(A)), and a basis of U (resp. W ) can be found by Gauss
elimination of A (resp. AT ).

• For all A ∈ Rn×n , rk(A) = n, iff A is regular (invertible).

• The linear system Ax = b is solvable, iff rk(A) = rk(A|b).

• For A ∈ Rm×n , the subspace of solutions for Ax = 0 possesses dimension n − rk(A).

• A ∈ Rm×n has full rank if its rank equals the largest possible rank for a matrix of the
same dimensions. The rank of the full-rank matrix A is min(# of cols, # of rows).

L2(6) April 8, 2021 38 / 56


Roadmap

(1) Systems of Linear Equations


(2) Matrices
(3) Solving Systems of Linear Equations
(4) Vector Spaces
(5) Linear Independence
(6) Basis and Rank
(7) Linear Mappings
(8) Affine Spaces

L2(7) April 8, 2021 39 / 56


Linear Mapping (1)

• Interest: A mapping that preserves the structure of the vector space


• Definition. For vector spaces V, W, a mapping Φ : V → W is called a linear
mapping (or homomorphism/linear transformation) if, for all x, y ∈ V and all
λ ∈ R,
◦ Φ(x + y) = Φ(x) + Φ(y)
◦ Φ(λx) = λΦ(x)
• Definition. A mapping Φ : V → W is called
◦ Injective (단사), if ∀x, y ∈ V, Φ(x) = Φ(y) =⇒ x = y
◦ Surjective (전사), if Φ(V) = W
◦ Bijective (전단사), if it is injective and surjective.

L2(7) April 8, 2021 40 / 56


Linear Mapping (2)

• For a bijective mapping Φ, there exists an inverse mapping Φ⁻¹.


• Φ is called an isomorphism if it is linear and bijective.
• Theorem. Vector spaces V and W are isomorphic, iff dim(V ) = dim(W ).
◦ Vector spaces of the same dimension are kind of the same thing.
• Other properties
◦ For two linear mappings Φ and Ψ, Φ ◦ Ψ is also a linear mapping.
◦ If Φ is an isomorphism, so is Φ−1 .
◦ For two linear mappings Φ and Ψ, Φ + Ψ and λΨ for λ ∈ R are linear.

L2(7) April 8, 2021 41 / 56


Coordinates

• A basis defines a coordinate system.

• Consider an ordered basis B = (b1 , b2 , . . . , bn ) of vector space V . Then, for any


x ∈ V , there exists a unique linear combination
x = α1 b1 + . . . + αn bn .
 
• We call
\[
\alpha = \begin{bmatrix} \alpha_1\\ \vdots\\ \alpha_n \end{bmatrix}
\]
the coordinate of x with respect to B = (b1, b2, . . . , bn).
• Basis change =⇒ Coordinate change

L2(7) April 8, 2021 42 / 56


Basis Change

• Consider a vector space V and two coordinate systems defined by B = (b1, . . . , bn)
and B′ = (b′1, . . . , b′n).
• Question. For a vector with coordinates (x1, . . . , xn) w.r.t. B, what are its
coordinates (y1, . . . , yn) w.r.t. B′?
• Theorem.
\[
\begin{bmatrix} y_1\\ \vdots\\ y_n \end{bmatrix}
= \begin{bmatrix} b'_1 & \cdots & b'_n \end{bmatrix}^{-1}
\begin{bmatrix} b_1 & \cdots & b_n \end{bmatrix}
\begin{bmatrix} x_1\\ \vdots\\ x_n \end{bmatrix}
\]
• Regard \(A_\Phi = \begin{bmatrix} b'_1 & \cdots & b'_n \end{bmatrix}^{-1} \begin{bmatrix} b_1 & \cdots & b_n \end{bmatrix}\) as a linear map.

L2(7) April 8, 2021 43 / 56


Example

• B = ((1, 0), (0, 1)) and B′ = ((2, 1), (1, 2))


• (4, 2)_B → (x, y)_{B′}?
• Using the theorem from the previous slide,
\[
\begin{bmatrix} x\\ y \end{bmatrix}
= \begin{bmatrix} 2 & 1\\ 1 & 2 \end{bmatrix}^{-1}
\begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix}
\begin{bmatrix} 4\\ 2 \end{bmatrix}
= \begin{bmatrix} 2/3 & -1/3\\ -1/3 & 2/3 \end{bmatrix}
\begin{bmatrix} 4\\ 2 \end{bmatrix}
= \begin{bmatrix} 2\\ 0 \end{bmatrix}
\]

L2(7) April 8, 2021 44 / 56


Transformation Matrix
• Two vector spaces
◦ V with basis B = (b1 , . . . , bn ) and W with basis C = (c1 , . . . , cm )
• What is the coordinate in the C-system for each basis vector bj? For j = 1, . . . , n,
\[
b_j = \alpha_{1j} c_1 + \cdots + \alpha_{mj} c_m
\iff
b_j = \begin{bmatrix} c_1 & \cdots & c_m \end{bmatrix}
\begin{bmatrix} \alpha_{1j}\\ \vdots\\ \alpha_{mj} \end{bmatrix}
\]
\[
\Longrightarrow\;
\begin{bmatrix} b_1 & \cdots & b_n \end{bmatrix}
= \begin{bmatrix} c_1 & \cdots & c_m \end{bmatrix}
\underbrace{\begin{bmatrix} \alpha_{11} & \cdots & \alpha_{1n}\\ \vdots & & \vdots\\ \alpha_{m1} & \cdots & \alpha_{mn} \end{bmatrix}}_{A_\Phi}
\]
• x̂ = A_Φ ŷ, where x̂ is the coordinate vector w.r.t. B and ŷ is the coordinate vector w.r.t. C

L2(7) April 8, 2021 45 / 56


Basis Change: General Case
• For a linear mapping Φ : V → W, consider bases B, B′ of V and C, C′ of W:
\[
B = (b_1, \dots, b_n), \quad B' = (b'_1, \dots, b'_n), \qquad
C = (c_1, \dots, c_m), \quad C' = (c'_1, \dots, c'_m)
\]
• (inter) transformation matrices: A_Φ from B to C, and A′_Φ from B′ to C′
• (intra) transformation matrices: S from B′ to B, and T from C′ to C
• Theorem. A′_Φ = T⁻¹ A_Φ S

L2(7) April 8, 2021 46 / 56


Image and Kernel
• Consider a linear mapping Φ : V → W. The kernel (or null space) is the set of
vectors in V that map to 0 ∈ W (the neutral element).
Definition. ker(Φ) := Φ⁻¹(0_W) = {v ∈ V : Φ(v) = 0_W}
• Image/range: the set of vectors w ∈ W that can be reached by Φ from some vector in V
• V : domain, W : codomain

L2(7) April 8, 2021 47 / 56


Image and Kernel: Properties

• 0V ∈ ker(Φ) (because Φ(0V ) = 0W )

• Both Im(Φ) and ker(Φ) are subspaces of W and V , respectively.

• Φ is one-to-one (injective) ⇐⇒ ker(Φ) = {0} (i.e., only 0 is mapped to 0)

• Since Φ is a linear mapping (between Rn and Rm), there exists A ∈ Rm×n such that Φ : x ↦ Ax. Then,
Im(Φ) is the column space of A, i.e., the span of the column vectors of A.

• rk(A) = dim(Im(Φ))

• ker(Φ) is the solution set of the homogeneous system of linear equations Ax = 0
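A sketch (assuming NumPy/SciPy; the matrix is the Rank slide's example) tying these facts together: dim(Im(Φ)) from the rank, and ker(Φ) from the null space of A.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[ 1.,  2., 1.],
              [-2., -3., 1.],
              [ 3.,  5., 0.]])     # Phi(x) = A x

print(np.linalg.matrix_rank(A))    # dim(Im(Phi)) = 2
K = null_space(A)                  # basis of ker(Phi); one vector, since 3 - 2 = 1
assert np.allclose(A @ K, 0)
```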

L2(7) April 8, 2021 48 / 56


Rank-Nullity Theorem
Theorem.
dim(ker(Φ)) + dim(Im(Φ)) = dim(V )
• If dim(Im(Φ)) < dim(V ), the kernel contains more than just 0.
• If dim(Im(Φ)) < dim(V ), AΦ x = 0 has infinitely many
solutions.
• If dim(V ) = dim(W ) (e.g., V = W = Rn ), the followings are
equivalent: Φ is
◦ (1) injective, (2) surjective, (3) bijective,

◦ In this case, Φ defines y = Ax, where A is regular.

• Simplified version. For A ∈ Rm×n ,

rk(A) + nullity(A) = n

² Nullity: the dimension of the null space (kernel)
L2(7) April 8, 2021 49 / 56
Roadmap

(1) Systems of Linear Equations


(2) Matrices
(3) Solving Systems of Linear Equations
(4) Vector Spaces
(5) Linear Independence
(6) Basis and Rank
(7) Linear Mappings
(8) Affine Spaces

L2(8) April 8, 2021 50 / 56


Linear vs. Affine Function

• linear function: f (x) = ax


• affine function: f (x) = ax + b
• sometimes (ignorant) people refer to
affine functions as linear

L2(8) April 8, 2021 51 / 56


Affine Subspace

• Spaces that are offset from the origin. Not a vector space.
• Definition. Consider a vector space V , x0 ∈ V and a subspace U ⊂ V . Then, the
subset L = x0 + U := {x0 + u : u ∈ U} is called affine subspace or linear manifold
of V .
• U is called direction or direction space, and x0 is support point.
• An affine subspace is not a vector subspace of V when x0 ∉ U.
• Parametric equation. A k-dimensional affine space L = x0 + U. If (b1 , . . . , bk ) is
an ordered basis of U, any element x ∈ L can be uniquely described as
x = x0 + λ1 b1 + · · · + λk bk , λ1 , . . . , λk ∈ R
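A tiny sketch of the parametric form for a line in R² (the support point and direction are arbitrary assumptions for illustration):

```python
import numpy as np

x0 = np.array([1.0, 2.0])   # support point (assumption)
b1 = np.array([3.0, 1.0])   # direction spanning U (assumption)

# points on the affine subspace L = x0 + span[b1]
points = np.array([x0 + lam * b1 for lam in (-1.0, 0.0, 0.5, 2.0)])
print(points)
```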

L2(8) April 8, 2021 52 / 56


Example
• In R2, a one-dimensional affine subspace is a line: y = x0 + λ b1, with U = span[b1].
• In R3, a two-dimensional affine subspace is a plane: y = x0 + λ1 b1 + λ2 b2, with U = span[b1, b2].
• In Rn, an (n − 1)-dimensional affine subspace is a hyperplane: \(y = x_0 + \sum_{k=1}^{n-1} \lambda_k b_k\), with U = span[b1, . . . , b_{n−1}].

• For a linear mapping Φ : V → W and a vector a ∈ W, the mapping φ : V → W with
φ(x) = a + Φ(x) is an affine mapping from V to W. The vector a is called the translation
vector.

L2(8) April 8, 2021 53 / 56


Questions?

L2(8) April 8, 2021 54 / 56


Review Questions

1)

L2(8) April 8, 2021 55 / 56
