Matrix Representation Theorem

We noted that multiplication by an m × n matrix defines a transformation from 𝔽n into 𝔽m. Matrix transformations are linear transformations, since the properties in Definition 2.1.1 are simply restatements of the linearity properties of matrix multiplication. The matrix representation theorem says that, conversely, every linear transformation from 𝔽n into 𝔽m is a matrix transformation.
Just as every vector in a finite-dimensional vector space can be associated with a coordinate vector in 𝔽n, every linear transformation between finite-dimensional vector spaces can be associated with a matrix in Mm,n(𝔽):

L: 𝔽n ⟶ 𝔽m
L: x ⟼ Ax
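
As a quick numerical check (a minimal NumPy sketch; the matrix and vectors below are arbitrary choices, not from the text), the transformation x ⟼ Ax satisfies both linearity properties:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))        # an arbitrary 3 x 4 real matrix
x = rng.standard_normal(4)
y = rng.standard_normal(4)
c = 2.5

L = lambda v: A @ v                    # the matrix transformation L: x |-> Ax

print(np.allclose(L(x + y), L(x) + L(y)))  # additivity: prints True
print(np.allclose(L(c * x), c * L(x)))     # homogeneity: prints True
```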

Theorem 3.3.1. (Matrix Representation Theorem) Let V be an n-dimensional vector space over 𝔽 and let W be an m-dimensional vector space over 𝔽. Let B = {u1, u2, ..., un} and B' = {w1, w2, ..., wm} be ordered bases for V and W, respectively. Let T: V ⟶ W be any linear transformation. Then there is a unique m × n matrix A such that [T(x)]B' = A[x]B for all x ∈ V. Moreover, the map J: L(V,W) ⟶ Mm,n(𝔽) given by J(T) = A is a natural isomorphism.

Proof. Let x ∈ V. Then x = x1 u1 + x2 u2 + ... + xn un, where uj is the j-th element of the basis B of V.
Applying the additivity and homogeneity properties of T, we obtain

T(x) = x1 T(u1) + x2 T(u2) + ... + xn T(un)

The vectors T(uj) ∈ W can be written as linear combinations of the vectors of the basis B' of W:

T(u1) = A11 w1 + A21 w2 + ... + Am1 wm
T(u2) = A12 w1 + A22 w2 + ... + Am2 wm
...
T(un) = A1n w1 + A2n w2 + ... + Amn wm

Clearly these mn scalars Aij, 1 ≤ i ≤ m, 1 ≤ j ≤ n, determine T completely.

The m × n matrix A = (Aij), whose j-th column is the coordinate matrix of T(uj) relative to the ordered basis B', is called the matrix of T relative to the pair of ordered bases B and B'. It is denoted by [T]B,B'. Thus

[T]_{B,B'} = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1n} \\ A_{21} & A_{22} & \cdots & A_{2n} \\ \vdots & \vdots & & \vdots \\ A_{m1} & A_{m2} & \cdots & A_{mn} \end{pmatrix}
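
The column-by-column construction can be expressed as a short routine. A minimal sketch, assuming V = ℝn and W = ℝm with basis vectors stored as NumPy arrays (matrix_of is an illustrative name, not notation from the text):

```python
import numpy as np

def matrix_of(T, B, B_prime):
    """Matrix of T relative to ordered bases B (of V) and B_prime (of W)."""
    W = np.column_stack(B_prime)                  # basis of W as matrix columns
    # j-th column is [T(u_j)]_{B'}, obtained by solving W c = T(u_j)
    cols = [np.linalg.solve(W, T(u)) for u in B]
    return np.column_stack(cols)
```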

Then we can express the action of the linear transformation in coordinates as [T(x)]B' = [T]B,B' [x]B = A[x]B, since

T(x) = T\left( \sum_{j=1}^{n} x_j u_j \right) = \sum_{j=1}^{n} x_j T(u_j) = \sum_{j=1}^{n} x_j \sum_{i=1}^{m} A_{ij} w_i = \sum_{j=1}^{n} \sum_{i=1}^{m} A_{ij} x_j w_i = \sum_{i=1}^{m} \left( \sum_{j=1}^{n} A_{ij} x_j \right) w_i

from which it is evident that the scalar components yi of T(x) with respect to the basis B' are given by

y_i = \sum_{j=1}^{n} A_{ij} x_j

or in matrix form as

\begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{pmatrix} = A \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}
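
Continuing the sketch above, the identity [T(x)]B' = A[x]B can be verified with non-standard bases (all maps, bases, and numbers here are illustrative):

```python
T = lambda v: np.array([2*v[0] + v[1], v[1] - v[2]])   # an illustrative T: R^3 -> R^2
B = [np.array([1., 1., 0.]), np.array([0., 1., 1.]), np.array([1., 0., 1.])]
B_prime = [np.array([1., 1.]), np.array([1., -1.])]

A = matrix_of(T, B, B_prime)
x_B = np.array([1., 2., -1.])                  # coordinate vector [x]_B
x = sum(c * u for c, u in zip(x_B, B))         # the vector x itself
W = np.column_stack(B_prime)
print(np.allclose(A @ x_B, np.linalg.solve(W, T(x))))  # True: A[x]_B = [T(x)]_{B'}
```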

Conversely, let A = (Aij) be any given m × n matrix. Define T: V ⟶ W by

T\left( \sum_{j=1}^{n} x_j u_j \right) = \sum_{i=1}^{m} \left( \sum_{j=1}^{n} A_{ij} x_j \right) w_i
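
In code, this converse construction sends the coordinate vector [x]B to A[x]B and reassembles the result in the basis B'. A sketch under the same assumptions as the earlier blocks (T_from_matrix is an illustrative name):

```python
def T_from_matrix(A, B, B_prime):
    """Linear map defined by the matrix A relative to bases B and B_prime."""
    V = np.column_stack(B)
    W = np.column_stack(B_prime)
    def T(x):
        x_B = np.linalg.solve(V, x)   # coordinates of x relative to B
        return W @ (A @ x_B)          # sum_i (A x_B)_i w_i
    return T
```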

Then T is a linear transformation, since the properties in Definition 2.1.1 hold:

  1. Let a ∈ 𝔽; then

    T(ax) = \sum_{i=1}^{m} \left( a \sum_{j=1}^{n} A_{ij} x_j \right) w_i = a T(x)

    Hence

    [T(ax)]B' = a [T]B,B' [x]B

  2. Let S: V ⟶ W be any other linear transformation; then

    [(T + S)(x)]B' = ([T]B,B' + [S]B,B') [x]B

These identities show that J is linear; since T is completely determined by A and every matrix A defines such a T, the map J is a bijection, hence an isomorphism. This completes the proof of the theorem.  □
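
Both identities can also be checked numerically. Continuing the earlier sketch, the correspondence J(T) = [T]B,B' respects scaling and addition of linear maps:

```python
S = lambda v: np.array([v[2], v[0]])           # another illustrative map R^3 -> R^2
a = 3.0

print(np.allclose(matrix_of(lambda v: a * T(v), B, B_prime),
                  a * matrix_of(T, B, B_prime)))                         # True
print(np.allclose(matrix_of(lambda v: T(v) + S(v), B, B_prime),
                  matrix_of(T, B, B_prime) + matrix_of(S, B, B_prime)))  # True
```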

Example 3.3.2. Consider the linear transformation T: ℝ3 ⟶ ℝ2 given by

T(x, y, z) = (x + y + z, x − y)

Let B = {(1,0,0), (0,1,0), (0,0,1)} and B' = {(1,0), (0,1)} be the standard bases of ℝ3 and ℝ2, respectively. Then

T(1,0,0) = (1,1) = 1(1,0) + 1(0,1)
T(0,1,0) = (1,−1) = 1(1,0) − 1(0,1)
T(0,0,1) = (1,0) = 1(1,0) + 0(0,1)

Hence [T(1,0,0)]B' = (1 1)T,   [T(0,1,0)]B' = (1 −1)T,   [T(0,0,1)]B' = (1 0)T. Thus the matrix of T relative to B and B' is

[T]_{B,B'} = \begin{pmatrix} 1 & 1 & 1 \\ 1 & -1 & 0 \end{pmatrix}  ■
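
A quick NumPy check of this example: with standard bases the coordinates coincide with the vectors themselves, so the columns of the matrix are simply the images of the basis vectors.

```python
import numpy as np

T = lambda v: np.array([v[0] + v[1] + v[2], v[0] - v[1]])
A = np.column_stack([T(e) for e in np.eye(3)])
print(A)    # [[ 1.  1.  1.]
            #  [ 1. -1.  0.]]
```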