
Section 5.3 Isomorphism and invertible matrix

We show that a matrix representation of an isomorphism is invertible. We keep the notation of Subsection 4.7.2.
Let \(V\) and \(W\) be finite-dimensional vector spaces over a field \(F\text{.}\) Suppose that \(\mathfrak{B}_V=(v_1,v_2,\ldots,v_n)\) and \(\mathfrak{B}_W=(w_1,w_2,\ldots,w_n)\) are ordered bases of \(V\) and \(W\text{,}\) respectively. Let \(T\colon V\to W\) be an \(F\)-linear isomorphism. Let the matrix of \(T\) relative to ordered bases \(\mathfrak{B}_V\) and \(\mathfrak{B}_W\) be
\begin{equation*} [T]_{\mathfrak{B}_V}^{\mathfrak{B}_W}=\begin{pmatrix}\beta_{11}\amp\beta_{12}\amp\cdots\amp\beta_{1n}\\\beta_{21}\amp\beta_{22}\amp\cdots\amp\beta_{2n}\\\vdots\amp\vdots\amp\ddots\amp\vdots\\\beta_{n1}\amp\beta_{n2}\amp\cdots\amp\beta_{nn}\end{pmatrix}. \end{equation*}
Recall from Subsection 4.7.2 that \(T(v_i)=\sum_k\beta_{ki}w_k\text{.}\)
By Lemma 5.1.2, the set-theoretic inverse \(T^{-1}\) of \(T\) is also \(F\)-linear. Suppose that
\begin{equation*} T^{-1}(w_j)=\sum_\ell\gamma_{\ell j}v_\ell. \end{equation*}
The matrix of \(T^{-1}\) relative to ordered bases \(\mathfrak{B}_W\) and \(\mathfrak{B}_V\) is
\begin{equation*} [T^{-1}]_{\mathfrak{B}_W}^{\mathfrak{B}_V}=\begin{pmatrix}\gamma_{11}\amp\gamma_{12}\amp\cdots\amp\gamma_{1n}\\\gamma_{21}\amp\gamma_{22}\amp\cdots\amp\gamma_{2n}\\\vdots\amp\vdots\amp\ddots\amp\vdots\\\gamma_{n1}\amp\gamma_{n2}\amp\cdots\amp\gamma_{nn}\end{pmatrix}. \end{equation*}
By Remark 4.1.5, the composition \(T\circ T^{-1}=I_W\) is determined by its action on \(\{w_j\}\text{.}\) We have the following.
\begin{align*} w_j=T\circ T^{-1}(w_j)\amp=T\big(\sum_\ell\gamma_{\ell j}v_\ell\big)\\ \amp=\sum_\ell\gamma_{\ell j}T(v_\ell)\\ \amp=\sum_\ell\gamma_{\ell j}\big(\sum_k\beta_{k\ell}w_k\big)\\ \amp=\sum_k\big(\sum_\ell\gamma_{\ell j}\beta_{k\ell}\big)w_k \end{align*}
By Exercise 3.5.11, we must have
\begin{equation*} \sum_\ell\gamma_{\ell j}\beta_{k\ell}=0\quad\text{for }k\neq j\quad\text{and}\quad\sum_\ell\gamma_{\ell j}\beta_{j\ell}=1. \end{equation*}
We thus obtain that the \(j\)-th column of the following matrix product
\begin{equation*} \begin{pmatrix}\beta_{11}\amp\beta_{12}\amp\cdots\amp\beta_{1n}\\\beta_{21}\amp\beta_{22}\amp\cdots\amp\beta_{2n}\\\vdots\amp\vdots\amp\ddots\amp\vdots\\\beta_{n1}\amp\beta_{n2}\amp\cdots\amp\beta_{nn}\end{pmatrix}\cdot\begin{pmatrix}\gamma_{11}\amp\gamma_{12}\amp\cdots\amp\gamma_{1n}\\\gamma_{21}\amp\gamma_{22}\amp\cdots\amp\gamma_{2n}\\\vdots\amp\vdots\amp\ddots\amp\vdots\\\gamma_{n1}\amp\gamma_{n2}\amp\cdots\amp\gamma_{nn}\end{pmatrix} \end{equation*}
is \(\big(0,0,\ldots,0,1,0,\ldots,0\big)^t\text{,}\) the column with \(1\) in the \(j\)-th entry and \(0\) elsewhere. Therefore, \([T]_{\mathfrak{B}_V}^{\mathfrak{B}_W}\cdot[T^{-1}]_{\mathfrak{B}_W}^{\mathfrak{B}_V}\) is the \(n\times n\) identity matrix.
A similar computation for \(T^{-1}\circ T=I_V\) shows that \([T^{-1}]_{\mathfrak{B}_W}^{\mathfrak{B}_V}\cdot[T]_{\mathfrak{B}_V}^{\mathfrak{B}_W}\) is also the \(n\times n\) identity matrix.
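For instance (a concrete example included only to illustrate the computation), take \(V=W=F^2\) with the standard ordered bases and \(T(x,y)=(x+y,y)\text{,}\) so that \(T^{-1}(x,y)=(x-y,y)\text{.}\) Then
\begin{equation*} [T]_{\mathfrak{B}_V}^{\mathfrak{B}_W}=\begin{pmatrix}1\amp 1\\0\amp 1\end{pmatrix},\qquad [T^{-1}]_{\mathfrak{B}_W}^{\mathfrak{B}_V}=\begin{pmatrix}1\amp -1\\0\amp 1\end{pmatrix},\qquad \begin{pmatrix}1\amp 1\\0\amp 1\end{pmatrix}\begin{pmatrix}1\amp -1\\0\amp 1\end{pmatrix}=\begin{pmatrix}1\amp 0\\0\amp 1\end{pmatrix}, \end{equation*}
and the product in the other order is the identity matrix as well.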
Let \(A\in M_n(F)\) be an invertible matrix and let \(B\) be its inverse. Consider the linear maps \(L_A\colon x\mapsto Ax\) and \(L_B\colon y\mapsto By\) induced by \(A\) and \(B\text{,}\) respectively. Then
\begin{equation*} L_A\circ L_B(y)=A(By)=y\quad\text{and}\quad L_B\circ L_A(x)=B(Ax)=x. \end{equation*}
Therefore, both \(L_A\) and \(L_B\) are isomorphisms, each being the inverse of the other.
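For a concrete instance of this construction, take
\begin{equation*} A=\begin{pmatrix}2\amp 1\\1\amp 1\end{pmatrix},\qquad B=A^{-1}=\begin{pmatrix}1\amp -1\\-1\amp 2\end{pmatrix},\qquad AB=BA=\begin{pmatrix}1\amp 0\\0\amp 1\end{pmatrix}, \end{equation*}
so that \(L_A\) and \(L_B\) are mutually inverse isomorphisms of \(F^2\text{.}\)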
Let \(v=a_1v_1+\cdots+a_nv_n\text{,}\) i.e., the coordinates of \(v\) with respect to \(\mathfrak{B}^\prime\) are \((a_1,\ldots,a_n)\text{.}\) Suppose that
\begin{equation*} v_i=\sum_j\alpha_{ji}u_j. \end{equation*}
We thus have the following.
\begin{align*} v\amp=\sum_ia_iv_i\\ \amp=\sum_ia_i\sum_j\alpha_{ji}u_j\\ \amp=\sum_j\big(\sum_ia_i\alpha_{ji}\big)u_j \end{align*}
By Exercise 3.5.11, the coordinates of \(v\) with respect to the basis \(\mathfrak{B}\) are
\begin{equation*} (\sum_ia_i\alpha_{1i},\ldots,\sum_ia_i\alpha_{ni}). \end{equation*}
Put \(P=(\alpha_{ij})\text{.}\) Using (5.2.1), we therefore have the following.
\begin{equation*} L_P\circ{}_nT\circ T_{\mathfrak{B}^\prime}(v)={}_nT\circ T_{\mathfrak{B}}(v). \end{equation*}
Since \({}_nT\) and \(T_{\mathfrak{B}^\prime}\) are \(F\)-isomorphisms (see Proposition 5.1.9 and Checkpoint 5.1.8), the \(F\)-linear map \(L_P\circ{}_nT\circ T_{\mathfrak{B}^\prime}\) is also an \(F\)-isomorphism. Moreover, its matrix with respect to \(\mathfrak{B}\) is \([L_P\circ{}_nT\circ T_{\mathfrak{B}^\prime}]_{\mathfrak{B}}^{\mathfrak{B}}=P\text{.}\) By Proposition 5.3.1, \(P\) is invertible, and thus we get the result.
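To illustrate the lemma with a concrete choice of bases (an example, not part of the proof), take \(V=F^2\text{,}\) let \(\mathfrak{B}=(u_1,u_2)\) be the standard ordered basis, and let \(\mathfrak{B}^\prime=(v_1,v_2)\) with \(v_1=u_1\) and \(v_2=u_1+u_2\text{.}\) For \(v=a_1v_1+a_2v_2=(a_1+a_2)u_1+a_2u_2\) we get
\begin{equation*} P=\begin{pmatrix}1\amp 1\\0\amp 1\end{pmatrix},\qquad P\begin{pmatrix}a_1\\a_2\end{pmatrix}=\begin{pmatrix}a_1+a_2\\a_2\end{pmatrix},\qquad P^{-1}=\begin{pmatrix}1\amp -1\\0\amp 1\end{pmatrix}, \end{equation*}
so \(P\) carries the coordinates of \(v\) with respect to \(\mathfrak{B}^\prime\) to its coordinates with respect to \(\mathfrak{B}\text{,}\) and \(P\) is invertible.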

Remark 5.3.4.

Note that the matrix \(P\) in Lemma 5.3.3 is the matrix of the identity map \(\unit_V\) with respect to bases \(\mathfrak{B}^\prime\) and \(\mathfrak{B}\) in that order, i.e.,
\begin{equation*} [\unit_V]_{\mathfrak{B}^\prime}^{\mathfrak{B}}=P. \end{equation*}
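In particular, since \(\unit_V\) is an isomorphism that is its own inverse, Proposition 5.3.1 applied to this description of \(P\) gives
\begin{equation*} P^{-1}=\big([\unit_V]_{\mathfrak{B}^\prime}^{\mathfrak{B}}\big)^{-1}=[\unit_V]_{\mathfrak{B}}^{\mathfrak{B}^\prime}, \end{equation*}
that is, the inverse of \(P\) is the matrix of the identity map with respect to the same two bases taken in the opposite order.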
We consider the following composition of maps, with the bases under consideration written in brackets.
\begin{equation*} (V,\mathfrak{B}_1)\xrightarrow{\unit_V}(V,\mathfrak{B}_2)\xrightarrow{T}(W,\mathfrak{C}_2)\xrightarrow{\unit_W}(W,\mathfrak{C}_1) \end{equation*}
We have \(T=\unit_W\circ T\circ\unit_V\text{.}\) By Exercise 4.8.5, we get the result.
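Written out explicitly (assuming Exercise 4.8.5 is the statement that the matrix of a composition is the product of the matrices of its factors), the composition above gives
\begin{equation*} [T]_{\mathfrak{B}_1}^{\mathfrak{C}_1}=[\unit_W]_{\mathfrak{C}_2}^{\mathfrak{C}_1}\cdot[T]_{\mathfrak{B}_2}^{\mathfrak{C}_2}\cdot[\unit_V]_{\mathfrak{B}_1}^{\mathfrak{B}_2}. \end{equation*}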
As a special case of the above theorem we obtain the following result.
Suppose that \(\dim_FV=n\) and \(\dim_FW=m\text{.}\) By the Rank-Nullity Theorem (see Theorem 4.5.3), \(\dim_F\ker(T)=n-r\text{,}\) where \(r\) is the rank of \(T\text{.}\) Let \(\{v_1,v_2,\ldots,v_{n-r}\}\) be a basis for \(\ker(T)\text{.}\) We extend it to a basis of \(V\text{,}\) say \(\{v_1,\ldots,v_{n-r},u_1,\ldots,u_r\}\text{.}\) By Proposition 3.7.6, we have
\begin{equation} V=\ker(T)\bigoplus\langle u_1,\ldots,u_r\rangle.\tag{5.3.1} \end{equation}
We claim that \(\{T(u_1),\ldots,T(u_r)\}\) is a basis of \(\Im(T)\text{.}\) Since \(T(v_i)=0\) for every \(i\) and \(\{v_1,\ldots,v_{n-r},u_1,\ldots,u_r\}\) spans \(V\text{,}\) the vectors \(T(u_1),\ldots,T(u_r)\) span \(\Im(T)\text{.}\) For linear independence, suppose that \(\sum_i\alpha_iT(u_i)=0\text{.}\) Hence, \(\sum_i\alpha_iu_i\in\ker(T)\text{.}\) By eq. (5.3.1) and the linear independence of \(\{u_1,\ldots,u_r\}\text{,}\) we have \(\alpha_i=0\) for every \(i\text{.}\)
By Proposition 3.7.6, we write
\begin{equation*} W=\langle T(u_1),\ldots,T(u_r)\rangle\bigoplus W^\prime. \end{equation*}
Let \(\{w_1,\ldots,w_{m-r}\}\) be a basis of \(W^\prime\text{.}\)
Consider the ordered bases \((u_1,\ldots,u_r,v_1,\ldots,v_{n-r})\) and \(\big(T(u_1),\ldots,T(u_r),w_1,\ldots,w_{m-r}\big)\) of \(V\) and \(W\text{,}\) respectively. The matrix of \(T\) with respect to these bases is
\begin{equation*} \begin{pmatrix}I_r\amp 0\\0\amp 0\end{pmatrix}. \end{equation*}
Hence the theorem is proved.
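To see the above normal form in a small concrete case (an example chosen only for illustration), let \(T\colon F^3\to F^2\) be given by \(T(x,y,z)=(x+y,y+z)\text{.}\) Here \(r=2\text{,}\) \(\ker(T)=\langle(1,-1,1)\rangle\text{,}\) and we may take \(u_1=(1,0,0)\) and \(u_2=(0,1,0)\text{.}\) With respect to the ordered bases \(\big(u_1,u_2,(1,-1,1)\big)\) of \(F^3\) and \(\big(T(u_1),T(u_2)\big)=\big((1,0),(1,1)\big)\) of \(F^2\text{,}\) the matrix of \(T\) is
\begin{equation*} \begin{pmatrix}1\amp 0\amp 0\\0\amp 1\amp 0\end{pmatrix}=\begin{pmatrix}I_2\amp 0\end{pmatrix}. \end{equation*}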

Remark 5.3.9.

The above corollary should remind the reader of the following result: a square matrix over a field is invertible if and only if it is a product of elementary matrices.