Column space, left kernel, and their bases
This work by Jephian Lin is licensed under a Creative Commons Attribution 4.0 International License.
$\newcommand{\trans}{^\top} \newcommand{\adj}{^{\rm adj}} \newcommand{\cof}{^{\rm cof}} \newcommand{\inp}[2]{\left\langle#1,#2\right\rangle} \newcommand{\dunion}{\mathbin{\dot\cup}} \newcommand{\bzero}{\mathbf{0}} \newcommand{\bone}{\mathbf{1}} \newcommand{\ba}{\mathbf{a}} \newcommand{\bb}{\mathbf{b}} \newcommand{\bc}{\mathbf{c}} \newcommand{\bd}{\mathbf{d}} \newcommand{\be}{\mathbf{e}} \newcommand{\bh}{\mathbf{h}} \newcommand{\bp}{\mathbf{p}} \newcommand{\bq}{\mathbf{q}} \newcommand{\br}{\mathbf{r}} \newcommand{\bx}{\mathbf{x}} \newcommand{\by}{\mathbf{y}} \newcommand{\bz}{\mathbf{z}} \newcommand{\bu}{\mathbf{u}} \newcommand{\bv}{\mathbf{v}} \newcommand{\bw}{\mathbf{w}} \newcommand{\tr}{\operatorname{tr}} \newcommand{\nul}{\operatorname{null}} \newcommand{\rank}{\operatorname{rank}} %\newcommand{\ker}{\operatorname{ker}} \newcommand{\range}{\operatorname{range}} \newcommand{\Col}{\operatorname{Col}} \newcommand{\Row}{\operatorname{Row}} \newcommand{\spec}{\operatorname{spec}} \newcommand{\vspan}{\operatorname{span}} \newcommand{\Vol}{\operatorname{Vol}} \newcommand{\sgn}{\operatorname{sgn}} \newcommand{\idmap}{\operatorname{id}} \newcommand{\am}{\operatorname{am}} \newcommand{\gm}{\operatorname{gm}} \newcommand{\mult}{\operatorname{mult}} \newcommand{\iner}{\operatorname{iner}}$
from lingeo import random_good_matrix, column_space_matrix, left_kernel_matrix
We recommend reading the section Four fundamental subspaces first, where you will find the definitions of $\beta_R$, $\beta_K$, $\beta_C$, and $\beta_L$.
Let $A$ be a matrix.
Let $\beta_C$ and $\beta_L$ be the standard bases of $\Col(A)$ and $\ker(A\trans)$, respectively.
We already know that $\Col(A) = \vspan(\beta_C)$ and $\ker(A\trans) = \vspan(\beta_L)$.
In fact, both $\beta_C$ and $\beta_L$ are linearly independent.
Therefore, it is legitimate to call them the standard bases.
When $S$ is a finite set of vectors and $V = \vspan(S)$, we may find a basis of $V$ as follows.
1. Let $A$ be the matrix whose columns are the vectors in $S$, so that $V = \Col(A)$.
2. Compute the reduced echelon form $R$ of $A$ and find its pivots.
3. Let $\beta_C$ be the set of columns of $A$ corresponding to the pivots of $R$.
Thus, $\beta_C$ is a basis of $V$.
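As a quick illustration of this procedure in Sage (the matrix here is a made-up example, not from the original text), the pivot columns can be extracted as follows.

### code
A = matrix(QQ, [
    [1, 2, 0, 1],
    [0, 0, 1, 1],
    [1, 2, 1, 2],
])  # the columns of A are the vectors in S
R = A.rref()         # reduced echelon form of A
pivots = A.pivots()  # indices of the pivot columns
C = A.matrix_from_columns(pivots)  # its columns form beta_C, a basis of V = Col(A)
print("pivots:", pivots)
show(C)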
Let $\bb$ be a vector in $\mathbb{R}^n$ and let $V$ be a subspace of $\mathbb{R}^n$ spanned by a finite set of vectors $S$.
Then we may find the projection of $\bb$ onto $V$ as follows.
1. Use the procedure above to find a basis of $V$, and let $A_p$ be the matrix whose columns are these basis vectors.
2. Since the columns of $A_p$ are linearly independent, $A_p\trans A_p$ is invertible, and the projection of $\bb$ onto $V$ is $A_p(A_p\trans A_p)^{-1}A_p\trans\bb$.
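Here is a minimal Sage sketch of this projection procedure; the matrix and vector are made-up examples, not from the original text.

### code
b = vector(QQ, [1, 2, 3])
A = matrix(QQ, [
    [1, 0, 1],
    [0, 1, 1],
    [0, 0, 0],
])  # columns of A span V = the xy-plane; the third column is redundant
Ap = A.matrix_from_columns(A.pivots())  # a basis of V as columns
proj = Ap * (Ap.T * Ap).inverse() * Ap.T * b  # projection of b onto V
print(proj)  # expected output: (1, 2, 0)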
Run the code below.
Let $R$ be the reduced echelon form of $A$.
Let $S = \{ \bu_1, \ldots, \bu_5 \}$ be the columns of $A$ and $V = \vspan(S)$.
Find a basis of $V$ consisting of vectors in $S$.
### code
set_random_seed(0)
print_ans = False
m,n,r = 3,5,2
A, R, pivots = random_good_matrix(m,n,r, return_answer=True)
C = column_space_matrix(A)
print("A =")
show(A)
print("R =")
show(R)
if print_ans:
    print("A basis of V can be the set of columns of")
    show(C)
    free = [i for i in range(n) if i not in pivots]
    for f in free:
        print("u%s = "%(f+1) + " + ".join("%s u%s"%(R[j,f], pivots[j]+1) for j in range(r)))
For each vector in $S$ that is not in the basis you found in the previous problem, write it as a linear combination of the basis.
Run the code below, where $\left[\begin{array}{c|c} R & B \end{array}\right]$ is the reduced echelon form of $\left[\begin{array}{c|c} A & I \end{array}\right]$.
Find a basis of $\Col(A)$ and a basis of $\ker(A\trans)$.
### code
set_random_seed(0)
print_ans = False
m,n,r = 4,5,2
A = random_good_matrix(m,n,r)
AI = A.augment(identity_matrix(m), subdivide=True)
RB = AI.rref()
print("[ A | I ] =")
show(AI)
print("[ R | B ] =")
show(RB)
if print_ans:
    print("A basis of the column space can be the set of columns of")
    show(column_space_matrix(A))
    print("A basis of the left kernel can be the set of rows of")
    show(left_kernel_matrix(A))
Let
$$
A = \begin{bmatrix}
1 & 0 & 0 & 0 & 0 \\
-1 & 1 & 1 & 0 & 0 \\
0 & -1 & 0 & 1 & 0 \\
0 & 0 & -1 & -1 & 1 \\
0 & 0 & 0 & 0 & -1
\end{bmatrix}
$$
and let $S = \{ \bu_1, \ldots, \bu_5 \}$ be the columns of $A$.
The set $S$ has $\binom{5}{4} = 5$ subsets of size $4$.
For each of them, determine whether or not it is a basis of $\Col(A)$.
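If you want to double-check your classification by hand, here is a short Sage sketch (a verification aid, not part of the original exercise). Since $\rank(A) = \dim\Col(A) = 4$ here, a size-$4$ subset of $S$ is a basis of $\Col(A)$ exactly when its submatrix has rank equal to $\rank(A)$.

### code
from itertools import combinations

A = matrix(QQ, [
    [1, 0, 0, 0, 0],
    [-1, 1, 1, 0, 0],
    [0, -1, 0, 1, 0],
    [0, 0, -1, -1, 1],
    [0, 0, 0, 0, -1],
])
r = A.rank()  # dimension of Col(A)
for cols in combinations(range(5), 4):
    B = A.matrix_from_columns(list(cols))
    # a size-4 subset is a basis of Col(A) iff rank(B) == rank(A) == 4
    print("{u%d, u%d, u%d, u%d}:" % tuple(c + 1 for c in cols), B.rank() == r)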
Let $A$ be a matrix and $R$ its reduced echelon form.
Consider the pivots of $R$.
Let $A_p$ be the submatrix of $A$ consisting of the columns corresponding to the pivots, and let $R_p$ be the submatrix of $R$ consisting of the same columns.
(Therefore, the columns of $A_p$ form $\beta_C$.)
Show, in order, that $\ker(R_p) = \{ \bzero \}$ and $\ker(A_p) = \{ \bzero \}$, and conclude that $\beta_C$ is linearly independent.
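For intuition (not as a substitute for the proof), here is a quick Sage check on a random instance, assuming random_good_matrix behaves as in the cells above.

### code
set_random_seed(1)
A, R, pivots = random_good_matrix(3, 5, 2, return_answer=True)
Ap = A.matrix_from_columns(pivots)  # the columns of Ap form beta_C
Rp = R.matrix_from_columns(pivots)  # the pivot columns of R
print(Rp.right_kernel())  # expected: a space of dimension 0
print(Ap.right_kernel())  # expected: a space of dimension 0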
Let $S = \{ \bu_1, \ldots, \bu_k \}$ be a set of vectors in $\mathbb{R}^n$.
In some special cases, we may use the following process to show that $S$ is linearly independent.
Suppose $c_1\bu_1 + \cdots + c_k\bu_k = \bzero$.
If the $i$-th entry is nonzero in exactly one of the remaining vectors, say $\bu_j$, then comparing the $i$-th entries of both sides forces $c_j = 0$; we may then drop $\bu_j$ and repeat.
If every coefficient is eventually forced to be zero, then $S$ is linearly independent.
This process is called zero forcing.
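Below is a small Sage/Python sketch of this process; the function name and interface are my own, not from the original text. It repeatedly looks for an entry at which exactly one still-active vector is nonzero and forces the corresponding coefficient to be zero.

### code
def zero_forcing_independent(vectors):
    """Return True if zero forcing shows the vectors are linearly independent,
    or False if the process gets stuck first."""
    active = list(range(len(vectors)))  # coefficients not yet forced to zero
    progress = True
    while active and progress:
        progress = False
        for i in range(len(vectors[0])):
            nonzero = [j for j in active if vectors[j][i] != 0]
            if len(nonzero) == 1:  # only one active vector is nonzero at entry i,
                active.remove(nonzero[0])  # so its coefficient must be zero
                progress = True
    return not active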
Use zero forcing to show that the rows of
$$
A = \begin{bmatrix}
1 & 0 & 0 & 0 & 1 \\
1 & 1 & 1 & 0 & 1 \\
1 & 1 & 1 & 1 & 1 \\
1 & 1 & 0 & 0 & 1
\end{bmatrix}
$$
form a linearly independent set.
Verify that the rows of
$$
A = \begin{bmatrix}
1 & 1 & 1 & 1 \\
1 & 2 & 4 & 8 \\
1 & 3 & 9 & 27
\end{bmatrix}
$$
form a linearly independent set, but that zero forcing fails to show the independence.
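As a sanity check on these two exercises, one may run the hypothetical zero_forcing_independent sketch from above on the rows of both matrices.

### code
A1 = matrix(QQ, [
    [1, 0, 0, 0, 1],
    [1, 1, 1, 0, 1],
    [1, 1, 1, 1, 1],
    [1, 1, 0, 0, 1],
])
A2 = matrix(QQ, [
    [1, 1, 1, 1],
    [1, 2, 4, 8],
    [1, 3, 9, 27],
])
print(zero_forcing_independent(A1.rows()))  # True: zero forcing succeeds
print(zero_forcing_independent(A2.rows()))  # False: zero forcing gets stuck
print(A2.rank() == A2.nrows())  # True: the rows of A2 are nevertheless independent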
Use zero forcing to show that $\beta_L$ is linearly independent.
Let
$$
A = \begin{bmatrix}
1 & 0 & 0 & 0 & 0 \\
-1 & 1 & 1 & 0 & 0 \\
0 & -1 & 0 & 1 & 0 \\
0 & 0 & -1 & -1 & 1 \\
0 & 0 & 0 & 0 & -1
\end{bmatrix}
$$
and let $S = \{ \bu_1, \ldots, \bu_5 \}$ be the columns of $A$.
Let $A'$ be the matrix obtained from $A$ by switching the first and the fourth columns.
Thus, $\Col(A) = \Col(A')$.
Find the $\beta_C$ computed from $A$ and the one computed from $A'$.
Are they the same?
(This example shows that if we want to make sure a particular nonzero column is selected into the basis of the column space, we may simply put it in the first column and then run Gaussian elimination.)
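A quick Sage check of this phenomenon (column indices below are 0-based, so switching the first and the fourth columns means switching indices 0 and 3):

### code
A = matrix(QQ, [
    [1, 0, 0, 0, 0],
    [-1, 1, 1, 0, 0],
    [0, -1, 0, 1, 0],
    [0, 0, -1, -1, 1],
    [0, 0, 0, 0, -1],
])
Aprime = A.matrix_from_columns([3, 1, 2, 0, 4])  # switch the first and fourth columns
show(A.matrix_from_columns(A.pivots()))            # beta_C computed from A
show(Aprime.matrix_from_columns(Aprime.pivots()))  # beta_C computed from A'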