Gram–Schmidt orthogonalization
This work by Jephian Lin is licensed under a Creative Commons Attribution 4.0 International License.
$\newcommand{\trans}{^\top} \newcommand{\adj}{^{\rm adj}} \newcommand{\cof}{^{\rm cof}} \newcommand{\inp}[2]{\left\langle#1,#2\right\rangle} \newcommand{\dunion}{\mathbin{\dot\cup}} \newcommand{\bzero}{\mathbf{0}} \newcommand{\bone}{\mathbf{1}} \newcommand{\ba}{\mathbf{a}} \newcommand{\bb}{\mathbf{b}} \newcommand{\bc}{\mathbf{c}} \newcommand{\bd}{\mathbf{d}} \newcommand{\be}{\mathbf{e}} \newcommand{\bh}{\mathbf{h}} \newcommand{\bp}{\mathbf{p}} \newcommand{\bq}{\mathbf{q}} \newcommand{\br}{\mathbf{r}} \newcommand{\bx}{\mathbf{x}} \newcommand{\by}{\mathbf{y}} \newcommand{\bz}{\mathbf{z}} \newcommand{\bu}{\mathbf{u}} \newcommand{\bv}{\mathbf{v}} \newcommand{\bw}{\mathbf{w}} \newcommand{\tr}{\operatorname{tr}} \newcommand{\nul}{\operatorname{null}} \newcommand{\rank}{\operatorname{rank}} %\newcommand{\ker}{\operatorname{ker}} \newcommand{\range}{\operatorname{range}} \newcommand{\Col}{\operatorname{Col}} \newcommand{\Row}{\operatorname{Row}} \newcommand{\spec}{\operatorname{spec}} \newcommand{\vspan}{\operatorname{span}} \newcommand{\Vol}{\operatorname{Vol}} \newcommand{\sgn}{\operatorname{sgn}} \newcommand{\idmap}{\operatorname{id}} \newcommand{\am}{\operatorname{am}} \newcommand{\gm}{\operatorname{gm}} \newcommand{\mult}{\operatorname{mult}} \newcommand{\iner}{\operatorname{iner}}$
from lingeo import random_good_matrix
from linspace import QR, projection_on_vector
Suppose $U$ is an inner product space.
Then every subspace of $U$ has an orthonormal basis.
Indeed, there is an algorithm for converting a basis into an orthonormal basis.
This process is referred to as orthogonalization.
Let $V$ be a subspace of $U$ and $S = \{\bu_1, \ldots, \bu_d\}$ a basis of $V$.
Taking $S$ as the input, the following steps generate an orthonormal basis $\hat{S} = \{\hat\bu_1, \ldots, \hat\bu_d\}$ such that $V = \vspan(S) = \vspan(\hat{S})$.
1. Let $\hat\bu_1 = \bu_1$.
2. Let $V_{k-1} = \vspan(\{\hat\bu_1, \ldots, \hat\bu_{k-1}\})$. Then $\hat\bu_{k} = \bu_{k} - \operatorname{proj}_{V_{k-1}}(\bu_{k})$, where $\operatorname{proj}_{V_{k-1}}(\bu_k)$ is the projection of $\bu_k$ onto the subspace $V_{k-1}$.
3. Repeat the previous step for $k = 2,\ldots, d$.
4. Normalize each vector of $\{\hat\bu_1, \ldots, \hat\bu_d\}$ to length one.
This algorithm is called the Gram–Schmidt process.
Let $\operatorname{proj}_\bv(\bu)$ be the projection of $\bu$ onto the line spanned by $\bv$.
Notice that
$$\begin{aligned}
\bu_k &= \hat\bu_k + \operatorname{proj}_{V_{k-1}}(\bu_k) \\
&= \operatorname{proj}_{\hat\bu_1}(\bu_k) + \cdots + \operatorname{proj}_{\hat\bu_{k-1}}(\bu_k) + \hat\bu_k
\end{aligned}$$
since $\{\hat\bu_1,\ldots,\hat\bu_{k-1}\}$ is orthogonal.
Therefore, $\bu_k \in \vspan(\{\hat\bu_1, \ldots, \hat\bu_k\})$ for each $k$.
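As an illustration of these steps, here is a minimal Sage sketch of the (unnormalized) Gram–Schmidt process; the helper proj below is an assumed stand-in for the course's projection_on_vector(u, v).

### code
# a minimal sketch of the Gram–Schmidt process described above
def proj(u, v):
    # projection of u onto the line spanned by v
    return (u.inner_product(v) / v.inner_product(v)) * v

def gram_schmidt(vectors):
    # orthogonalize a list of linearly independent Sage vectors
    ortho = []
    for u in vectors:
        # subtract the projections onto the vectors found so far;
        # their sum equals proj_{V_{k-1}}(u) since those vectors are orthogonal
        u_hat = u - sum((proj(u, w) for w in ortho), 0 * u)
        ortho.append(u_hat)
    return ortho

# example: an orthogonal (not yet normalized) basis from three vectors in R^3
u1, u2, u3 = vector([1, 1, 0]), vector([1, 0, 1]), vector([0, 1, 1])
print(gram_schmidt([u1, u2, u3]))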
Let $Q$ be the matrix whose columns are vectors in $\hat{S}$
and $A$ the matrix whose columns are vectors in $S$.
Then $A = QR$ for some upper triangular matrix $R$.
This decomposition is called the $QR$ decomposition of $A$.
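For example, if the vectors in $\hat{S}$ have been normalized, then each $\bu_k$ lies in $\vspan(\{\hat\bu_1, \ldots, \hat\bu_k\})$ and
$$\bu_k = \inp{\bu_k}{\hat\bu_1}\hat\bu_1 + \cdots + \inp{\bu_k}{\hat\bu_k}\hat\bu_k,$$
so the $k$-th column of $R$ consists of the coefficients $\inp{\bu_k}{\hat\bu_1}, \ldots, \inp{\bu_k}{\hat\bu_k}$ followed by zeros; this is why $R$ can be chosen upper triangular.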
Let $Q$ be an $m\times n$ matrix.
Then the following are equivalent:
1. The columns of $Q$ form an orthonormal basis of $\Col(Q)$.
2. $Q\trans Q = I_n$.

A matrix $Q$ satisfying one of these conditions is called a column-orthogonal matrix.
Similarly, $Q$ is row-orthogonal if the rows of $Q$ form an orthonormal basis of $\Row(Q)$, or equivalently, $QQ\trans = I_m$.
When $Q$ is an $n\times n$ matrix, $Q$ is column-orthogonal if and only if it is row-orthogonal.
Such a matrix is called an orthogonal matrix.
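For instance, the $3\times 2$ matrix below is column-orthogonal but not row-orthogonal, as the following quick Sage check illustrates.

### code
# a column-orthogonal matrix that is not row-orthogonal
Q = matrix(QQ, [[3/5, 4/5],
                [4/5, -3/5],
                [0, 0]])
print(Q.transpose() * Q == identity_matrix(2))   # True: the columns are orthonormal
print(Q * Q.transpose() == identity_matrix(3))   # False: the rows are not orthonormal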
As a side note, Sage also provides a built-in method A.QR() that computes a $QR$ decomposition of a matrix A.
Run the code below. Let $S = \{\bu_1,\bu_2,\bu_3\}$ be the set of columns of $A$. Use the following formulas to carry out the calculation:
$$\begin{aligned} \hat\bu_1 &= \bu_1, \\ \hat\bu_2 &= \bu_2 - \operatorname{proj}_{\hat\bu_1}(\bu_2), \\ \hat\bu_3 &= \bu_3 - \operatorname{proj}_{\hat\bu_1}(\bu_3) - \operatorname{proj}_{\hat\bu_2}(\bu_3). \end{aligned}$$
### code
set_random_seed(0)
print_ans = False
m,n,r = 4,3,3
A = random_good_matrix(m,n,r,bound=1)
Q, R = QR(A)
u1,u2,u3 = A.transpose()
uh1 = u1
# uh2 = u2 - projection_on_vector(u2, uh1)
# uh3 = ...
print("A =")
show(A)
if print_ans:
    print("u-hat_i are the columns of")
    show(Q)
    D = diagonal_matrix([q.norm() for q in Q.transpose()])
    print("An orthonormal basis can be the columns of")
    show(Q * D.inverse())
    print("The QR decomposition can be")
    print("Q =")
    show(Q * D.inverse())
    print("R =")
    show(D.inverse() * R)
Calculate $\hat{S} = \{ \hat\bu_1, \hat\bu_2, \hat\bu_3\}$. Note that you may use projection_on_vector(u, v) to calculate $\operatorname{proj}_\bv(\bu)$.
Is $\hat{S}$ orthogonal? Is it orthonormal? If it is not orthonormal, find the corresponding orthonormal basis.
Find a column-orthogonal matrix $Q$ and an upper triangular matrix $R$ such that $A = QR$.
Find an orthogonal basis for each of the following spaces. (You do not need to normalize the vectors.)
Find an orthogonal basis of the space $V = \{(x,y,z) : x + y + z = 0\}$.
Find an orthogonal basis of the space $V = \{(x,y,z,w) : x + y + z + w = 0\}$.
Let $S = \{\bu_1,\bu_2,\bu_3\}$ be a basis of the vector space $V$ and $A$ the matrix whose columns are the vectors in $S$. Let $\hat{S} = \{ \hat\bu_1, \hat\bu_2, \hat\bu_3\}$ be an orthonormal basis of $V$ and $Q$ the matrix whose columns are the vectors in $\hat{S}$. Suppose
$$ \begin{aligned} \bu_1 &= \hat\bu_1, \\ \bu_2 &= 2\hat\bu_1 + 3\hat\bu_2, \\ \bu_3 &= 4\hat\bu_1 + 5\hat\bu_2 + 6\hat\bu_3. \end{aligned} $$
Find an upper triangular matrix $R$ such that $A = QR$.
In the previous problem, $A$ and $Q$ have the same column space $V$. Consequently, for any vector, the projection onto $\Col(A)$ is the same as the projection onto $\Col(Q)$. Show that $A(A\trans A)^{-1}A\trans = Q(Q\trans Q)^{-1}Q\trans$.
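Before giving an argument, one can run a quick numerical sanity check. The sketch below is only an illustration; it assumes the random_good_matrix and QR helpers imported at the top of this section behave as in the earlier code cell.

### code
# numerical sanity check (not a proof) of the projection identity above
set_random_seed(2)                      # arbitrary seed
A = random_good_matrix(4, 3, 3, bound=1)
Q, R = QR(A)
P_A = A * (A.transpose() * A).inverse() * A.transpose()
P_Q = Q * (Q.transpose() * Q).inverse() * Q.transpose()
print(P_A == P_Q)                       # expected: True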
Write a program that takes a matrix $A$ as input and outputs its $QR$ decomposition.