特徵多項式¶

Characteristic polynomial

Creative Commons License
This work by Jephian Lin is licensed under a Creative Commons Attribution 4.0 International License.

$\newcommand{\trans}{^\top} \newcommand{\adj}{^{\rm adj}} \newcommand{\cof}{^{\rm cof}} \newcommand{\inp}[2]{\left\langle#1,#2\right\rangle} \newcommand{\dunion}{\mathbin{\dot\cup}} \newcommand{\bzero}{\mathbf{0}} \newcommand{\bone}{\mathbf{1}} \newcommand{\ba}{\mathbf{a}} \newcommand{\bb}{\mathbf{b}} \newcommand{\bc}{\mathbf{c}} \newcommand{\bd}{\mathbf{d}} \newcommand{\be}{\mathbf{e}} \newcommand{\bh}{\mathbf{h}} \newcommand{\bp}{\mathbf{p}} \newcommand{\bq}{\mathbf{q}} \newcommand{\br}{\mathbf{r}} \newcommand{\bx}{\mathbf{x}} \newcommand{\by}{\mathbf{y}} \newcommand{\bz}{\mathbf{z}} \newcommand{\bu}{\mathbf{u}} \newcommand{\bv}{\mathbf{v}} \newcommand{\bw}{\mathbf{w}} \newcommand{\tr}{\operatorname{tr}} \newcommand{\nul}{\operatorname{null}} \newcommand{\rank}{\operatorname{rank}} %\newcommand{\ker}{\operatorname{ker}} \newcommand{\range}{\operatorname{range}} \newcommand{\Col}{\operatorname{Col}} \newcommand{\Row}{\operatorname{Row}} \newcommand{\spec}{\operatorname{spec}} \newcommand{\vspan}{\operatorname{span}} \newcommand{\Vol}{\operatorname{Vol}} \newcommand{\sgn}{\operatorname{sgn}} \newcommand{\idmap}{\operatorname{id}} \newcommand{\am}{\operatorname{am}} \newcommand{\gm}{\operatorname{gm}} \newcommand{\mult}{\operatorname{mult}} \newcommand{\iner}{\operatorname{iner}}$

In [ ]:
from lingeo import random_int_list, random_good_matrix

Main idea¶

We have seen the power of matrix diagonalization and how it relies on the equation

$$ A\bv = \lambda \bv. $$

To find the eigenvalues and the eigenvectors, we may rewrite the equation as $A\bv = \lambda I\bv$ and obtain

$$ (A - \lambda I) \bv = \bzero, $$

which means

  • $\lambda$ is an eigenvalue if and only if $A - \lambda I$ is singular; and
  • when $\lambda$ is an eigenvalue, $\bv$ can be any nonzero vector in $\ker(A - \lambda I)$.

We also know the determinant can be used to detect singularity.
Therefore, $\lambda$ is an eigenvalue of $A$ if and only if $\det(A - \lambda I) = 0$.

If we compute $p_A(x) = \det(A - x I)$, then $\lambda$ is an eigenvalue of $A$ if and only if $\lambda$ is a root of $p_A(x)$.

We call $p_A(x)$ the characteristic polynomial of $A$, and the multiset of its roots is called the spectrum of $A$, denoted by $\spec(A)$.

Note that even if $A$ is a real matrix, $\spec(A)$ might contain complex numbers.
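As a quick sanity check, we can carry out this computation symbolically. The sketch below uses SymPy rather than the course's Sage/`lingeo` helpers, with an arbitrarily chosen $2\times 2$ matrix: it computes $p_A(x) = \det(A - xI)$ and reads off $\spec(A)$ from the roots.

```python
import sympy as sp

x = sp.symbols("x")
A = sp.Matrix([[2, 1],
               [1, 2]])

# characteristic polynomial p_A(x) = det(A - x I)
p_A = (A - x * sp.eye(2)).det().expand()
print(p_A)  # x**2 - 4*x + 3

# spectrum of A = multiset of roots of p_A;
# sp.roots maps each root to its multiplicity
spectrum = sp.roots(p_A, x)
print(spectrum)
```

Here the spectrum is $\{1, 3\}$, each root with multiplicity one.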

Note that if $A$ and $B$ are similar by $B = Q^{-1}AQ$, then $p_A(x) = p_B(x)$ since

$$ \begin{aligned} p_B(x) &= \det(Q^{-1}AQ - xI) \\ &= \det(Q^{-1}AQ - Q^{-1}(xI)Q) \\ &= \det(Q^{-1}(A - xI)Q) \\ &= \det(Q^{-1})\det(A - xI)\det(Q) \\ &= p_A(x). \end{aligned} $$

Therefore, the characteristic polynomial of a linear function $f:V\rightarrow V$ can be defined as $p_f(x) = \det([f]_\beta^\beta - xI)$ for any basis $\beta$ of $V$, since the result does not depend on the choice of $\beta$. The multiset of its roots is called the spectrum of $f$, denoted by $\spec(f)$.
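The invariance under similarity can be verified directly. The following SymPy sketch (again independent of the course's Sage code; the invertible matrix $Q$ is chosen arbitrarily for illustration) checks that $p_B(x) = p_A(x)$ when $B = Q^{-1}AQ$.

```python
import sympy as sp

x = sp.symbols("x")
A = sp.Matrix([[2, 1],
               [1, 2]])
Q = sp.Matrix([[1, 1],
               [0, 1]])  # any invertible matrix works

B = Q.inv() * A * Q  # B is similar to A

p_A = (A - x * sp.eye(2)).det().expand()
p_B = (B - x * sp.eye(2)).det().expand()

# similar matrices have the same characteristic polynomial
assert p_A == p_B
print(p_A)
```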

Side stories¶

  • continuity argument
  • differentiation

Experiments¶

Exercise 1¶

執行以下程式碼。

Run the code below.

In [ ]:
### code
set_random_seed(0)
print_ans = False

n = 2
spec = random_int_list(n, 3)
D = diagonal_matrix(spec)
Q = random_good_matrix(n,n,n)
A = Q * D * Q.inverse()

pretty_print(LatexExpr("A ="), A)

if print_ans:
    pA = (-1)^n * A.charpoly()  # Sage's charpoly() is det(xI - A); the sign factor gives det(A - xI)
    print("characteristic polynomial =", pA)
    print("spectrum is the set { " + ", ".join("%s"%val for val in spec) + " }")
Exercise 1(a)¶

計算 $A$ 的特徵多項式。

Find the characteristic polynomial of $A$.

Exercise 1(b)¶

計算 $\spec(A)$。

Find $\spec(A)$.

Exercises¶

Exercise 2¶

令

$$ A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} $$

且 $\bv = (x, y)$。
把 $A\bv = \lambda \bv$ 和 $(A - \lambda I)\bv = \bzero$ 分別寫成 $x,y$ 的聯立方程組,
並說明它們等價。

Let

$$ A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} $$

and $\bv = (x, y)$. Write each of $A\bv = \lambda \bv$ and $(A - \lambda I)\bv = \bzero$ as a system of linear equations in $x$ and $y$. Then show that the two systems are equivalent.

Exercise 3¶

計算以下矩陣 $A$ 的特徵多項式以及 $\spec(A)$。

Find the characteristic polynomial and $\spec(A)$ for each of the following matrices $A$.

Exercise 3(a)¶
$$ A = \begin{bmatrix} 5 & -1 \\ -1 & 5 \end{bmatrix}. $$
Exercise 3(b)¶
$$ A = \begin{bmatrix} 0 & 1 \\ -6 & 5 \end{bmatrix}. $$
Exercise 3(c)¶
$$ A = \begin{bmatrix} 0 & 1 \\ -4 & 0 \end{bmatrix}. $$
Exercise 3(d)¶
$$ A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}. $$
Exercise 4¶

計算以下矩陣 $A$ 的特徵多項式以及 $\spec(A)$。

Find the characteristic polynomial and $\spec(A)$ for each of the following matrices $A$.

Exercise 4(a)¶
$$ A = \begin{bmatrix} 4 & 0 & -1 \\ 0 & 4 & -1 \\ -1 & -1 & 5 \end{bmatrix}. $$
Exercise 4(b)¶
$$ A = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 6 & -11 & 6 \end{bmatrix}. $$
Exercise 4(c)¶
$$ A = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{bmatrix}. $$
Exercise 5¶

令 $J_n$ 為 $n\times n$ 的全 $1$ 矩陣。
求 $J_n$ 的特徵多項式及 $\spec(J_n)$。

提示:先用列運算把所有列加到第一列。

Let $J_n$ be the $n\times n$ all-ones matrix. Find the characteristic polynomial of $J_n$ and $\spec(J_n)$.
Hint: You may apply the row operations that add each of the other rows to the first row.

Exercise 6¶

令 $A$ 和 $B$ 分別為 $m\times n$ 及 $n\times m$ 矩陣,並令

$$ M = \begin{bmatrix} O_{n,n} & B \\ A & O_{m,m} \end{bmatrix}. $$

Let $A$ and $B$ be $m\times n$ and $n\times m$ matrices, respectively, and

$$ M = \begin{bmatrix} O_{n,n} & B \\ A & O_{m,m} \end{bmatrix}. $$
Exercise 6(a)¶

假設 $x \neq 0$,參考 408-5 計算

$$ \det\begin{bmatrix} -xI_n & B \\ A & -xI_m \end{bmatrix}. $$

Suppose $x \neq 0$. Use the techniques in 408-5 to find

$$ \det\begin{bmatrix} -xI_n & B \\ A & -xI_m \end{bmatrix}. $$
Exercise 6(b)¶

利用行列式值的連續性來補足 $x = 0$ 的狀況。
求 $M$ 的特徵多項式。

By the continuity of the determinant, show that your answer in the previous problem also holds when $x = 0$. Then find the characteristic polynomial of $M$.

Exercise 6(c)¶

若 $m\geq n$,證明 $AB$ 和 $BA$ 有相同的非零特徵值(計入重數)。

Suppose $m\geq n$. Show that $AB$ and $BA$ have the same nonzero eigenvalues, counted with multiplicity.

Remark¶

以上的手法稱作連續性論證(continuity argument)。
由於矩陣不可逆的情況只發生在行列式值為零的時候,
一般來說我們都可以先假設矩陣可逆,再看看是否能用連續性處理不可逆的情況。

The above method is an example of a continuity argument. Since a matrix is singular only when its determinant is $0$, we may often first assume the matrices involved are invertible and then extend the conclusion to the singular case by continuity.

Exercise 7¶

令 $J_{m,n}$ 為 $m\times n$ 的全 $1$ 矩陣,而

$$ A = \begin{bmatrix} O_{n,n} & J_{n,m} \\ J_{m,n} & O_{m,m} \end{bmatrix}. $$

求 $A$ 的特徵多項式及 $\spec(A)$。

Let $J_{m,n}$ be the $m\times n$ all-ones matrix and

$$ A = \begin{bmatrix} O_{n,n} & J_{n,m} \\ J_{m,n} & O_{m,m} \end{bmatrix}. $$

Find the characteristic polynomial of $A$ and $\spec(A)$.

Exercise 8¶

令 $V$ 為 $\mathbb{R}^3$ 中的一個二維空間,
而 $f:\mathbb{R}^3 \rightarrow \mathbb{R}^3$ 將向量 $\bv\in\mathbb{R}^3$ 投影到 $V$ 上。
求 $f$ 的特徵多項式。

Let $V$ be a $2$-dimensional subspace of $\mathbb{R}^3$, and let $f:\mathbb{R}^3 \rightarrow \mathbb{R}^3$ be the function that sends each vector $\bv\in\mathbb{R}^3$ to its projection onto $V$. Find the characteristic polynomial of $f$.

Exercise 9¶

令 $V$ 為 $\mathbb{R}^3$ 中的一個二維空間,
而 $f:\mathbb{R}^3 \rightarrow \mathbb{R}^3$ 將向量 $\bv\in\mathbb{R}^3$ 鏡射到 $V$ 的對面。
求 $f$ 的特徵多項式。

Let $V$ be a $2$-dimensional subspace of $\mathbb{R}^3$, and let $f:\mathbb{R}^3 \rightarrow \mathbb{R}^3$ be the reflection that sends each vector $\bv\in\mathbb{R}^3$ to its mirror image on the other side of $V$. Find the characteristic polynomial of $f$.

Exercise 10¶

令 $A$ 為一 $n\times n$ 矩陣。
對 $i = 1,\ldots,n$,令 $A(i)$ 為將 $A$ 的第 $i$ 行及第 $i$ 列拿掉所得的子矩陣。

證明

$$ \frac{dp_A(x)}{dx} = -\sum_{i = 1}^n p_{A(i)}(x). $$

提示:
在計算 $\det(A - xI)$ 時可以先把裡面的 $n$ 個 $x$ 當作獨立的變數 $x_1,\ldots, x_n$。
接下來搭配 409-2 及連鎖律

$$ \frac{dp_A(x)}{dx} = \sum_{i = 1}^n \frac{\partial p_A(x)}{\partial x_i} \frac{d x_i}{dx} $$

來計算微分。

Let $A$ be an $n\times n$ matrix. For $i = 1,\ldots,n$, let $A(i)$ be the matrix obtained from $A$ by removing its $i$-th row and column.

Show that

$$ \frac{dp_A(x)}{dx} = -\sum_{i = 1}^n p_{A(i)}(x). $$

Hint: You may consider the $n$ occurrences of $x$ in $\det(A - xI)$ as independent variables $x_1,\ldots, x_n$. Then use 409-2 and the chain rule

$$ \frac{dp_A(x)}{dx} = \sum_{i = 1}^n \frac{\partial p_A(x)}{\partial x_i} \frac{d x_i}{dx} $$

to find its derivative.