代數圖論觀點¶

Interpretation through graph theory

Creative Commons License
This work by Jephian Lin is licensed under a Creative Commons Attribution 4.0 International License.

$\newcommand{\trans}{^\top} \newcommand{\adj}{^{\rm adj}} \newcommand{\cof}{^{\rm cof}} \newcommand{\inp}[2]{\left\langle#1,#2\right\rangle} \newcommand{\dunion}{\mathbin{\dot\cup}} \newcommand{\bzero}{\mathbf{0}} \newcommand{\bone}{\mathbf{1}} \newcommand{\ba}{\mathbf{a}} \newcommand{\bb}{\mathbf{b}} \newcommand{\bc}{\mathbf{c}} \newcommand{\bd}{\mathbf{d}} \newcommand{\be}{\mathbf{e}} \newcommand{\bh}{\mathbf{h}} \newcommand{\bp}{\mathbf{p}} \newcommand{\bq}{\mathbf{q}} \newcommand{\br}{\mathbf{r}} \newcommand{\bx}{\mathbf{x}} \newcommand{\by}{\mathbf{y}} \newcommand{\bz}{\mathbf{z}} \newcommand{\bu}{\mathbf{u}} \newcommand{\bv}{\mathbf{v}} \newcommand{\bw}{\mathbf{w}} \newcommand{\tr}{\operatorname{tr}} \newcommand{\nul}{\operatorname{null}} \newcommand{\rank}{\operatorname{rank}} %\newcommand{\ker}{\operatorname{ker}} \newcommand{\range}{\operatorname{range}} \newcommand{\Col}{\operatorname{Col}} \newcommand{\Row}{\operatorname{Row}} \newcommand{\spec}{\operatorname{spec}} \newcommand{\vspan}{\operatorname{span}} \newcommand{\Vol}{\operatorname{Vol}} \newcommand{\sgn}{\operatorname{sgn}} \newcommand{\idmap}{\operatorname{id}} \newcommand{\am}{\operatorname{am}} \newcommand{\gm}{\operatorname{gm}} \newcommand{\mult}{\operatorname{mult}} \newcommand{\iner}{\operatorname{iner}}$

In [ ]:
from lingeo import random_int_list
from gnm import matrix_digraph, effective_permutations, illustrate_det

Main idea¶

In graph theory, a (simple) graph is a pair $G = (V(G),E(G))$ such that $E(G)$ is a set of $2$-subsets of $V(G)$.
An element in $V(G)$ is called a vertex, while an element in $E(G)$ is called an edge.

A directed graph (or digraph) is a pair $\Gamma = (V(\Gamma),E(\Gamma))$ such that $E(\Gamma)$ is a set of $2$-tuples $(i,j)$ with $i,j\in V(\Gamma)$.
An element in $E(\Gamma)$ is called a directed edge.
A directed edge of the form $(i,i)$ is called a loop on vertex $i$.

A weighted graph is a graph $G$ together with a function $w:E(G) \rightarrow \mathbb{R}$;
the value $w(e)$ is called the weight of the edge $e$.
When $e = \{i,j\}$, we often write $w(i,j)$ for $w(e)$.
A weighted digraph is defined similarly.

When $A = \begin{bmatrix} a_{ij} \end{bmatrix}$ is an $n\times n$ matrix, the weighted digraph of $A$ is the weighted digraph $\Gamma$ with

  • $V(\Gamma) = [n]$,
  • $E(\Gamma) = \{(i,j): a_{ij} \neq 0 \}$, and
  • $w(i,j) = a_{ij}$,

denoted by $\Gamma(A) = \Gamma$.
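
As a small illustration, the following sketch builds $\Gamma(A)$ for a concrete $3\times 3$ matrix directly from the definition.  It assumes Sage's built-in DiGraph constructor with a list of weighted edges; the helper matrix_digraph imported above presumably draws the same picture.

In [ ]:
### a minimal sketch: build Gamma(A) from the definition for a concrete 3x3 matrix
A = matrix(3, [2, 1, 0,
               0, 3, 1,
               1, 0, 4])
# one directed edge (i,j) with weight a_ij for every nonzero entry;
# vertices are shifted to 1, ..., n so that V(Gamma) = [n]
edges = [(i + 1, j + 1, A[i, j]) for i in range(3) for j in range(3) if A[i, j] != 0]
Gamma = DiGraph(edges, loops=True, format="list_of_edges")
Gamma.show(edge_labels=True)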

Let $\Gamma$ be a weighted digraph on $n$ vertices.
An elementary subgraph of $\Gamma$ is a subgraph $H$ of $\Gamma$ such that

  • $V(H) = V(\Gamma)$,
  • $E(H) \subseteq E(\Gamma)$, and
  • every vertex of $H$ has exactly one directed edge leaving it and exactly one directed edge arriving at it.

The set of all elementary subgraphs of $\Gamma$ is denoted by $\mathfrak{E}(\Gamma)$.

Every elementary subgraph is composed of disjoint cycles.
Let $c(H)$ be the number of these cycles, counting loops, doubly directed edges (cycles of length $2$), and cycles of length at least $3$.
The sign of $H$ is defined as $\sgn(H) = (-1)^{n + c(H)}$.
The weight of $H$, denoted $w(H)$, is defined as the product of the weights of all edges of $H$.
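
For example, if $n = 3$ and $H$ consists of the loop on vertex $1$ together with the $2$-cycle on vertices $2$ and $3$, then $c(H) = 2$, so $\sgn(H) = (-1)^{3+2} = -1$ and $w(H) = a_{11}a_{23}a_{32}$.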

Permutation expansion (graph theory version)¶

Let $A$ be an $n\times n$ matrix and $\Gamma$ its weighted digraph.
Then

$$ \det(A) = \sum_{H\in\mathfrak{E}(\Gamma)} \sgn(H) w(H). $$
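
For example, when $A$ is a $2\times 2$ matrix with all entries nonzero, $\Gamma(A)$ has exactly two elementary subgraphs: the two loops, with $c(H) = 2$, $\sgn(H) = (-1)^{2+2} = 1$, and $w(H) = a_{11}a_{22}$; and the $2$-cycle on vertices $1$ and $2$, with $c(H) = 1$, $\sgn(H) = (-1)^{2+1} = -1$, and $w(H) = a_{12}a_{21}$.  The formula then recovers

$$ \det(A) = a_{11}a_{22} - a_{12}a_{21}. $$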

Side stories¶

  • adjacency matrix
  • companion matrix

Experiments¶

Exercise 1¶

執行以下程式碼。

Run the code below.

In [ ]:
### code
set_random_seed(0)
print_ans = False

n = 6
b = 18
while True:
    # n^2 entries in total, b of which are forced to be zero
    l = random_int_list(n^2 - b, 3) + [0]*b
    shuffle(l)
    A = matrix(n, l)
    # keep a matrix with 3 to 8 effective permutations
    # (i.e., nonzero terms in the permutation expansion)
    perms = effective_permutations(A)
    if 3 <= len(perms) <= 8:
        break

pretty_print(LatexExpr("A ="), A)

if print_ans:
    Gamma = matrix_digraph(A)
    Gamma.show(edge_labels=True)
    illustrate_det(A)
    print("det(A) =", A.det())
Exercise 1(a)¶

畫出 $\Gamma(A)$ 並標上每條邊的權重。

Draw $\Gamma(A)$ and mark the weight on each edge.

Exercise 1(b)¶

求出所有的基本子圖,計算其正負號以及權重。

Find all elementary subgraphs of $\Gamma(A)$ and compute their signs and weights.

Exercise 1(c)¶

計算 $\det(A)$。

Find $\det(A)$.

Exercises¶

Exercise 2¶

令

$$ A = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 1 & 0 & 1 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix} $$

且 $\Gamma$ 為其賦權有向圖。
找出所有 $\Gamma$ 的基本子圖,並藉此求出 $\det(A)$。

Let

$$ A = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 1 & 0 & 1 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix} $$

and $\Gamma$ its weighted directed graph. Find all elementary subgraphs of $\Gamma$ and use them to find $\det(A)$.
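
After working this out by hand, the answer can be checked by computer; the sketch below assumes matrix_digraph and illustrate_det behave as in the code of Exercise 1.

In [ ]:
### sketch: check Exercise 2 by computer
A = matrix(4, [0, 1, 0, 0,
               1, 0, 1, 1,
               0, 1, 0, 1,
               0, 1, 1, 0])
Gamma = matrix_digraph(A)
Gamma.show(edge_labels=True)
illustrate_det(A)
print("det(A) =", A.det())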

Exercise 3¶

對以下的矩陣 $A$,
令 $\Gamma$ 為其賦權有向圖。
找出所有 $\Gamma$ 的基本子圖,並藉此求出 $\det(A)$。

For each of the following matrices, let $\Gamma$ be its weighted directed graph. Find all elementary subgraphs of $\Gamma$ and use them to find $\det(A)$.

Exercise 3(a)¶
$$ A = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}. $$
Exercise 3(b)¶
$$ A = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 0 \end{bmatrix}. $$
Exercise 3(c)¶

令 $A$ 為 $n\times n$ 矩陣

$$ A = \begin{bmatrix} 0 & 1 & 0 & \cdots & 0 \\ 1 & 0 & 1 & \ddots & \vdots \\ 0 & 1 & 0 & \ddots & 0 \\ \vdots & \ddots & \ddots & \ddots & 1 \\ 0 & \cdots & 0 & 1 & 0 \end{bmatrix}. $$

Let $A$ be the $n\times n$ matrix

$$ A = \begin{bmatrix} 0 & 1 & 0 & \cdots & 0 \\ 1 & 0 & 1 & \ddots & \vdots \\ 0 & 1 & 0 & \ddots & 0 \\ \vdots & \ddots & \ddots & \ddots & 1 \\ 0 & \cdots & 0 & 1 & 0 \end{bmatrix}. $$
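
To guess the pattern before proving it, one can compute $\det(A)$ for several small $n$; the sketch below assumes that graphs.PathGraph(n) in Sage labels the vertices consecutively along the path, so that its adjacency matrix is the matrix above.

In [ ]:
### sketch: determinant of the path adjacency matrix for small n
for n in range(1, 9):
    A = graphs.PathGraph(n).adjacency_matrix()
    print("n =", n, " det(A) =", A.det())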
Exercise 4¶

對以下的矩陣 $A$,
令 $\Gamma$ 為其賦權有向圖。
找出所有 $\Gamma$ 的基本子圖,並藉此求出 $\det(A)$。

For each of the following matrices, let $\Gamma$ be its weighted directed graph. Find all elementary subgraphs of $\Gamma$ and use them to find $\det(A)$.

Exercise 4(a)¶
$$ A = \begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{bmatrix}. $$
Exercise 4(b)¶
$$ A = \begin{bmatrix} 0 & 1 & 0 & 1 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \\ 1 & 0 & 1 & 0 \end{bmatrix}. $$
Exercise 4(c)¶

令 $A$ 為 $n\times n$ 矩陣

$$ A = \begin{bmatrix} 0 & 1 & 0 & \cdots & 0 & 1\\ 1 & 0 & 1 & \ddots & ~ & 0\\ 0 & 1 & 0 & \ddots & ~ & \vdots\\ \vdots & \ddots & \ddots & \ddots & 1 & 0 \\ 0 & ~ & ~ & 1 & 0 & 1 \\ 1 & 0 & \cdots & 0 & 1 & 0 \end{bmatrix}. $$

Let $A$ be the $n\times n$ matrix

$$ A = \begin{bmatrix} 0 & 1 & 0 & \cdots & 0 & 1\\ 1 & 0 & 1 & \ddots & ~ & 0\\ 0 & 1 & 0 & \ddots & ~ & \vdots\\ \vdots & \ddots & \ddots & \ddots & 1 & 0 \\ 0 & ~ & ~ & 1 & 0 & 1 \\ 1 & 0 & \cdots & 0 & 1 & 0 \end{bmatrix}. $$
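
A similar experiment suggests the answer here; the sketch assumes graphs.CycleGraph(n) labels the vertices consecutively around the cycle, so that its adjacency matrix is the matrix above.

In [ ]:
### sketch: determinant of the cycle adjacency matrix for small n
for n in range(3, 9):
    A = graphs.CycleGraph(n).adjacency_matrix()
    print("n =", n, " det(A) =", A.det())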
Exercise 5¶

令

$$ A = \begin{bmatrix} 0 & 0 & \cdots & 0 & a_n \\ 1 & 0 & ~ & \vdots & \vdots \\ 0 & \ddots & \ddots & ~ & ~ \\ \vdots & \ddots & ~ & 0 & a_2 \\ 0 & \cdots & 0 & 1 & a_1 \end{bmatrix}. $$

求 $\det(A)$。

Let

$$ A = \begin{bmatrix} 0 & 0 & \cdots & 0 & a_n \\ 1 & 0 & ~ & \vdots & \vdots \\ 0 & \ddots & \ddots & ~ & ~ \\ \vdots & \ddots & ~ & 0 & a_2 \\ 0 & \cdots & 0 & 1 & a_1 \end{bmatrix}. $$

Find $\det(A)$.
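
A small symbolic case can help before computing the determinant in general; the sketch below builds the matrix for $n = 4$ with symbolic entries (the variable names a1, ..., a4 are chosen here only for illustration).

In [ ]:
### sketch: the matrix of Exercise 5 for n = 4 with symbolic entries
n = 4
a = [var("a%d" % i) for i in range(1, n + 1)]   # a1, ..., a4
A = matrix(SR, n, n)
for i in range(1, n):
    A[i, i-1] = 1               # subdiagonal of ones
for i in range(n):
    A[i, n-1] = a[n-1-i]        # last column: a_n, ..., a_1 from top to bottom
show(A)
print("det(A) =", A.det())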

Exercise 6¶

令 $A$ 和 $B$ 為 $m\times n$ 矩陣且 $m\neq n$。
說明

$$ M = \begin{bmatrix} O_n & B\trans \\ A & O_m \end{bmatrix} $$

所對應到的賦權有向圖並不擁有任何基本子圖。
因此這樣的矩陣一定有 $\det(M) = 0$。

Let $A$ and $B$ be $m\times n$ matrices with $m\neq n$. Explain why the weighted directed graph of

$$ M = \begin{bmatrix} O_n & B\trans \\ A & O_m \end{bmatrix} $$

contains no elementary subgraph. Therefore, any matrix of this form has $\det(M) = 0$.
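
A quick numerical check of this fact (a sketch with randomly chosen integer blocks; the sizes m = 3 and n = 5 are arbitrary):

In [ ]:
### sketch: det(M) = 0 for random rectangular blocks A and B
m, n = 3, 5
A = random_matrix(ZZ, m, n)
B = random_matrix(ZZ, m, n)
M = block_matrix([[zero_matrix(ZZ, n, n), B.transpose()],
                  [A, zero_matrix(ZZ, m, m)]])
show(M)
print("det(M) =", M.det())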