These notes on linear algebra follow [@Horn & Johnson, 1990. Matrix Analysis.]

Scanned course notes: Notes on linear algebra; Notes on advanced algebra.

Symbols

  • $\bar{\mathbb{F}}$: an algebraically closed field, usually the algebraic closure of $\mathbb{F}$; e.g. $\bar{\mathbb{C}} = \mathbb{C}$, $\bar{\mathbb{R}} = \mathbb{C}$, and $\bar{\mathbb{Q}}$, the field of algebraic numbers;
  • $\mathbb{R}^n, \mathbb{C}^n$: $n$-dimensional vector spaces over $\mathbb{R}$ and $\mathbb{C}$;
  • $\{ e_1, \dots, e_n \}$: standard basis of $\mathbb{R}^n$ or $\mathbb{C}^n$;
  • $A$, $\mathcal{F}$: a matrix and a family of matrices;
  • $M_{m,n}(\mathbb{F})$, $M_n(\mathbb{F})$: the set of all $m$-by-$n$ / $n$-by-$n$ matrices over field $\mathbb{F}$, which defaults to $\mathbb{C}$;
  • $GL(n, \mathbb{F})$: the general linear group, formed by the nonsingular matrices in $M_n(\mathbb{F})$;
  • $\mathcal{U}$: the set of all unitary matrices, a compact subgroup of $GL(n, \mathbb{C})$;
  • $A \succeq 0$, $A \ge 0$: positive semi-definite matrix and (entrywise) nonnegative matrix;
  • $|A|$: the entrywise absolute value of a matrix;
  • $A^T, \bar{A}, A^*$: transpose, conjugate, and Hermitian adjoint (i.e. conjugate transpose);
  • $A^{-1}$: inverse of a nonsingular matrix;
  • $A^{1/2}$: the unique positive semi-definite square root of a positive semi-definite matrix;
  • $\sigma_i(A)$: the $i$-th largest singular value of a matrix;
  • $\lambda_i(A)$: the $i$-th eigenvalue of a square matrix, in increasing order if $A$ is Hermitian;
  • $\rho(A)$: spectral radius of a matrix, the largest modulus of its eigenvalues;
  • $\Sigma(A)$: the spectrum of a square matrix, the set of all its eigenvalues;
  • $\det A$: determinant of a square matrix;
  • $p_A(t), q_A(t)$: the characteristic / minimal polynomial of a square matrix;
  • $e_{k}(z_{1}, \dots , z_{n})$: the elementary symmetric functions.

Vector Space and $\mathbb{F}^n$

Vector space $V$, the fundamental setting for matrix theory: a set closed under linear combinations (addition and scalar multiplication) over a (scalar) field, e.g. $\mathbb{R}^n$, $\mathbb{C}^n$, $C^0[0,1]$. Scalar field (域) $\mathbb{F}$: a set that admits addition and multiplication, with inverses for both (except a multiplicative inverse for zero); e.g. $\mathbb{R}$, $\mathbb{C}$.

Subspace $U$: a subset of $V$ that is itself a vector space. Linear span (线性生成空间) $\mathrm{span}(S)$ of a set of vectors $S$: the set of all vectors that can be written as (finite) linear combinations of vectors in $S$.

Linear independence of a set of vectors: every nontrivial linear combination (coefficients not all zero) of the set is non-zero.

Basis $\mathcal{B}$: a linearly independent set such that $\mathrm{span}(\mathcal{B}) = V$. Dimension $\mathrm{dim}(V)$ of a vector space: the size of any basis, well-defined for finite-dimensional vector spaces because all bases have the same size.

Table: A set that is both linearly independent and spans $V$ is a basis.

| Number of elements | $1, \dots, \mathrm{dim}(V)$ | $\mathrm{dim}(V)$ | $\mathrm{dim}(V), \dots$ |
|--------------------|-----------------------------|-------------------|--------------------------|
| Possible sets      | linearly independent        | basis             | spanning $V$             |

Isomorphic (同构) vector spaces are those related by an invertible linear map (an isomorphism). For finite-dimensional vector spaces $U, V$ over the same field, $U \cong V \iff \dim U = \dim V$. For any basis $\mathcal{B}$ of an $n$-dimensional vector space $V$ over a field $\mathbb{F}$, the coordinate map $f: \mathbf{x} \mapsto [\mathbf{x}]_{\mathcal{B}}$ is an isomorphism between $V$ and $\mathbb{F}^n$. (This feature is analogous to how random variables map abstract outcomes to numbers.) It justifies the re-abstraction (second-order abstraction) from a vector space $V$ to $\mathbb{F}^n$, especially $\mathbb{R}^n$ and $\mathbb{C}^n$.
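
A minimal numpy sketch of the coordinate isomorphism (the basis and vectors below are arbitrary illustrative choices):

```python
import numpy as np

# An arbitrary basis of R^3, stacked as the columns of B.
b1, b2, b3 = np.array([1., 0., 0.]), np.array([1., 1., 0.]), np.array([1., 1., 1.])
B = np.column_stack([b1, b2, b3])

x = np.array([2., 3., 4.])
c = np.linalg.solve(B, x)      # coordinates [x]_B: the unique c with B @ c = x
assert np.allclose(B @ c, x)

# The coordinate map x -> [x]_B is linear: it preserves linear combinations.
y = np.array([-1., 0., 5.])
cy = np.linalg.solve(B, y)
assert np.allclose(np.linalg.solve(B, 2*x + 3*y), 2*c + 3*cy)
```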

Euclidean Space

Euclidean spaces generalize the two- and three-dimensional spaces of Euclidean geometry to arbitrary finite dimension. They abstract a geometric object into a topological and algebraic structure. Many concepts in algebra and analysis are developed by analogy with geometry, and Euclidean space is among the earliest such abstractions.

Euclidean space $\mathbb{R}^n$ is the set of all $n$-tuples of real numbers $(x_1, \dots, x_n)$, equipped with the inner product $\mathbf{x} \cdot \mathbf{y} = \sum_{i=1}^n x_i y_i$.

Length. Angle. Orthogonality.

Orthonormal basis. Gram-Schmidt orthogonalization/orthonormalization, QR decomposition. Orthogonal transformation: rotation, reflection; improper rotation.
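
A minimal sketch of classical Gram–Schmidt, assuming the input columns are linearly independent; `numpy.linalg.qr` computes the same factorization more stably:

```python
import numpy as np

def gram_schmidt(X):
    """Orthonormalize the columns of X (assumed linearly independent).

    Returns Q with orthonormal columns and upper-triangular R with X = Q @ R.
    """
    m, n = X.shape
    Q, R = np.zeros((m, n)), np.zeros((n, n))
    for j in range(n):
        v = X[:, j].copy()
        for i in range(j):                    # subtract projections onto q_1, ..., q_{j-1}
            R[i, j] = Q[:, i] @ X[:, j]
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

X = np.array([[1., 1.], [1., 0.], [0., 1.]])  # arbitrary example with independent columns
Q, R = gram_schmidt(X)
assert np.allclose(Q @ R, X)
assert np.allclose(Q.T @ Q, np.eye(2))        # orthonormal columns
```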

Space decomposition. Orthogonal subspace. Direct sum. Orthogonal complement.

Isometry (等距同构), or congruent transformation, is a distance-preserving bijection between metric spaces: given metric spaces $(X, d_X)$ and $(Y, d_Y)$, a bijection $f: X \to Y$ is an isometry iff $d_X(x_1, x_2) = d_Y(f(x_1), f(x_2))$ for all $x_1, x_2 \in X$. Two metric spaces are isometric iff there exists an isometry between them.

Theorem: any finite-dimensional inner-product space over the real numbers is isometric to the Euclidean space of the same dimension: $V \cong \mathbb{R}^n$, where the vector space $V$ over $\mathbb{R}$ has $\mathrm{dim}(V) = n$ and inner product $\langle \cdot, \cdot \rangle$.
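
A hedged numpy sketch of the theorem's content: for an inner product $\langle x, y \rangle = x^T G y$ on $\mathbb{R}^3$ given by an (arbitrarily chosen) positive definite Gram matrix $G$, the linear map $x \mapsto L^T x$, with $G = L L^T$ the Cholesky factorization, is an isometry onto Euclidean $\mathbb{R}^3$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
G = A @ A.T + 3 * np.eye(3)    # an (arbitrary) positive definite Gram matrix
L = np.linalg.cholesky(G)      # G = L @ L.T with L lower-triangular

x, y = rng.standard_normal(3), rng.standard_normal(3)
# The inner product <x, y> = x^T G y on V equals the Euclidean dot product of the images.
assert np.isclose(x @ G @ y, (L.T @ x) @ (L.T @ y))
```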

Linear Transformation and Matrix

A matrix is a rectangular array of scalars in $\mathbb{F}$; it can also be seen as a linear transformation between two vector spaces over $\mathbb{F}$ (relative to chosen bases).

Matrix multiplication is the composition of linear transformations, and is therefore associative; matrix addition is the addition of linear transformations.
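
A quick numpy check that the matrix product acts as the composition of the corresponding maps (shapes below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))   # a linear map R^3 -> R^2
B = rng.standard_normal((3, 4))   # a linear map R^4 -> R^3
x = rng.standard_normal(4)

# Applying B then A is the same as applying the single map A @ B.
assert np.allclose(A @ (B @ x), (A @ B) @ x)
```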

Scalar matrices are scalar multiples of the identity matrix.

Four Fundamental Subspaces

For a linear transformation $A \in M_{m,n}(\mathbb{R})$ with rank $r$ and singular value decomposition $A = U \Sigma V^T$, its four fundamental subspaces are summarized in the following table.

Table: Four fundamental subspaces of a linear transformation

|                  | Image               | Kernel              | Coimage               | Cokernel              |
|------------------|---------------------|---------------------|-----------------------|-----------------------|
| Alternative name | column space        | null space          | row space             | left null space       |
| Notation         | $\mathrm{im}(A)$    | $\mathrm{ker}(A)$   | $\mathrm{im}(A^T)$    | $\mathrm{ker}(A^T)$   |
| Alt notation     | $\mathrm{range}(A)$ | $\mathrm{null}(A)$  | $\mathrm{range}(A^T)$ | $\mathrm{null}(A^T)$  |
| Containing space | $\mathbb{R}^m$      | $\mathbb{R}^n$      | $\mathbb{R}^n$        | $\mathbb{R}^m$        |
| Dimension        | $r$                 | $n-r$               | $r$                   | $m-r$                 |
| Basis            | $\{u_i\}_{i=1}^r$   | $\{v_i\}_{i=r+1}^n$ | $\{v_i\}_{i=1}^r$     | $\{u_i\}_{i=r+1}^m$   |

The fundamental theorem of linear algebra:

  1. The kernel is the orthogonal complement of the coimage: $\mathrm{ker}(A)^\perp = \mathrm{im}(A^T)$
  2. The cokernel is the orthogonal complement of the image: $\mathrm{ker}(A^T) = \mathrm{im}(A)^\perp$
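
Both the table and the theorem can be checked numerically from the full SVD; a sketch on an arbitrary rank-deficient matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 4, 3, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank r (almost surely)

U, s, Vt = np.linalg.svd(A)    # full SVD: A = U @ Sigma @ V^T
V = Vt.T

image    = U[:, :r]    # column space basis
cokernel = U[:, r:]    # left null space basis
coimage  = V[:, :r]    # row space basis
kernel   = V[:, r:]    # null space basis

assert np.allclose(A @ kernel, 0)            # A v = 0 on the kernel
assert np.allclose(cokernel.T @ A, 0)        # u^T A = 0 on the cokernel
assert np.allclose(coimage.T @ kernel, 0)    # ker(A) = im(A^T)^perp
assert np.allclose(image.T @ cokernel, 0)    # im(A) = ker(A^T)^perp
```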

Matrix Decomposition/Factorization

Matrix decompositions/factorizations express a matrix as the product of smaller or simpler matrices; a numpy/scipy sketch of several of these follows the list.

  • QR decomposition ~ Gram–Schmidt orthogonalization process: $X = Q R$, where $X \in M_{m,n}$, $Q$ is orthogonal/unitary, $R$ is upper-triangular; $\{ \vec{x}_1, \cdots, \vec{x}_n \} \Rightarrow \{ \vec{q}_1, \cdots, \vec{q}_n \}$;
  • LU (LUP) decomposition ~ Gaussian elimination for solving systems of linear equations: $X = L U$ where $L$ is lower-triangular and $U$ is upper-triangular, if $X \in M_n$ and $\mathrm{det}[X]_{1 \dots i} \ne 0, i = 1, \dots, n$;
    • Cholesky decomposition: $X = L L^∗$ where $L$ is lower-triangular, if $X \succeq 0$;
  • Jordan decomposition ~ eigenvalue problem: $X = P J P^{-1}$ where $X \in M_n$, $J$ is a Jordan matrix, and $P$ is invertible;
    • Eigendecomposition (spectral decomposition): $X = P \Lambda P^{-1}$ where $\Lambda$ is diagonal, if $X \in M_n$ has $n$ linearly independent eigenvectors;
      • $X = U \Lambda U^∗$ where $U$ is orthogonal/unitary, if $X$ is a normal matrix (e.g. orthogonal/unitary, (skew-)symmetric/Hermitian);
  • QZ decomposition ~ generalized eigenvalue problem;
  • Singular value decomposition (SVD) ~ (dimensionality reduction, "concept space"): $X = V \Sigma W^∗$, where $X \in M_{m,n}$, $V, W$ are orthogonal/unitary, and $\Sigma$ is non-negative diagonal;
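
A minimal numpy/scipy sketch exercising several of these factorizations on small, arbitrarily chosen matrices (`scipy.linalg.lu` supplies the LUP step):

```python
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(3)
X = rng.standard_normal((4, 4))

P, L, U = lu(X)                          # LUP: X = P @ L @ U
assert np.allclose(P @ L @ U, X)

S = X @ X.T + 4 * np.eye(4)              # an (arbitrary) positive definite matrix
C = np.linalg.cholesky(S)                # Cholesky: S = C @ C^*
assert np.allclose(C @ C.T, S)

lam, Pv = np.linalg.eig(X)               # eigendecomposition (X has distinct eigenvalues a.s.)
assert np.allclose(Pv @ np.diag(lam) @ np.linalg.inv(Pv), X)

w, Q = np.linalg.eigh(S)                 # spectral decomposition of the normal matrix S
assert np.allclose(Q @ np.diag(w) @ Q.T, S)

V, s, Wt = np.linalg.svd(X)              # SVD: X = V @ Sigma @ W^*
assert np.allclose(V @ np.diag(s) @ Wt, X)
```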

Eigenvalue, Eigenvector, and Similarity

Jordan Canonical Form

Location and perturbation of eigenvalues.

Algebraic multiplicity, geometric multiplicity.

Note: Eigenvalues <-> Invertibility ($A$ is invertible iff $0$ is not an eigenvalue); Eigenvectors <-> Diagonalizability ($A$ is diagonalizable iff it has a full set of linearly independent eigenvectors).
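
A tiny numpy example of the note above: the Jordan block $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ has eigenvalue $1$ with algebraic multiplicity $2$ but geometric multiplicity $1$, so it is invertible yet not diagonalizable:

```python
import numpy as np

J = np.array([[1., 1.],
              [0., 1.]])                          # 2x2 Jordan block for eigenvalue 1

lam = np.linalg.eigvals(J)                        # [1., 1.]: algebraic multiplicity 2
geo = 2 - np.linalg.matrix_rank(J - np.eye(2))    # dim ker(J - I): geometric multiplicity 1
assert np.allclose(lam, 1.0) and geo == 1

assert np.linalg.det(J) != 0                      # invertible: 0 is not an eigenvalue
```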

Miscellaneous Topics


🏷 Category=Algebra