Notes on linear algebra, following [@Horn & Johnson, 1990. Matrix Analysis.]
Scanned course notes: Notes on linear algebra; Notes on advanced algebra; Notes on vector/linear space.
A vector space (sometimes called a linear space) $(V, (+, \cdot_{\mathbb{F}}))$ is a set $V$ closed under linear combinations (vector addition $+$ and scalar multiplication $\cdot$) over a (scalar) field $\mathbb{F}$, e.g. $\mathbb{R}^n$ and $C^0[0,1]$, both with underlying field $\mathbb{R}$. A scalar field (域) $\mathbb{F}$ is a set of numbers that admits addition and multiplication, both with inverses, e.g. $\mathbb{R}$, $\mathbb{C}$.
A subspace $U$ is a subset of $V$ that is itself a vector space under the same operations. The linear span (线性生成空间) $\mathrm{span}(S)$ of a set of vectors $S$ is the set of all vectors that can be written as (finite) linear combinations of vectors in $S$.
A set of vectors is linearly independent if every nontrivial linear combination of the set (i.e. with at least one nonzero coefficient) is nonzero.
A basis $\mathcal{B}$ is a linearly independent set such that $\mathrm{span}(\mathcal{B}) = V$. The dimension $\mathrm{dim}(V)$ of a vector space is the size of any basis of the space (all bases have the same size). A basis may be finite, countably infinite, or uncountably infinite; so may be the dimension of the vector space.
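As a numerical check (a minimal numpy sketch; the vectors are illustrative), a finite set of vectors in $\mathbb{R}^n$ is linearly independent iff the matrix with those vectors as columns has full column rank, and it is a basis iff in addition there are exactly $n$ of them:

```python
import numpy as np

def is_independent(vectors):
    # Independent iff the matrix with the vectors as columns has full column rank.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

def is_basis(vectors, n):
    # A basis of R^n is a linearly independent set of exactly n vectors.
    return len(vectors) == n and is_independent(vectors)

v1, v2, v3 = np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([1., 1., 0.])
print(is_independent([v1, v2]))                       # True
print(is_independent([v1, v2, v3]))                   # False: v3 = v1 + v2
print(is_basis([v1, v2, np.array([0., 0., 1.])], 3))  # True
```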
Linear algebra deals with finite-dimensional vector spaces.
Table: A set is a basis iff it is both linearly independent and spans the space.

Number of elements | $1, \cdots, \mathrm{dim}(V)$ | $\mathrm{dim}(V)$ | $\mathrm{dim}(V), \cdots$ |
---|---|---|---|
Type of set | linearly independent | basis | spans $V$ |
Vector spaces are isomorphic (同构) if they can be related by an invertible linear map. For finite-dimensional vector spaces $U, V$ over the same field, $U \cong V \iff \dim U = \dim V$. For any basis $\mathcal{B}$ of an $n$-dimensional vector space $V$ over field $\mathbb{F}$, the mapping from a vector to its coordinates, $f: v \mapsto [v]_{\mathcal{B}}$, is an isomorphism between $V$ and $\mathbb{F}^n$. This is analogous to how a random variable maps abstract outcomes to numbers. It justifies the re-abstraction (second-order abstraction) from a vector space $V$ to $\mathbb{F}^n$, especially $\mathbb{R}^n$ and $\mathbb{C}^n$.
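A minimal numpy sketch of the coordinate isomorphism $f: v \mapsto [v]_{\mathcal{B}}$ (the basis is an arbitrary illustrative choice): with the basis vectors as columns of $B$, coordinates solve $B c = v$.

```python
import numpy as np

# Basis of R^2 as columns of B (an arbitrary illustrative basis).
B = np.array([[1., 1.],
              [0., 1.]])
v = np.array([3., 2.])

c = np.linalg.solve(B, v)     # coordinates [v]_B
print(c)                      # [1. 2.]
print(np.allclose(B @ c, v))  # True: v is recovered from its coordinates
```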
Euclidean space $(\mathbb{R}^n, \langle\cdot,\cdot\rangle)$ is the set $\mathbb{R}^n$ of all $n$-tuples of real numbers $(x_i)_{i=1}^n, x_i \in \mathbb{R}$ with inner product $x \cdot y = \sum_{i=1}^n x_i y_i$. Euclidean spaces generalize the two- and three-dimensional spaces of Euclidean geometry to arbitrary finite dimension. Euclidean space abstracts a geometric object into a topological and algebraic structure. Many concepts in algebra and analysis are developed by analogy with geometry, and Euclidean space is among the first such abstractions.
Length $\|x\| = \sqrt{x \cdot x}$. Angle $\theta = \arccos \frac{x \cdot y}{\|x\| \|y\|}$. Vectors $x, y$ are orthogonal iff $x \cdot y = 0$.
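In numpy these are one-liners (the vectors are illustrative):

```python
import numpy as np

x, y = np.array([3., 4.]), np.array([-4., 3.])

length = np.linalg.norm(x)          # sqrt(x . x) = 5.0
cosine = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
angle = np.arccos(np.clip(cosine, -1.0, 1.0))  # pi/2 here
print(length, angle, np.isclose(x @ y, 0.0))   # 5.0 1.5707... True (orthogonal)
```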
Orthonormal basis. Gram-Schmidt orthogonalization/orthonormalization, QR decomposition. Orthogonal transformation: rotation, reflection; improper rotation.
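A minimal sketch of classical Gram-Schmidt orthonormalization (the input matrix is an arbitrary full-rank example); setting $R = Q^T A$ then yields a QR decomposition $A = QR$:

```python
import numpy as np

def gram_schmidt(A):
    # Classical Gram-Schmidt: orthonormalize the columns of A.
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):                     # subtract projections onto earlier q_i
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

A = np.array([[1., 1., 0.],
              [1., 0., 1.],
              [0., 1., 1.]])
Q = gram_schmidt(A)
R = Q.T @ A                             # upper triangular: q_i is orthogonal to a_j for i > j
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: orthonormal columns
print(np.allclose(Q @ R, A))            # True: A = QR
```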
Space decomposition. Orthogonal subspace. Direct sum. Orthogonal complement.
Isometry (等距同构), or congruent transformation, is a distance-preserving bijection between two metric spaces: given metric spaces $(X, d_X)$ and $(Y, d_Y)$, a bijection $f: X \to Y$ is an isometry iff $d_X(x_1, x_2) = d_Y(f(x_1), f(x_2)), \forall x_1, x_2 \in X$. Two metric spaces are isometric iff there exists an isometry between them.
Theorem: Any finite-dimensional inner product space over the real numbers is isometric to the Euclidean space of the same dimension: $V \cong \mathbb{R}^n$, where $V$ is a vector space over $\mathbb{R}$ with $\mathrm{dim}(V) = n$ and inner product $\langle \cdot, \cdot \rangle$.
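A numerical illustration of the theorem (a sketch; the inner product $\langle x, y \rangle = x^T M y$ with positive definite $M$ is an illustrative choice): the coordinate map with respect to a basis orthonormal under $\langle\cdot,\cdot\rangle$ is an isometry onto $\mathbb{R}^n$, obtained here from the Cholesky factor of $M$.

```python
import numpy as np

# Inner product <x, y> = x^T M y on R^2, with M positive definite (illustrative).
M = np.array([[2., 1.],
              [1., 2.]])
L = np.linalg.cholesky(M)            # M = L L^T

def f(x):
    # Coordinates w.r.t. a <.,.>-orthonormal basis: f(x) . f(y) = x^T L L^T y = <x, y>.
    return L.T @ x

rng = np.random.default_rng(0)
x, y = rng.standard_normal(2), rng.standard_normal(2)
d_V = np.sqrt((x - y) @ M @ (x - y))    # distance in the inner product space
d_E = np.linalg.norm(f(x) - f(y))       # Euclidean distance of the images
print(np.isclose(d_V, d_E))             # True: f is an isometry
```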
A linear operator is a map between two vector spaces over the same field that is compatible with their linear structures (vector addition, scalar multiplication). The study of linear operators on finite-dimensional spaces is called matrix theory. A linear functional is a linear operator that maps a vector space to its scalar field. Examples of linear operators include systems of linear equations, coordinate transformations, differentiation, (indefinite) integration, and the Fourier integral transform.
An important question in linear operator theory is classifying endomorphisms (linear transformations of a space into itself) with respect to some equivalence relation, e.g. similarity, topological equivalence, unitary equivalence.
A matrix is a rectangular array of scalars in $\mathbb{F}$, which can also be seen as the coordinate representation of a linear operator between two vector spaces over $\mathbb{F}$, given a basis for each. In the second perspective, the class $M_{m,n}(\mathbb{F})$ of matrices consists of all linear operators $A: \mathbb{F}^n \to \mathbb{F}^m$. Matrix addition is addition of linear operators. Matrix multiplication is composition of linear operators, and is therefore associative. Scalar matrices are scalar multiples of the identity matrix.
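A quick numpy check that matrix multiplication is composition (shapes and entries are arbitrary): applying $B$ then $A$ to a vector agrees with applying the single operator $AB$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))   # operator R^3 -> R^2
B = rng.standard_normal((3, 4))   # operator R^4 -> R^3
x = rng.standard_normal(4)

# Applying B then A equals applying the product AB: R^4 -> R^2.
print(np.allclose(A @ (B @ x), (A @ B) @ x))  # True
```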
For a linear operator $A \in M_{m,n}(\mathbb{R})$ with rank $r$ and singular value decomposition $A = U \Sigma V^T$, its four fundamental subspaces are summarized in the following table.
Table: Four fundamental subspaces of a linear operator
Subspace | Image | Kernel | Coimage | Cokernel |
---|---|---|---|---|
Alternative name | Column space | Null space | Row space | Left null space |
Notation | $\mathrm{im}(A)$ | $\mathrm{ker}(A)$ | $\mathrm{im}(A^T)$ | $\mathrm{ker}(A^T)$ |
Alt. notation | $\mathrm{range}(A)$ | $\mathrm{null}(A)$ | $\mathrm{range}(A^T)$ | $\mathrm{null}(A^T)$ |
Containing space | $\mathbb{R}^m$ | $\mathbb{R}^n$ | $\mathbb{R}^n$ | $\mathbb{R}^m$ |
Dimension | $r$ | $n-r$ | $r$ | $m-r$ |
Basis | $\{u_i\}_{i=1}^r$ | $\{v_i\}_{i=r+1}^n$ | $\{v_i\}_{i=1}^r$ | $\{u_i\}_{i=r+1}^m$ |
The fundamental theorem of linear algebra (not the fundamental theorem of algebra): the coimage and kernel are orthogonal complements in $\mathbb{R}^n$, and the image and cokernel are orthogonal complements in $\mathbb{R}^m$: $\mathrm{im}(A^T) \oplus \mathrm{ker}(A) = \mathbb{R}^n$ and $\mathrm{im}(A) \oplus \mathrm{ker}(A^T) = \mathbb{R}^m$; in particular, $r + \dim \mathrm{ker}(A) = n$ (rank-nullity).
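A numpy sketch reading the four subspaces off the SVD and checking the orthogonal decompositions (the rank-one matrix is illustrative):

```python
import numpy as np

A = np.array([[1., 2.],
              [2., 4.],
              [3., 6.]])           # m = 3, n = 2, rank r = 1
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))

col_space  = U[:, :r]    # basis of im(A),    in R^m
left_null  = U[:, r:]    # basis of ker(A^T), in R^m
row_space  = Vt[:r].T    # basis of im(A^T),  in R^n
null_space = Vt[r:].T    # basis of ker(A),   in R^n

print(np.allclose(A @ null_space, 0))            # True
print(np.allclose(A.T @ left_null, 0))           # True
print(np.allclose(col_space.T @ left_null, 0))   # True: im(A) orthogonal to ker(A^T)
print(np.allclose(row_space.T @ null_space, 0))  # True: im(A^T) orthogonal to ker(A)
```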
Matrix decompositions/factorizations express a matrix as the product of smaller or simpler matrices.
Two square matrices $A, B \in M_n$ are similar if they are matrix representations of the same linear operator with respect to different bases: $\exists S \in GL(n): A = S B S^{-1}$. The transformation $B \mapsto S B S^{-1}$ is called a similarity transformation with similarity matrix $S$.
An eigenvector of a square matrix is a nonzero vector that remains in its own span under the linear transformation: $A x = \lambda x$. The eigenvalue $\lambda$ corresponding to an eigenvector is the factor by which the transformation scales that eigenvector. The eigenspace corresponding to an eigenvalue is the subspace on which the transformation acts as scaling by the eigenvalue: $\{x \mid A x = \lambda x\}$.
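A numpy sketch (matrices illustrative): verify $Ax = \lambda x$ for a computed eigenpair, and that similar matrices have the same eigenvalues.

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
eigvals, eigvecs = np.linalg.eig(A)
x, lam = eigvecs[:, 0], eigvals[0]
print(np.allclose(A @ x, lam * x))   # True: A x = lambda x

S = np.array([[1., 1.],
              [0., 1.]])             # an invertible similarity matrix (illustrative)
B = np.linalg.inv(S) @ A @ S         # B is similar to A
print(np.allclose(np.sort(np.linalg.eig(B)[0]), np.sort(eigvals)))  # True: same spectrum
```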
Jordan Canonical Form: Any square matrix is similar to a (complex) Jordan matrix, unique up to permutation of Jordan blocks: $A = P J P^{-1}$. In this form, the columns of $P$ at the start of each Jordan block, $\{ p_{1 + \sum_{j=1}^{i} n_j} \}_{i = 0}^{k-1}$, are eigenvectors corresponding to eigenvalues $\{\lambda_i\}_{i = 1}^k$ respectively, and together form a basis for the direct sum of all the eigenspaces of $A$. A family of similar matrices, i.e. an equivalence class under similarity, has the same Jordan canonical form.
A Jordan block $J_k(\lambda_i)$ is a $k \times k$ matrix with $\lambda_i$ on the diagonal and $1$ on the superdiagonal: $J_k(\lambda_i) = \lambda_i I_k + U_k$, where $U_k$ is the upper (backward) shift matrix $U_k = [0, I_{k-1}; 0, 0]$; in particular, $J_1(\lambda_i) = [\lambda_i]$. A Jordan matrix is a direct sum of Jordan blocks, i.e. a block diagonal matrix consisting of Jordan blocks: $J = \mathrm{diag}\{ J_{n_i}(\lambda_i) \}_{i=1}^k$, $\sum_{i=1}^k n_i = n$.
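sympy computes Jordan forms exactly; a sketch on an illustrative defective matrix (one eigenvalue, one eigenvector):

```python
import sympy as sp

# Defective matrix: characteristic polynomial (t - 2)^2 but only one eigenvector.
A = sp.Matrix([[3, 1],
               [-1, 1]])
P, J = A.jordan_form()       # returns P, J with A = P J P^{-1}
print(J)                     # Matrix([[2, 1], [0, 2]]): a single Jordan block J_2(2)
print(P * J * P.inv() == A)  # True (exact rational arithmetic)
```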
The (algebraic) multiplicity of eigenvalue $\lambda_i$ is its multiplicity as a zero of the characteristic polynomial $p_A(t)$, i.e. the number of diagonal entries of the Jordan matrix equal to $\lambda_i$. The geometric multiplicity of eigenvalue $\lambda_i$ is the maximum number of linearly independent eigenvectors associated with it, i.e. the number of Jordan blocks in the Jordan matrix corresponding to $\lambda_i$.
A square matrix is diagonalizable if it is similar to a diagonal matrix (all Jordan blocks have size one). A square matrix is invertible iff its eigenvalues do not include zero. A square matrix is diagonalizable iff its eigenvectors contain a basis, i.e. every eigenvalue's geometric multiplicity equals its algebraic multiplicity.
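A sympy check of the two multiplicities on illustrative matrices: a defective Jordan block has algebraic multiplicity 2 but geometric multiplicity 1, hence is not diagonalizable; a symmetric example is diagonalizable.

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [0, 2]])    # single Jordan block J_2(2)
# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenvector basis).
for lam, alg_mult, vecs in A.eigenvects():
    print(lam, alg_mult, len(vecs))   # 2 2 1: geometric < algebraic, defective
print(A.is_diagonalizable())          # False

B = sp.Matrix([[2, 1],
               [1, 2]])
print(B.is_diagonalizable())          # True: eigenvalues 1 and 3, full eigenbasis
```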
Location and perturbation of eigenvalues.