Two vectors are orthogonal if their inner product is zero: $x^H y = 0$. A countable set of vectors $\{x_1, \dots, x_k\}$ is orthogonal if its vectors are mutually orthogonal: $\forall i \ne j$, $x_i^H x_j = 0$. Two subspaces are orthogonal if every vector of one is orthogonal to every vector of the other: $\forall v \in \mathfrak{V}$, $\forall w \in \mathfrak{W}$, $v^H w = 0$. An orthogonal set of nonzero vectors is linearly independent, and is called an orthogonal basis (of its span). An orthonormal basis (标准正交) is an orthogonal basis of unit vectors: $x_i^H x_j = \delta_{ij}$. A matrix $V$ with orthonormal columns ($V^H V = I$) represents an isometric embedding: $(V x)^H (V y) = x^H y$.
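
A minimal numpy sketch of the isometry property; the matrix with orthonormal columns is obtained from a thin QR factorization of an arbitrary complex matrix (the construction is only illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# V: 5x3 matrix with orthonormal columns from a thin QR factorization, so V^H V = I_3.
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))
V, _ = np.linalg.qr(A)

x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# Isometric embedding: inner products are preserved, (Vx)^H (Vy) = x^H y.
print(np.allclose(np.vdot(V @ x, V @ y), np.vdot(x, y)))   # True
```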

A unitary matrix (酉矩阵) is a square matrix that satisfies any of the following equivalent conditions: (1) its Hermitian adjoint (共轭转置) is its inverse: $U \in \text{GL}_n$, $U^{-1} = U^H$; or equivalently, $U \in M_n$, $U^H U = I_n$; (2) its columns $\{u_i\}_{i=1}^n$, or equivalently its rows, form an orthonormal set; (3) it represents an isometry: $(U x)^H (U x) = x^H x$, $\forall x \in \mathbb{C}^n$. The name "unitary" refers to one: all its eigenvalues have modulus one and all its singular values equal one.
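
A quick numerical check of these characterizations, using a random unitary matrix obtained from a QR factorization (an illustrative construction, not the only one):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random unitary matrix: the Q factor of a random complex matrix.
Z = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(Z)

print(np.allclose(U.conj().T @ U, np.eye(4)))                # (1) U^H U = I
print(np.allclose(np.abs(np.linalg.eigvals(U)), 1.0))        # all eigenvalues have modulus one
print(np.allclose(np.linalg.svd(U, compute_uv=False), 1.0))  # all singular values are one
```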

Orthogonal matrix

An orthogonal matrix (正交矩阵) is a square matrix whose transpose is its inverse [Sec 2.1, Prob 8]: $Q \in M_n$, $Q^T Q = I_n$. Real orthogonal matrices are unitary; complex orthogonal matrices in general are not. Every complex orthogonal matrix is the product of a real orthogonal matrix and the exponential of $i$ times a real skew-symmetric matrix: $\forall P \in M_n$ with $P^{-1} = P^T$, $\exists Q \in O(n)$, $\Omega \in \Omega(n)$ (real skew-symmetric): $P = Q e^{i \Omega}$. In comparison, every unitary matrix is the product of a real orthogonal matrix and the exponential of $i$ times a real symmetric matrix: $\forall U \in U(n)$, $\exists Q \in O(n)$, $S \in S(n)$ (real symmetric): $U = Q e^{i S}$.
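
A sketch verifying the first representation numerically: for real orthogonal $Q$ and real skew-symmetric $\Omega$, the product $Q e^{i \Omega}$ is complex orthogonal (the random $Q$ and $\Omega$ below are only illustrative):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)

# Real orthogonal Q (from QR) and real skew-symmetric Omega.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
W = rng.standard_normal((3, 3))
Omega = W - W.T                     # Omega^T = -Omega

P = Q @ expm(1j * Omega)

# P^T P = e^{i Omega^T} Q^T Q e^{i Omega} = e^{-i Omega} e^{i Omega} = I.
print(np.allclose(P.T @ P, np.eye(3)))   # True: P is complex orthogonal
```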

The set of complex orthogonal matrices is not bounded: ... A complex orthogonal matrix need not be normal: ... The set of complex orthogonal matrices of a given order forms a group under matrix multiplication.
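
One hyperbolic family that witnesses both claims (an illustrative choice, assuming only numpy):

```python
import numpy as np

def complex_orthogonal(t):
    """Complex orthogonal 2x2 matrix whose entries grow without bound as t grows."""
    c, s = np.cosh(t), np.sinh(t)
    return np.array([[c, 1j * s], [1j * s, -c]])

for t in (1.0, 5.0, 10.0):
    Q = complex_orthogonal(t)
    print(np.allclose(Q.T @ Q, np.eye(2)),                   # Q^T Q = I: complex orthogonal
          np.linalg.norm(Q),                                  # Frobenius norm is unbounded in t
          np.allclose(Q @ Q.conj().T, Q.conj().T @ Q))        # False: Q is not normal
```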

The orthogonal group $O(n)$ is the set of order-$n$ real orthogonal matrices under matrix multiplication: $O(n) = \{Q \in M_n(\mathbb{R}) : Q Q^T = I_n\}$. It satisfies all the group axioms; in particular, it is closed under multiplication: $\forall P, Q \in O(n)$, $P Q \in O(n)$. The orthogonal group $O(n)$ has (manifold) dimension $n(n-1)/2$: for any $V \in \{(v_i)_{i=1}^{n-1} : v_i \in \mathbb{R}^{n-i}, |v_i| \le 1\}$, there are only finitely many real orthogonal matrices with $V$ as their strictly lower triangular part. This does not conflict with the real block diagonal canonical form of real orthogonal matrices, $Q = \tilde{Q} \, \text{diag}(B_{\theta_j})_{j=1}^m \, \tilde{Q}^T$, where $B_\theta = [\cos\theta, \sin\theta; -\sin\theta, \cos\theta]$ and $\theta_j \in [0, \pi]$: counting the parameters $(\tilde{Q}, \theta)$, it might seem that $\dim O(n) = \dim O(n) + \lfloor n/2 \rfloor$; however, each 2-by-2 block introduces a redundant degree of freedom in $\tilde{Q}$, because $B_\alpha B_\theta B_\alpha^T = B_\theta$.
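
A quick numeric check of the redundancy noted at the end, with arbitrary angles:

```python
import numpy as np

def B(theta):
    """The 2-by-2 rotation block B_theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

alpha, theta = 0.7, 1.3
# Plane rotations commute, so conjugating B_theta by B_alpha changes nothing.
print(np.allclose(B(alpha) @ B(theta) @ B(alpha).T, B(theta)))   # True
```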

Every real orthogonal and symmetric matrix is a reflection about a subspace: it has a unique representation $O = P_k - (I_n - P_k) = 2 P_k - I_n$, where $P_k$ is the orthogonal projection onto a $k$-dimensional subspace. The set of real orthogonal and symmetric order-$n$ matrices is in bijection with the set of all order-$n$ symmetric projection matrices: $O(n) \cap S(n) \cong \cup_{k=0}^n \mathcal{P}(k,n)$, with bijection $f^{-1}(P) = P - (I - P)$. Since the rank-$k$ component has (manifold) dimension $k (n - k)$, the maximal dimension of $O(n) \cap S(n)$ is attained at $k = \lfloor n/2 \rfloor$, and is about half the dimension of $O(n)$ and of $S(n)$. For $n = 2$, however, there are about as many symmetric orthogonal matrices as non-symmetric ones: the reflections $\{S_\theta : \theta \in [0, 2\pi)\}$, where $S_\theta = [\cos\theta, \sin\theta; \sin\theta, -\cos\theta]$, form the connected component of $O(2)$ other than $SO(2) = \{B_\theta : \theta \in [0, 2\pi)\}$, and every reflection is symmetric; $O(2) \cap S(2)$ consists of these reflections together with $\pm I \in SO(2)$.
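
A small numerical check that $S_\theta$ is the reflection $2P - I$ about the line spanned by $(\cos(\theta/2), \sin(\theta/2))$ (the choice of line is part of the illustration):

```python
import numpy as np

theta = 0.9

# Orthogonal projection P onto the line spanned by (cos(theta/2), sin(theta/2)).
v = np.array([np.cos(theta / 2), np.sin(theta / 2)])
P = np.outer(v, v)

# Reflection about that line: O = P - (I - P) = 2P - I.
O = 2 * P - np.eye(2)
S_theta = np.array([[np.cos(theta),  np.sin(theta)],
                    [np.sin(theta), -np.cos(theta)]])

print(np.allclose(O, S_theta))                                   # True
print(np.allclose(O @ O.T, np.eye(2)), np.allclose(O, O.T))      # orthogonal and symmetric
```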

Schur decomposition

Schur's Lemma (unitary triangularization): any square matrix can be unitarily triangularized with its eigenvalues on the diagonal: $\forall A \in M_n$, $\exists U \in U(n)$, $T \in T(n)$ (upper triangular) with $\text{diag}(T) = \lambda(A)$: $U^H A U = T$. Any real square matrix with real eigenvalues can be real orthogonally triangularized with its eigenvalues on the diagonal.
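
A numerical illustration using scipy's Schur decomposition (the random matrix is only an example):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Complex Schur decomposition: A = U T U^H with U unitary and T upper triangular.
T, U = schur(A, output='complex')

print(np.allclose(U @ T @ U.conj().T, A))                     # A = U T U^H
print(np.allclose(U.conj().T @ U, np.eye(4)))                 # U is unitary
print(np.allclose(np.tril(T, -1), 0))                         # T is upper triangular
print(np.allclose(np.sort_complex(np.diag(T)),
                  np.sort_complex(np.linalg.eigvals(A))))     # diag(T) = lambda(A)
```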

Theorem: any commuting family of square matrices can be simultaneously unitarily triangularized: $\mathcal{F} \subset M_n$, $\forall A, B \in \mathcal{F}$, $A B = B A$, $\exists U \in U(n)$: $\forall A \in \mathcal{F}$, $U^H A U \in T(n)$ (upper triangular). Any commuting family of real square matrices can be simultaneously real orthogonally block triangularized, with diagonal blocks 1-by-1 or 2-by-2.
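
A sketch of the simultaneous case for the special commuting pair $(A, p(A))$: a Schur basis for $A$ also triangularizes any polynomial in $A$, since $U^H p(A) U = p(T)$. A general commuting family needs the full inductive argument, but the check below illustrates the conclusion:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
B = A @ A + 2 * A + np.eye(4)       # B = p(A) commutes with A

T, U = schur(A, output='complex')   # Schur basis of A

# The same unitary U triangularizes both members of the commuting pair.
print(np.allclose(np.tril(U.conj().T @ A @ U, -1), 0))   # True
print(np.allclose(np.tril(U.conj().T @ B @ U, -1), 0))   # True
```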

Normal matrix

A normal matrix (正规矩阵) is a square matrix that satisfies any of the following equivalent conditions: let $A \in M_n$, (1) it commutes with its Hermitian adjoint: $A A^H = A^H A$; (2) it is unitarily diagonalizable: $\exists U \in U(n)$, $\lambda \in \mathbb{C}^n$: $A = U \Lambda U^H$, where $\Lambda = \text{diag}(\lambda)$; (3) there exists an orthogonal set of $n$ eigenvectors of $A$; (4) its Frobenius norm equals the Euclidean norm of its vector of eigenvalues: $\|A\|_F = \|\lambda(A)\|$. The equivalence of these conditions is often called the spectral theorem for normal matrices.
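
A numerical check of conditions (1) and (4), constructing a normal matrix from its unitary diagonalization in (2) (the random unitary and eigenvalue vector are only illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4

# Build A = U diag(lam) U^H from a random unitary U and a random eigenvalue vector lam.
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
lam = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = U @ np.diag(lam) @ U.conj().T

print(np.allclose(A @ A.conj().T, A.conj().T @ A))            # (1) A A^H = A^H A
print(np.allclose(np.linalg.norm(A, 'fro'),
                  np.linalg.norm(lam)))                        # (4) ||A||_F = ||lambda(A)||
```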

Compared with Schur's Lemma, normal matrices are exactly those square matrices that can be unitarily diagonalized, not merely triangularized. Every real normal matrix is real orthogonally block diagonalizable [Thm 2.5.8], where the diagonal blocks are either real eigenvalues (1-by-1) or 2-by-2 real blocks of the form $[a, b; -b, a]$ corresponding to complex conjugate pairs of eigenvalues $a \pm i b$.
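
A sketch of the real case: a real normal matrix built from the canonical blocks, whose real Schur form comes out block diagonal (the values $a, b, c$ and the random orthogonal change of basis are only illustrative):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(6)
a, b, c = 1.0, 2.0, -0.5

# Real normal matrix: canonical block form conjugated by a random Q in O(3).
D = np.array([[ a,   b, 0.0],
              [-b,   a, 0.0],
              [0.0, 0.0,  c]])
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ D @ Q.T

print(np.allclose(A @ A.T, A.T @ A))            # A is (real) normal
print(np.sort_complex(np.linalg.eigvals(A)))    # eigenvalues: c and the pair a +/- i b

# The real Schur form of a normal matrix is block diagonal: one 2-by-2 block and one 1-by-1 block.
T, Z = schur(A, output='real')
print(np.round(T, 10))
```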

Special classes of normal matrices, as identified by their eigenvalues: (1) Hermitian, $\lambda \subset \mathbb{R}$; (2) skew-Hermitian, $\lambda \subset i \mathbb{R}$; (3) unitary, $\lambda \subset e^{i \mathbb{R}}$ (the unit circle). The class of normal matrices is invariant under Hermitian adjoint, as are all of these special classes.
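
A quick check of the three classes (random constructions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(7)
Z = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

H = (Z + Z.conj().T) / 2            # Hermitian part of Z
K = (Z - Z.conj().T) / 2            # skew-Hermitian part of Z
U, _ = np.linalg.qr(Z)              # unitary

print(np.allclose(np.linalg.eigvals(H).imag, 0))        # Hermitian: real spectrum
print(np.allclose(np.linalg.eigvals(K).real, 0))        # skew-Hermitian: imaginary spectrum
print(np.allclose(np.abs(np.linalg.eigvals(U)), 1))     # unitary: spectrum on the unit circle
```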


🏷 Category=Algebra Category=Matrix Analysis