A unitary matrix (酉矩阵) is a square matrix whose inverse is its Hermitian adjoint (共轭转置): $U \in M_n$, $U^H U = I_n$. An orthogonal matrix (正交矩阵) is a square matrix whose inverse is its transpose: $Q \in M_n$, $Q^T Q = I_n$. Real orthogonal matrices are unitary; complex orthogonal matrices need not be (a complex orthogonal matrix is unitary if and only if it is real). Every unitary matrix can be represented by a real orthogonal matrix and a real symmetric matrix: $\forall U \in U(n)$, $\exists Q \in O(n), S \in S(n)$: $U = Q e^{i S}$. Every complex orthogonal matrix can be represented by a real orthogonal matrix and a real skew-symmetric matrix: $\forall P \in M_n$ with $P^{-1} = P^T$, $\exists Q \in O(n)$, $\Omega \in \Omega(n)$: $P = Q e^{i \Omega}$.
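
A minimal NumPy/SciPy sketch of the easy direction of the first representation, assuming only a random real orthogonal $Q$ and a random real symmetric $S$ (recovering such $Q$ and $S$ from a given unitary $U$ is the nontrivial half of the theorem):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 4

Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random real orthogonal matrix
B = rng.standard_normal((n, n))
S = (B + B.T) / 2                                  # random real symmetric matrix

# iS is skew-Hermitian, so e^{iS} is unitary; a product of unitary matrices is unitary.
U = Q @ expm(1j * S)
assert np.allclose(U.conj().T @ U, np.eye(n))      # U^H U = I
```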

A set of vectors $\{x_1, \dots, x_n\}$ is an orthogonal set if $\forall i \ne j, x_i^H x_j = 0$. It is orthonormal (标准正交) if, in addition, $\forall i, \|x_i\| = 1$. An orthogonal set of nonzero vectors is linearly independent.
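
A small NumPy illustration with a hand-picked orthogonal set in $\mathbb{R}^3$ (the specific vectors are arbitrary):

```python
import numpy as np

x1 = np.array([1.0, 1.0, 0.0])
x2 = np.array([1.0, -1.0, 0.0])
x3 = np.array([0.0, 0.0, 2.0])
X = np.column_stack([x1, x2, x3])

G = X.conj().T @ X
assert np.allclose(G, np.diag(np.diag(G)))       # pairwise orthogonal: Gram matrix is diagonal
assert np.linalg.matrix_rank(X) == 3             # hence linearly independent

Xn = X / np.linalg.norm(X, axis=0)               # normalize each vector
assert np.allclose(Xn.conj().T @ Xn, np.eye(3))  # now an orthonormal set
```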

Theorem: TFAE:

  1. $U$ is unitary;
  2. $U$ is nonsingular and $U^{-1} = U^H$;
  3. $U U^H = I$;
  4. $U^H$ is unitary;
  5. Columns of $U$ are orthonormal;
  6. Rows of $U$ are orthonormal;
  7. $U$ is an isometry: $(U x)^H (U x) = x^H x$ for all $x$.
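
The equivalences can be spot-checked numerically; the sketch below (assuming NumPy, with a sample unitary matrix taken as the Q factor of a random complex matrix) verifies each item for one $U$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
I = np.eye(n)

# A sample unitary matrix: the Q factor of a random complex matrix.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(A)

assert np.allclose(U.conj().T @ U, I)                 # 1. U is unitary
assert np.allclose(np.linalg.inv(U), U.conj().T)      # 2. U^{-1} = U^H
assert np.allclose(U @ U.conj().T, I)                 # 3. U U^H = I
V = U.conj().T                                        # 4. U^H is unitary
assert np.allclose(V.conj().T @ V, I)
# 5./6. items 1 and 3 say precisely that the columns (resp. rows) of U are orthonormal
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))   # 7. isometry, on a sample vector
```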

Schur's Lemma (unitary triangularization): Any square matrix can be unitarily triangularized with its eigenvalues on the diagonal: $\forall A \in M_n$, $\exists U \in U(n)$, $T \in T(n)$ (upper triangular) with $\text{diag}(T) = \lambda(A)$: $U^H A U = T$. Any real square matrix with real eigenvalues can be orthogonally triangularized, again with its eigenvalues on the diagonal.
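
A sketch using SciPy's `schur` routine, which computes exactly this decomposition (the random test matrix is arbitrary):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Complex Schur decomposition: A = U T U^H with T upper triangular.
T, U = schur(A, output='complex')
assert np.allclose(U.conj().T @ U, np.eye(4))   # U is unitary
assert np.allclose(T, np.triu(T))               # T is upper triangular
assert np.allclose(U @ T @ U.conj().T, A)       # A = U T U^H

# The diagonal of T carries the eigenvalues of A (sorted here only for comparison).
assert np.allclose(np.sort_complex(np.diag(T)), np.sort_complex(np.linalg.eigvals(A)))
```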

Theorem: Any commuting family of square matrices can be simultaneously unitarily triangularized: if $\mathcal{F} \subset M_n$ and $A B = B A$ for all $A, B \in \mathcal{F}$, then $\exists U \in U(n)$: $U^H A U \in T(n)$ (upper triangular) for all $A \in \mathcal{F}$. Any commuting family of real square matrices can be simultaneously orthogonally block triangularized, with diagonal blocks 1-by-1 or 2-by-2; if all eigenvalues are real, the result can be taken fully triangular.
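
The theorem covers arbitrary commuting families; the sketch below only checks the easily constructed special case of the pair $\{A, p(A)\}$ for a polynomial $p$, where the unitary matrix from the Schur decomposition of $A$ triangularizes both:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
B = A @ A + 2 * A + np.eye(4)        # a polynomial in A, hence A B = B A
assert np.allclose(A @ B, B @ A)

# One unitary U triangularizes both: U^H A U and U^H B U are upper triangular.
T_A, U = schur(A, output='complex')
T_B = U.conj().T @ B @ U
assert np.allclose(T_A, np.triu(T_A))
assert np.allclose(T_B, np.triu(T_B))
```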

Similarity Transforms

For an arbitrary square matrix $A$, there exists a nonsingular $M$ whose columns are eigenvectors and "generalized eigenvectors" of $A$, and a Jordan form $J$, such that $M^{-1} A M = J$. If $A$ is diagonalizable, then for any nonsingular $S$ whose columns form a basis of eigenvectors of $A$, we have $S^{-1} A S = \Lambda$, where $\Lambda$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal.

(Note: Similarity transforms preserve eigenvalues but not eigenvectors.)
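
A NumPy sketch of the diagonalizable case (a random matrix is diagonalizable with probability 1; `eig` returns the eigenvalues and an eigenvector matrix $S$):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))      # diagonalizable with probability 1

eigvals, S = np.linalg.eig(A)        # columns of S are eigenvectors of A
Lambda = np.diag(eigvals)
assert np.allclose(np.linalg.solve(S, A @ S), Lambda)   # S^{-1} A S = Lambda
```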

Theorem ("Symmetric Jordan form"; proof not shown): Every matrix is similar to a symmetric matrix.

Normal Matrix

A normal matrix (正规矩阵) is a square matrix that commutes with its Hermitian adjoint: $A \in M_n$, $A A^H = A^H A$; or equivalently (see the spectral theorem below), a square matrix that is unitarily diagonalizable: $\exists U \in U(n)$, $\lambda \in \mathbb{C}^n$: $A = U \Lambda U^H$ with $\Lambda = \text{diag}(\lambda)$. Comparing with Schur's Lemma, normal matrices are exactly those square matrices that can be unitarily diagonalized, rather than merely triangularized. A real normal matrix is real orthogonally block diagonalizable, where the diagonal blocks are 1-by-1 blocks holding real eigenvalues or 2-by-2 real blocks of the form $\begin{bmatrix} a & b \\ -b & a \end{bmatrix}$, corresponding to complex conjugate pairs of eigenvalues $a \pm i b$.
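
A sketch of the real case, assuming SciPy: build a real normal, non-symmetric matrix with eigenvalues $1 \pm 2i$ and $3$, then inspect its real Schur form, which for a normal matrix is block diagonal:

```python
import numpy as np
from scipy.linalg import schur, block_diag

a, b = 1.0, 2.0
D = block_diag(np.array([[a, b], [-b, a]]), 3.0)   # eigenvalues 1 ± 2i and 3
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random real orthogonal matrix
A = Q @ D @ Q.T

assert np.allclose(A @ A.T, A.T @ A)               # A is normal
assert not np.allclose(A, A.T)                     # ... but not symmetric

# Real Schur form: Z^T A Z = T is upper quasi-triangular; since A is normal,
# T is block diagonal: a 2x2 block of the form [[a, b], [-b, a]] plus the scalar 3
# (up to the ordering of the blocks and the sign of b).
T, Z = schur(A, output='real')
print(np.round(T, 6))
```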

Spectral theorem for normal matrices: Let $A \in M_n$; TFAE: (1) $A$ is normal; (2) $A$ is unitarily diagonalizable; (3) there exists an orthonormal set of $n$ eigenvectors of $A$; (4) $\|A\|_F = \|\lambda(A)\|_2$, that is, $\sum_{i,j=1}^n |a_{ij}|^2 = \sum_{i=1}^n |\lambda_i|^2$.
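
A numerical spot-check of items (1)-(4) for a cyclic permutation matrix, which is normal but not Hermitian:

```python
import numpy as np
from scipy.linalg import schur

C = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])                           # cyclic permutation matrix

assert np.allclose(C @ C.conj().T, C.conj().T @ C)     # (1) C is normal

T, U = schur(C, output='complex')
assert np.allclose(U.conj().T @ U, np.eye(3))          # (2) U is unitary...
assert np.allclose(T, np.diag(np.diag(T)))             #     ...and T is diagonal
# (3) the columns of U are therefore an orthonormal set of eigenvectors of C

lam = np.diag(T)                                       # eigenvalues (cube roots of unity)
assert np.isclose(np.linalg.norm(C, 'fro'),
                  np.linalg.norm(lam))                 # (4) ||C||_F = ||lambda(C)||
```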

Special cases of normal matrices as identified by their eigenvalues: (1) Hermitian, $\lambda \subset \mathbb{R}$; (2) skew-Hermitian, $\lambda \subset i \mathbb{R}$; (3) unitary, $\lambda \subset e^{i \mathbb{R}}$ (the unit circle). All normal matrices have orthonormal bases of eigenvectors; for the real versions of these special cases (real symmetric, real skew-symmetric, and real orthogonal), the unitary similarity can be replaced by a real orthogonal one: a real symmetric matrix has a real orthonormal basis of eigenvectors, while real skew-symmetric and real orthogonal matrices admit the real orthogonal block diagonalization described above.
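
A NumPy spot-check of the three eigenvalue constraints, using a Hermitian, a skew-Hermitian, and a unitary matrix built from one random complex matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

H = (B + B.conj().T) / 2          # Hermitian part
K = (B - B.conj().T) / 2          # skew-Hermitian part
U, _ = np.linalg.qr(B)            # unitary Q factor

assert np.allclose(np.linalg.eigvals(H).imag, 0)      # real spectrum
assert np.allclose(np.linalg.eigvals(K).real, 0)      # purely imaginary spectrum
assert np.allclose(np.abs(np.linalg.eigvals(U)), 1)   # spectrum on the unit circle
```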


🏷 Category=Algebra Category=Matrix Theory