Similarity Transforms

For an arbitrary square matrix $A$, there exists an invertible $M$ whose columns are eigenvectors and "generalized eigenvectors" of $A$, and a Jordan form $J$, such that $M^{-1} A M = J$. If $A$ is diagonalizable, then for any invertible $S$ whose columns form a basis of eigenvectors of $A$, we have $S^{-1} A S = \Lambda$, where $\Lambda$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal.

(Note: Similarity transforms preserve eigenvalues but not eigenvectors.)
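A minimal numerical check of the diagonalizable case, assuming numpy; the matrices $A$ and $M$ below are arbitrary illustrations, not taken from the notes:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # illustrative matrix; eigenvalues 5 and 2

evals, S = np.linalg.eig(A)           # columns of S are eigenvectors of A
Lambda = np.linalg.inv(S) @ A @ S     # S^{-1} A S
print(np.round(Lambda, 10))           # diagonal, with the eigenvalues of A

# Any similarity transform preserves the spectrum (but not the eigenvectors):
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.linalg.inv(M) @ A @ M
print(np.sort(np.linalg.eigvals(B)), np.sort(evals))   # same eigenvalues
```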

Theorem ("Symmetric Jordan form"; proof not shown): Every matrix is similar to a symmetric matrix.

Jordan Canonical Form

Proof (notes from Matrix Analysis)

Fact: If $\lambda$ is an eigenvalue of $A$ with eigenvector $\mathbf{x}$ and $f$ is a polynomial, then $f(\lambda)$ is an eigenvalue of $f(A)$ with eigenvector $\mathbf{x}$.

Lemma: If $S$ diagonalizes $A$, then it also diagonalizes $f(A)$ for every polynomial $f$.

Fact: If $A$ is invertible, the same holds for Laurent polynomials $g \in \mathbb{C}[z, z^{-1}]$; in particular, $1/\lambda$ is an eigenvalue of $A^{-1}$ with eigenvector $\mathbf{x}$.
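A quick numpy sketch of these facts, using an illustrative $A$, the polynomial $f(z) = z^2 - 3z + 2$, and $A^{-1}$ (all choices here are hypothetical examples):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
evals, S = np.linalg.eig(A)
lam, x = evals[0], S[:, 0]            # A x = lam x

fA = A @ A - 3 * A + 2 * np.eye(2)    # f(A) for f(z) = z^2 - 3 z + 2
print(np.allclose(fA @ x, (lam**2 - 3 * lam + 2) * x))   # True: f(A) x = f(lam) x

Ainv = np.linalg.inv(A)
print(np.allclose(Ainv @ x, (1.0 / lam) * x))            # True: A^{-1} x = (1/lam) x
```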

A family $\mathcal{F}$ of matrices is a commuting family if $A B = B A$, $\forall A, B \in \mathcal{F}$.

A family $\mathcal{F}$ of matrices is simultaneously diagonalizable if $\exists S$ s.t. $S^{-1} A S$ is diagonal, $\forall A \in \mathcal{F}$.

A subspace $\mathcal{W}$ is $A$-invariant if $A \mathbf{w} \in \mathcal{W}$, $\forall \mathbf{w} \in \mathcal{W}$.

A subspace $\mathcal{W}$ is $\mathcal{F}$-invariant if $\mathcal{W}$ is $A$-invariant, $\forall A \in \mathcal{F}$.

Lemma: If $\mathcal{F}$ is a commuting family, then $\exists \mathbf{x} \in \mathbb{C}^{n}$, $\mathbf{x} \ne \mathbf{0}$, that is a common eigenvector of every $A \in \mathcal{F}$.

Theorem: Let $\mathcal{F}$ be a family of diagonalizable matrices. Then $\mathcal{F}$ is a commuting family iff it is simultaneously diagonalizable. (The ground field matters, e.g. diagonalizable over $\mathbb{R}$ vs. over $\mathbb{C}$.)
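A small numpy illustration of the theorem, using a hypothetical commuting pair in which $B$ is a polynomial in $A$ (so the two certainly commute):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
B = A @ A + np.eye(2)                 # commutes with A by construction
print(np.allclose(A @ B, B @ A))      # True: {A, B} is a commuting family

_, S = np.linalg.eig(A)               # eigenvector matrix of A
Sinv = np.linalg.inv(S)
DA, DB = Sinv @ A @ S, Sinv @ B @ S
print(np.allclose(DA, np.diag(np.diag(DA))),   # both are (numerically) diagonal
      np.allclose(DB, np.diag(np.diag(DB))))
```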

Property: If $A, B$ commute and have eigenvalues $(\lambda_1, \dots, \lambda_n)$ and $(\mu_1, \dots, \mu_n)$, including multiplicities, then $A+B$ has eigenvalues $\lambda_1 + \mu_{\omega(1)}, \dots, \lambda_n + \mu_{\omega(n)}$ for some permutation $\omega \in S_n$.

Theorem (McCoy; without proof): Let $A, B \in M_n$ with eigenvalues $\{\lambda_1, \dots, \lambda_n\}$ and $\{\mu_1, \dots, \mu_n\}$ respectively, including multiplicities. Then there exists $S \in GL(n, \mathbb{C})$ s.t. both $S^{-1} A S$ and $S^{-1} B S$ are upper triangular iff $\sigma(f(A,B)) = \{f(\lambda_i, \mu_{\omega(i)}) \mid i=1, \dots, n\}$ for some $\omega \in S_n$ and all polynomials $f$ in two variables.
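The eigenvalue pairing in the last two statements can be checked numerically for particular polynomials such as $f(x,y) = x + y$ and $f(x,y) = xy$; a numpy sketch with the same hypothetical commuting pair as above, where the pairing is explicit because $B$ is a polynomial in $A$:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
B = A @ A + np.eye(2)                 # commutes with A; paired eigenvalues mu_i = lam_i^2 + 1

lam = np.linalg.eigvals(A)
mu = lam**2 + 1                       # eigenvalues of B in the pairing induced by A

print(np.allclose(np.sort(np.linalg.eigvals(A + B)), np.sort(lam + mu)))   # f(x, y) = x + y
print(np.allclose(np.sort(np.linalg.eigvals(A @ B)), np.sort(lam * mu)))   # f(x, y) = x * y
```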

Theorem: Suppose $A$ has distinct eigenvalues $\lambda_1, \dots, \lambda_k$ occurring $n_1, \dots, n_k$ times. Then $A$ is similar to $\mathrm{diag}(T_1, T_2, \dots, T_k)$, where $T_i$ is an $n_i \times n_i$ upper triangular matrix whose diagonal entries all equal $\lambda_i$. If $A$ is real and all its eigenvalues are real, this can be done with a real similarity matrix.
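This refines the Schur decomposition (triangularization by a unitary similarity). A scipy sketch of the plain triangular form, without the grouping of equal eigenvalues into the blocks $T_i$; the matrix is an arbitrary illustration:

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[3.0, 1.0, 1.0],
              [0.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])       # illustrative matrix with eigenvalues 3, 3, 1
T, Z = schur(A, output='complex')     # A = Z T Z^H with T upper triangular
print(np.allclose(Z @ T @ Z.conj().T, A))   # True
print(np.allclose(np.tril(T, -1), 0))       # strictly lower triangular part is zero
print(np.round(np.diag(T), 6))              # eigenvalues of A on the diagonal
```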

A matrix $N$ is nilpotent if $N^k = 0$ for some positive integer $k$. (If $N \in M_n$ is nilpotent, then $N^n = 0$.)

Theorem: Let $A \in M_n$ be strictly upper triangular. Then there exist $S \in GL(n, \mathbb{C})$ and $n_1 \ge \dots \ge n_m > 0$ with $\sum_i n_i = n$ s.t. $S^{-1} A S = J_{n_1}(0) \oplus \dots \oplus J_{n_m}(0)$. Moreover, if $A$ is real, then $S$ can be chosen in $GL(n, \mathbb{R})$.

(Proof by induction.)
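A numpy sketch for a hypothetical strictly upper triangular $N \in M_4$, checking $N^4 = 0$ and reading off the Jordan block sizes at $0$ from ranks of powers:

```python
import numpy as np

N = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])          # strictly upper triangular, n = 4
print(np.linalg.matrix_power(N, 4))   # the zero matrix, so N^n = 0

# rank(N^m) for m = 0..4 is 4, 2, 1, 0, 0 here, which corresponds to J_3(0) + J_1(0)
for m in range(5):
    print(m, np.linalg.matrix_rank(np.linalg.matrix_power(N, m)))
```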

Theorem (Jordan Form): Let $A \in M_n$. Then there exists $S \in GL(n, \mathbb{C})$ s.t. $S^{-1} A S = J_{n_1}(\lambda_1) \oplus \dots \oplus J_{n_k}(\lambda_k)$, where the $\lambda_i$ are (not necessarily distinct) eigenvalues of $A$. If $A$ is real and all its eigenvalues are real, then $S$ can be chosen real.

(Proof: Existence follows from the previous two theorems: triangularize $A$ into blocks $T_i$ with constant diagonal $\lambda_i$, then apply the nilpotent case to each $T_i - \lambda_i I$. For uniqueness, note that the block sizes for each eigenvalue $\lambda$ are determined by the ranks $\mathrm{rank}((J - \lambda I)^m)$, $m = 1, 2, \dots$, which are similarity invariants.)
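A symbolic sketch of both points, assuming sympy; the matrix is an illustrative example with eigenvalues $1, 2, 4, 4$ and a single $2 \times 2$ block at $4$:

```python
from sympy import Matrix, eye

A = Matrix([[ 5,  4,  2,  1],
            [ 0,  1, -1, -1],
            [-1, -1,  3,  0],
            [ 1,  1, -1,  2]])        # illustrative matrix; eigenvalues 1, 2, 4, 4

P, J = A.jordan_form()                # A = P * J * P**-1
print(J)                              # diag(1, 2, J_2(4)) up to block ordering

# Uniqueness: the number of Jordan blocks for lambda of size >= m equals
# rank((A - lambda I)^(m-1)) - rank((A - lambda I)^m).
lam = 4
for m in (1, 2, 3):
    blocks = ((A - lam * eye(4)) ** (m - 1)).rank() - ((A - lam * eye(4)) ** m).rank()
    print(m, blocks)                  # 1, 1, 0: a single block of size 2 at lambda = 4
```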

Eigendecomposition

...


🏷 Category=Algebra Category=Matrix Analysis