Summary of Matrix Analysis [@Horn1990]

Course project: Application of SVD to Signal Processing

Table: Comparison of concepts over the complex and real fields

| $\mathbb{C}$   | $\mathbb{R}$   |
| -------------- | -------------- |
| Unitary        | Orthogonal     |
| Hermitian      | Symmetric      |
| Skew-Hermitian | Skew-symmetric |

Both the Euclidean norm and the spectral norm are unitarily invariant (remarks on pp. 291–292 and p. 296). In fact, the spectral norm is the minimal unitarily invariant norm (remarks on p. 308), which makes the SVD well suited to least-squares problems (Example 7.4.1 on p. 427).
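
A quick numerical check of these remarks, not from the book: the NumPy sketch below (matrix sizes and random seed are arbitrary choices, and real orthogonal matrices stand in for the unitary case) verifies unitary invariance of both norms and solves a least-squares problem through the SVD.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Random orthogonal matrices via QR factorization.
U, _ = np.linalg.qr(rng.standard_normal((5, 5)))
V, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Unitary invariance: ||U A V|| equals ||A|| for the spectral (ord=2)
# and Frobenius (ord="fro") norms.
for ord_ in (2, "fro"):
    assert np.isclose(np.linalg.norm(U @ A @ V, ord=ord_),
                      np.linalg.norm(A, ord=ord_))

# Least squares via the SVD: the minimizer of ||Ax - b||_2 is the
# pseudoinverse solution x = V Σ⁻¹ Uᵀ b.
b = rng.standard_normal(5)
Us, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((Us.T @ b) / s)
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```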

Corollary 5.6.14 (p. 299) gives a very useful convergent sequence for computing the spectral radius: $\rho(A) = \lim_{k \to \infty} \|A^k\|^{1/k}$ for any matrix norm (Gelfand's formula). Corollary 5.7.10 (p. 322) generalizes this to pre-norms.
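
As a small illustration of the limit (my own example, not the book's; the test matrix and the exponents $k$ are arbitrary), the estimates $\|A^k\|_2^{1/k}$ can be watched converging to $\rho(A)$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# Reference value: rho(A) is the largest eigenvalue magnitude.
rho = max(abs(np.linalg.eigvals(A)))

# Gelfand's formula: ||A^k||^(1/k) -> rho(A) as k -> infinity.
for k in (1, 10, 50, 200):
    est = np.linalg.norm(np.linalg.matrix_power(A, k), 2) ** (1.0 / k)
    print(k, est)
print("rho(A) =", rho)
```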

A matrix power series converges if the corresponding scalar series of norms converges (Theorem 5.6.15 on p. 300). Thus we can obtain Taylor expansions for matrix functions, e.g. $e^A$ (exercise on p. 300), $\cos A$ (exercise on p. 301), and $A^{-1}$ (Corollary 5.6.16 on p. 301). Further, given a diagonalization $A = S^{-1} \Lambda S$ with $\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_n)$, we have (exercise on p. 300):

$$f(A) \equiv S^{-1} \mathrm{diag}(f(\lambda_1), \dots, f(\lambda_n)) S$$
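
A minimal NumPy sketch of this identity (assuming $A$ is diagonalizable; the random test matrix and the truncation order of the series are arbitrary choices): compute $f(A)$ through the eigendecomposition and check it against a truncated Taylor series for $e^A$.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))  # generic, hence (almost surely) diagonalizable

# np.linalg.eig gives A = V diag(lam) V^{-1}; in the note's notation S = V^{-1}.
lam, V = np.linalg.eig(A)

def f_of_A(f):
    """f(A) = V diag(f(lam)) V^{-1}, assuming A is diagonalizable."""
    return V @ np.diag(f(lam)) @ np.linalg.inv(V)

# Truncated Taylor series: e^A ≈ sum_{k=0}^{K} A^k / k!.
K, term, expA_taylor = 40, np.eye(4), np.zeros((4, 4))
for k in range(K + 1):
    expA_taylor += term
    term = term @ A / (k + 1)

# The two constructions agree (up to truncation and rounding error).
assert np.allclose(f_of_A(np.exp), expA_taylor)
```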


🏷 Category=Algebra Category=Matrix Analysis