2. SVD 2


1. Keywords

  • Spectral Theorem
  • Symmetric Matrix
  • Positive Definite Matrix


2. Computing SVD

First, we form \(A A^{T}\) and \(A^{T} A\) and compute eigendecomposition of each:

\[ A A^{T}=U \Sigma V^{T} V \Sigma^{T} U^{T}=U \Sigma \Sigma^{T} U^{T}=U \Sigma^{2} U^{T} \]
\[ A^{T} A=V \Sigma^{T} U^{T} U \Sigma V^{T}=V \Sigma^{T} \Sigma V^{T}=V \Sigma^{2} V^{T} \]


Can we find the following?

  • Orthogonal eigenvector matrices \(U\) and \(V\)
  • Eigenvalues in \(\Sigma^{2}\) that are all non-negative
  • Nonzero eigenvalues in \(\Sigma^{2}\) that are shared by \(A A^{T}\) and \(A^{T} A\)

Yes, since \(A A^{T}\) and \(A^{T} A\) are symmetric positive (semi-)definite.
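
As a quick numerical sanity check of these claims, here is a minimal NumPy sketch; the random \(4 \times 3\) matrix and the seed are illustrative assumptions, not part of the derivation:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3))                      # arbitrary rectangular matrix

wu = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]  # 4 eigenvalues of A A^T, descending
wv = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]  # 3 eigenvalues of A^T A, descending
s = np.linalg.svd(A, compute_uv=False)           # singular values of A, descending

print(np.allclose(wv, s**2))      # True: eigenvalues of A^T A = squared singular values
print(np.allclose(wu[:3], s**2))  # True: the nonzero eigenvalues are shared
print(np.isclose(wu[3], 0.0))     # True: A A^T has one extra eigenvalue, which is zero
```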


3. Diagonalization of Symmetric Matrices

In general, \(A \in \mathbb{R}^{n \times n}\) is diagonalizable if and only if \(n\) linearly independent eigenvectors exist.

How about a symmetric matrix \(S \in \mathbb{R}^{n \times n}\), where \(S^{T}=S\)?

\(S\) is always diagonalizable.

Furthermore, \(S\) is orthogonally diagonalizable, meaning that its eigenvectors are not only linearly independent but also orthogonal to each other.
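
A short NumPy sketch of this fact; the symmetrization \(S = (B + B^{T})/2\) is just one illustrative way to obtain a symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(size=(4, 4))
S = (B + B.T) / 2                            # symmetrize: S^T = S

w, U = np.linalg.eigh(S)                     # eigendecomposition for symmetric matrices
print(np.allclose(U.T @ U, np.eye(4)))       # True: eigenvectors are orthonormal
print(np.allclose(U @ np.diag(w) @ U.T, S))  # True: S = U D U^T
```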


4. Spectral Theorem of Symmetric Matrices

Consider a symmetric matrix \(S \in \mathbb{R}^{n \times n}\), where \(S^{T}=S\).

  • \(S\) has \(n\) real eigenvalues, counting multiplicities.
  • The dimension of the eigenspace for each eigenvalue \(\lambda\) equals the multiplicity of \(\lambda\) as a root of the characteristic equation (illustrated in the sketch after this list).
  • The eigenspaces are mutually orthogonal. That is, eigenvectors corresponding to different eigenvalues are orthogonal.
  • To sum up, \(S\) is orthogonally diagonalizable.
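
In the following sketch, the matrix \(S\) is constructed (purely for illustration) to have eigenvalue 3 with multiplicity 2 and eigenvalue 7 with multiplicity 1, so the eigenspace dimensions and their orthogonality can be checked directly:

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
S = Q @ np.diag([3.0, 3.0, 7.0]) @ Q.T        # eigenvalue 3 has multiplicity 2

w, U = np.linalg.eigh(S)
print(np.round(w, 6))                # [3. 3. 7.]: eigenspace dimension = multiplicity
print(np.abs(U[:, :2].T @ U[:, 2]))  # ~[0 0]: the two eigenspaces are orthogonal
```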


5. Spectral Decomposition

Eigendecomposition of a symmetric matrix, also known as spectral decomposition, is represented as

\[ S=U D U^{-1}=U D U^{T}=\left[\begin{array}{llll}\mathbf{u}_{1} & \mathbf{u}_{2} & \cdots & \mathbf{u}_{n}\end{array}\right]\left[\begin{array}{cccc}\lambda_{1} & 0 & \cdots & 0 \\ 0 & \lambda_{2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \cdots & 0 & \lambda_{n}\end{array}\right]\left[\begin{array}{c}\mathbf{u}_{1}^{T} \\ \mathbf{u}_{2}^{T} \\ \vdots \\ \mathbf{u}_{n}^{T}\end{array}\right]=\left[\begin{array}{llll} \lambda_{1} \mathbf{u}_{1} & \lambda_{2} \mathbf{u}_{2} & \cdots & \lambda_{n} \mathbf{u}_{n} \end{array}\right]\left[\begin{array}{c} \mathbf{u}_{1}^{T} \\ \mathbf{u}_{2}^{T} \\ \vdots \\ \mathbf{u}_{n}^{T} \end{array}\right]=\lambda_{1} \mathbf{u}_{1} \mathbf{u}_{1}^{T}+\lambda_{2} \mathbf{u}_{2} \mathbf{u}_{2}^{T}+\cdots+\lambda_{n} \mathbf{u}_{n} \mathbf{u}_{n}^{T} \]

Each term \(\lambda_{j} \mathbf{u}_{j} \mathbf{u}_{j}^{T}\) can be viewed as a projection matrix onto the subspace spanned by \(\mathbf{u}_{j}\), scaled by its eigenvalue \(\lambda_{j}\).
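
A minimal NumPy sketch of this rank-one expansion (the random symmetric matrix is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(size=(4, 4))
S = (B + B.T) / 2                      # symmetric test matrix

w, U = np.linalg.eigh(S)
# Rebuild S as a sum of rank-one projections u_j u_j^T, each scaled by lambda_j.
S_sum = sum(w[j] * np.outer(U[:, j], U[:, j]) for j in range(len(w)))
print(np.allclose(S, S_sum))           # True: the spectral decomposition holds

P = np.outer(U[:, 0], U[:, 0])
print(np.allclose(P @ P, P))           # True: u_1 u_1^T is indeed a projection
```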


6. Positive Definite Matrices

Definition: \(A \in \mathbb{R}^{n \times n}\) is positive definite if \(\mathbf{x}^{T} A \mathbf{x}>0\), \(\forall \mathbf{x} \neq \mathbf{0}\).

Definition: \(A \in \mathbb{R}^{n \times n}\) is positive semi-definite if \(\mathbf{x}^{T} A \mathbf{x} \geq 0\), \(\forall \mathbf{x} \neq \mathbf{0}\).


Theorem: a symmetric matrix \(A \in \mathbb{R}^{n \times n}\) is positive definite if and only if the eigenvalues of \(A\) are all positive.
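
This theorem suggests a simple eigenvalue-based test. The sketch below assumes a symmetric input (so that `np.linalg.eigvalsh` applies); the tolerance is an illustrative choice:

```python
import numpy as np

def is_positive_definite(S, tol=1e-10):
    """Eigenvalue test for a symmetric matrix S (assumes S^T = S)."""
    return bool(np.all(np.linalg.eigvalsh(S) > tol))

S = np.array([[2.0, -1.0],
              [-1.0, 2.0]])            # eigenvalues 1 and 3
print(is_positive_definite(S))         # True
print(is_positive_definite(-S))        # False: eigenvalues -1 and -3
```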


7. Symmetric Positive Definite Matrices

If \(S \in \mathbb{R}^{n \times n}\) is symmetric and positive-definite, then the spectral decomposition will have all positive eigenvalues:

\[ S=U D U^{T}=\left[\begin{array}{llll}\mathbf{u}_{1} & \mathbf{u}_{2} & \cdots & \mathbf{u}_{n}\end{array}\right]\left[\begin{array}{cccc}\lambda_{1} & 0 & \cdots & 0 \\0 & \lambda_{2} & \ddots & \vdots \\\vdots & \ddots & \ddots & 0 \\0 & \cdots & 0 & \lambda_{n}\end{array}\right]\left[\begin{array}{c}\mathbf{u}_{1}^{T} \\\mathbf{u}_{2}^{T} \\\vdots \\\mathbf{u}_{n}^{T}\end{array}\right]=\lambda_{1} \mathbf{u}_{1} \mathbf{u}_{1}^{T}+\lambda_{2} \mathbf{u}_{2} \mathbf{u}_{2}^{T}+\cdots+\lambda_{n} \mathbf{u}_{n} \mathbf{u}_{n}^{T} \text{ where } \lambda_{j}>0, \forall j=1, \cdots, n \]
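
One common way to construct such a matrix, shown here as an illustrative sketch, is \(S = B^{T} B + \epsilon I\) with \(\epsilon > 0\), which is symmetric positive definite by construction:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(size=(4, 4))
S = B.T @ B + 1e-3 * np.eye(4)               # symmetric positive definite

w, U = np.linalg.eigh(S)
print(np.all(w > 0))                         # True: all eigenvalues are positive
print(np.allclose(U @ np.diag(w) @ U.T, S))  # True: S = U D U^T
```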


8. Back to Computing SVD

In the following:

\[ A A^{T}=U \Sigma V^{T} V \Sigma^{T} U^{T}=U \Sigma \Sigma^{T} U^{T}=U \Sigma^{2} U^{T} \]
\[ A^{T} A=V \Sigma^{T} U^{T} U \Sigma V^{T}=V \Sigma^{T} \Sigma V^{T}=V \Sigma^{2} V^{T} \]

Can we prove that both \(A A^{T}\) and \(A^{T} A\) are symmetric positive-(semi-)definite?

Symmetric: \(\left(A A^{T}\right)^{T}=A A^{T} \text { and }\left(A^{T} A\right)^{T}=A^{T} A\)

Positive-(semi-)definite

  • \(\mathbf{x}^{T} A A^{T} \mathbf{x}=\left(A^{T} \mathbf{x}\right)^{T}\left(A^{T} \mathbf{x}\right)=\left\|A^{T} \mathbf{x}\right\|^{2} \geq 0\)
  • \(\mathbf{x}^{T} A^{T} A \mathbf{x}=(A \mathbf{x})^{T}(A \mathbf{x})=\|A \mathbf{x}\|^{2} \geq 0\)

Thus, we can find

  • Orthogonal eigenvector matrices \(U\) and \(V\).
  • Eigenvalues in \(\Sigma^{2}\) that are all non-negative.
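
Both facts can be checked numerically as well; the following NumPy sketch uses an illustrative random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3))

# The quadratic form equals a squared norm, hence is non-negative.
x = rng.normal(size=4)
print(np.isclose(x @ (A @ A.T) @ x, np.linalg.norm(A.T @ x)**2))  # True

wu, U = np.linalg.eigh(A @ A.T)
wv, V = np.linalg.eigh(A.T @ A)
print(np.allclose(U @ U.T, np.eye(4)))           # True: U is orthogonal
print(np.allclose(V @ V.T, np.eye(3)))           # True: V is orthogonal
print(np.all(wu > -1e-10), np.all(wv > -1e-10))  # True True: eigenvalues non-negative
```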


9. Things to Note

Given any rectangular matrix \(A \in \mathbb{R}^{m \times n}\), its SVD always exists.

Given a square matrix \(A \in \mathbb{R}^{n \times n}\), its eigendecomposition does not always exist, but its SVD always exists.

Given a square, symmetric positive (semi-)definite matrix \(S \in \mathbb{R}^{n \times n}\), its eigendecomposition always exists, and it is actually the same as its SVD.
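
A small example of the second point: the nilpotent matrix below is not diagonalizable (its repeated eigenvalue 0 has only a one-dimensional eigenspace), yet its SVD exists:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])                  # defective: not diagonalizable

U, s, Vt = np.linalg.svd(A)
print(s)                                    # [1. 0.]
print(np.allclose(U @ np.diag(s) @ Vt, A))  # True: the SVD always exists
```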

