SVD

Singular value decomposition

Singular value decomposition, which people usually just call SVD, is a way to break up any $m \times n$ (rectangular) matrix into three pieces:

$$A = U\Sigma V^{T}$$

Where

$U$ is an $m \times m$ orthogonal matrix

$\Sigma$ is an $m \times n$ diagonal matrix holding the singular values

$V^{T}$ is an $n \times n$ orthogonal matrix
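As a quick sketch (the matrix here is my own toy example, not from the post), NumPy's `np.linalg.svd` returns exactly these three pieces, and multiplying them back should reproduce $A$:

```python
import numpy as np

# A small 3x2 rectangular matrix to decompose.
A = np.array([[3.0, 2.0],
              [2.0, 3.0],
              [2.0, -2.0]])

# full_matrices=True gives U as m x m and Vt as n x n, matching the text.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# s holds only the singular values; embed them in an m x n Sigma.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# U Sigma V^T should rebuild A up to floating-point error.
print(np.allclose(A, U @ Sigma @ Vt))  # True
```

Note that `s` comes back as a 1-D array of singular values, so we have to place it on the diagonal of a full $m \times n$ matrix ourselves to match the formula above.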

Note that we can rewrite any symmetric matrix as:

$$B = V\Lambda V^{T}$$

Where the columns of $V$ are the eigenvectors of $B$ (so $V^{T} = V^{-1}$) and $\Lambda$ is the diagonal matrix of eigenvalues, ordered from the largest value at the top-left to the smallest at the bottom-right.
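A minimal check of this eigendecomposition (again with a made-up symmetric matrix), reordering the eigenvalues to match the largest-first convention above:

```python
import numpy as np

# A small symmetric matrix.
B = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# eigh is for symmetric matrices; it returns eigenvalues in ascending order.
eigvals, V = np.linalg.eigh(B)

# Reorder so the largest eigenvalue sits at the top-left of Lambda.
order = np.argsort(eigvals)[::-1]
Lam = np.diag(eigvals[order])
V = V[:, order]

# V Lambda V^T should rebuild B.
print(np.allclose(B, V @ Lam @ V.T))  # True
```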

What is the difference between these two decompositions? Let's see:

$$A^{T}A = (V\Sigma^{T}U^{T})(U\Sigma V^{T}) = V\Sigma^{T}\Sigma V^{T}$$

since $U$ is an orthogonal matrix, so $U^{T}U = I$.

From this we can see that $A^{T}A$ is a square matrix and is positive semidefinite, since it is the product of $A^{T}$ with $A$, so its eigenvalues are greater than or equal to 0. The columns of $V$ are the eigenvectors of $A^{T}A$, and $\Sigma^{T}\Sigma$ holds the eigenvalues $\lambda$ of $A^{T}A$, which are the squared singular values $\sigma^{2}$ of $A$.
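This relation between the two decompositions is easy to verify numerically: the eigenvalues of $A^{T}A$ should equal the squared singular values of $A$ (the matrix below is an illustrative example of mine):

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [2.0, 3.0],
              [2.0, -2.0]])

# Singular values of A, returned in descending order.
s = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of A^T A; eigvalsh returns ascending order, so reverse.
lam = np.linalg.eigvalsh(A.T @ A)[::-1]

# lambda_i = sigma_i^2
print(np.allclose(lam, s**2))  # True
```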

A matrix with determinant 0 is a singular matrix. An orthogonal matrix has determinant $\pm 1$, so for a square matrix $A$ the product of the diagonal entries of $\Sigma$ equals $|\det A|$.
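A quick sanity check of the determinant fact, using another small made-up square matrix:

```python
import numpy as np

# A 2x2 square matrix with det = 2*3 - 1*1 = 5.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

s = np.linalg.svd(A, compute_uv=False)

# |det A| equals the product of the singular values.
print(np.isclose(abs(np.linalg.det(A)), np.prod(s)))  # True
```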

Also note that the eigenvalue equation $Ax = \lambda x$ only makes sense for a square matrix, while the SVD works for any rectangular matrix.

Author: shixuan liu
Link: http://tedlsx.github.io/2019/09/04/svd/
Copyright Notice: All articles in this blog are licensed under CC BY-NC-SA 4.0 unless stated otherwise.