Eigen Decomposition, SVD, PCA
Eigen Decomposition
A (non-zero) vector $v$ of dimension $N$ is an eigenvector of a square $N \times N$ matrix $A$ if it satisfies the linear equation:
$Av = \lambda v$
A complex normal matrix $A$ has an orthonormal basis of eigenvectors and can be decomposed as
$A = U \Lambda U^{\ast}$
Every $N \times N$ real symmetric matrix has real eigenvalues, and its eigenvectors can be chosen to be orthogonal to each other, so it can be decomposed as
$A = Q \Lambda Q^{-1}$
where $Q$ is orthogonal, so $Q^{-1} = Q^{T}$.
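As a quick sanity check, here is a minimal NumPy sketch (the matrix `A` below is arbitrary illustrative data, not taken from the references) that decomposes a real symmetric matrix and verifies both $Av = \lambda v$ and $A = Q \Lambda Q^{T}$:

```python
import numpy as np

# A small real symmetric matrix (hypothetical example values).
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])

# eigh is specialized for symmetric/Hermitian matrices; it returns
# real eigenvalues (ascending) and orthonormal eigenvectors as columns.
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# Check the defining equation Av = λv for the first eigenpair.
print(np.allclose(A @ Q[:, 0], eigvals[0] * Q[:, 0]))  # True

# Because Q is orthogonal, Q^{-1} = Q^T, so A = Q Λ Q^T.
print(np.allclose(A, Q @ Lam @ Q.T))  # True
```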
SVD
Suppose $C$ is an $M\times N$ matrix,
- $U$ is an $M\times M$ matrix whose columns are the orthogonal eigenvectors of $CC^{T}$,
- $V$ is an $N\times N$ matrix whose columns are the orthogonal eigenvectors of $C^{T}C$.
Further suppose $r$ is the rank of $C$; then the singular value decomposition exists:
$C=U\Sigma V^{T}$
$CC^{T}$ and $C^{T}C$ share the same non-zero eigenvalues. $\Sigma$ is $M\times N$ with $\Sigma_{ii}=\sqrt{\lambda_{i}}$ and zeros everywhere else.
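A short NumPy sketch of this decomposition (the matrix `C` is made-up illustrative data): it rebuilds the $M\times N$ matrix $\Sigma$, reconstructs $C = U\Sigma V^{T}$, and checks that the squared singular values match the eigenvalues of $C^{T}C$:

```python
import numpy as np

# An arbitrary M x N matrix (illustrative values), M = 3, N = 2.
C = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])
M, N = C.shape

U, s, Vt = np.linalg.svd(C)   # full SVD: U is M x M, Vt is N x N

# Rebuild the M x N Σ with σ_i = sqrt(λ_i) on the diagonal.
Sigma = np.zeros((M, N))
Sigma[:N, :N] = np.diag(s)
print(np.allclose(C, U @ Sigma @ Vt))  # True

# The non-zero eigenvalues of C^T C (and of CC^T) equal σ_i².
lam = np.linalg.eigvalsh(C.T @ C)[::-1]  # descending order
print(np.allclose(s**2, lam))            # True
```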
PCA
PCA is an algorithm based on SVD.
The PCA procedure is:
- Center the features, i.e., subtract each dimension's mean from the data in that dimension.
- Compute the covariance matrix $C$.
- Compute the eigenvalues and eigenvectors of the covariance matrix $C$.
- Keep the eigenvectors corresponding to the largest eigenvalues and project the data onto them to obtain the new, lower-dimensional data set.
PCA is not merely dimensionality reduction for high-dimensional data; more importantly, the reduction removes noise and reveals patterns in the data.
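A minimal sketch of the four steps above in NumPy (the function name `pca` and the sample data are hypothetical, chosen only for illustration):

```python
import numpy as np

def pca(X, k):
    """PCA following the steps above: center, covariance,
    eigendecomposition, project onto the top-k eigenvectors.
    X has one sample per row; k is the target dimension."""
    Xc = X - X.mean(axis=0)               # 1. center each feature
    C = np.cov(Xc, rowvar=False)          # 2. covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)  # 3. eigenvalues/eigenvectors
    order = np.argsort(eigvals)[::-1]     # sort descending by eigenvalue
    W = eigvecs[:, order[:k]]             # 4. top-k eigenvectors
    return Xc @ W                         # projected (reduced) data

# Hypothetical correlated 2-D data reduced to 1-D.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[2.0, 0.5], [0.5, 1.0]])
print(pca(X, 1).shape)  # (100, 1)
```

In practice, the same projection is often computed directly from the SVD of the centered data matrix rather than by forming the covariance matrix explicitly, which is the SVD connection noted above and is numerically more stable.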
Reference
- eigen decomposition: https://en.wikipedia.org/wiki/Eigendecomposition_of_a_matrix
- SVD: http://blog.csdn.net/wangran51/article/details/7408414/
- SVD: http://www.cnblogs.com/LeftNotEasy/archive/2011/01/19/svd-and-applications.html
- PCA: http://www.cnblogs.com/zhangchaoyang/articles/2222048.html