Deep learning: 11 (Exercise on PCA and whitening with 2D data)


 

Preface:

This post is a hands-on exercise with PCA, PCA whitening, and ZCA whitening on 2D data. The data set consists of 45 data points, each of which is 2-dimensional. The reference material is Exercise:PCA in 2D. Combined with the theory from the earlier post Deep learning:十(PCA和whitening), it should give a better understanding of what PCA and whitening actually do.

 

Some MATLAB functions used:

  scatter:

scatter(X,Y,<S>,<C>,'<type>');
<S> – marker size: if given as a vector the same length as X and Y, each value sets the size of the corresponding point; if given as a scalar or omitted, all points share one size.
<C> – marker color: if given as a vector the same length as X and Y, colors are distributed linearly by value through the colormap; if given as a matrix with one RGB triple per point, each point gets that exact color ([0,0,0] is black, [1,1,1] is white); if omitted, all points share one color.
<type> – marker style: pass filled for solid markers; by default hollow circles are drawn. A short usage example follows this list.
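A minimal sketch of these options (my own illustration; the variable names are made up and are not from the exercise):

n = 45;
X = randn(1, n); Y = randn(1, n);        % random 2D points
S = 20 + 40 * sqrt(X.^2 + Y.^2);         % marker size grows with distance from the origin
C = sqrt(X.^2 + Y.^2);                   % color value mapped linearly through the colormap
figure; scatter(X, Y, S, C, 'filled');   % filled markers, per-point size and color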

  plot:

plot can also draw line segments. For example, plot([1 2],[0 4]) draws a straight line from (1,0) to (2,4); note how the coordinates pair up: the first argument holds the x-coordinates and the second the y-coordinates. A short example is shown below.
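For instance, this is a sketch of how the exercise code later draws two directions out of the origin (the endpoint values here are illustrative only):

figure; hold on
plot([0 1], [0 2]);    % segment from (0,0) to (1,2)
plot([0 -2], [0 1]);   % segment from (0,0) to (-2,1)
hold off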

 

Experiment:

1. First, download the 2D data set. Because it is stored as plain text, it is loaded in ASCII mode. Then compute the covariance matrix of the input samples and its SVD, which yields the eigenvector basis, and draw the two principal directions on top of the original data points, as shown below:

[Figure 1: raw data with the two principal directions]
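Concretely (the mean-subtraction step is commented out in the code below, so the data is treated as already zero-mean), the covariance matrix and its decomposition are

$\Sigma = \frac{1}{m}\sum_{i=1}^{m} x^{(i)} \big(x^{(i)}\big)^{T} = \frac{1}{m} X X^{T}, \qquad \Sigma = U S V^{T},$

where the columns $u_1, u_2$ of $U$ are the principal directions drawn in the figure and the diagonal of $S$ holds the corresponding eigenvalues.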

2. Project the data onto the eigenbasis (i.e. rotate it with PCA; no dimensions are dropped yet) and plot the resulting points, as shown below:

[Figure 2: xRot, the data rotated into the eigenbasis]

3. Reduce the rotated data to one dimension and use it to reconstruct an approximation of the original data; the result is shown below:

[Figure 3: xHat, the data reconstructed from the first principal component]
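In formulas, the rotation and the rank-k reconstruction (here k = 1) are

$x_{\rm rot} = U^{T} x, \qquad \hat{x} = U_{:,1:k}\, U_{:,1:k}^{T}\, x,$

so every reconstructed point lies on the line spanned by the first principal direction $u_1$, which is why Figure 3 collapses onto a line.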

4. Apply PCA whitening; the distribution of the whitened data looks like this:

[Figure 4: xPCAWhite]

5. Apply ZCA whitening; the resulting distribution is shown below:

[Figure 5: xZCAWhite]

Both PCA whitening and ZCA whitening equalize the variance across dimensions (each dimension ends up with approximately unit variance); they differ in the basis in which the result is expressed: ZCA whitening rotates the whitened data back into the original coordinate system, so its output stays as close as possible to the original data.
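The two transforms, as implemented in the code below (with a small $\epsilon$ added for numerical stability), are

$x_{\rm PCAwhite} = \mathrm{diag}\!\left(\frac{1}{\sqrt{\lambda_i + \epsilon}}\right) U^{T} x, \qquad x_{\rm ZCAwhite} = U\, x_{\rm PCAwhite},$

where $\lambda_i$ are the eigenvalues on the diagonal of $S$; the extra left-multiplication by $U$ is exactly the rotation back into the original coordinates.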

 

Code:

close all

%%================================================================
%% Step 0: Load data
%  We have provided the code to load data from pcaData.txt into x.
%  x is a 2 * 45 matrix, where the kth column x(:,k) corresponds to
%  the kth data point.
%  You do not need to change the code below.

x = load('pcaData.txt','-ascii');
figure(1);
scatter(x(1, :), x(2, :));
title('Raw data');


%%================================================================
%% Step 1a: Implement PCA to obtain U 
%  Implement PCA to obtain the rotation matrix U, which is the eigenbasis
%  of sigma, the covariance matrix of the data.

% -------------------- YOUR CODE HERE -------------------- 
u = zeros(size(x, 1)); % You need to compute this
[n, m] = size(x); % n = 2 dimensions, m = 45 samples
% x = x - repmat(mean(x, 2), 1, m); % preprocessing: remove the mean (commented out; the supplied data is already zero-mean)
sigma = (1.0/m) * x * x'; % covariance matrix of the zero-mean data
[u, s, v] = svd(sigma);   % columns of u are the principal directions


% -------------------------------------------------------- 
hold on
plot([0 u(1,1)], [0 u(2,1)]); % draw the first principal direction
plot([0 u(1,2)], [0 u(2,2)]); % draw the second principal direction
scatter(x(1, :), x(2, :));
hold off

%%================================================================
%% Step 1b: Compute xRot, the projection on to the eigenbasis
%  Now, compute xRot by projecting the data on to the basis defined
%  by U. Visualize the points by performing a scatter plot.

% -------------------- YOUR CODE HERE -------------------- 
xRot = zeros(size(x)); % You need to compute this
xRot = u'*x; % rotate the data into the eigenbasis


% -------------------------------------------------------- 

% Visualise the rotated data. The cloud should look like the original data
% rotated so that the principal directions line up with the coordinate axes.
figure(2);
scatter(xRot(1, :), xRot(2, :));
title('xRot');

%%================================================================
%% Step 2: Reduce the number of dimensions from 2 to 1. 
%  Compute xRot again (this time projecting to 1 dimension).
%  Then, compute xHat by projecting the xRot back onto the original axes 
%  to see the effect of dimension reduction

% -------------------- YOUR CODE HERE -------------------- 
k = 1; % use k = 1: project the data onto the first principal direction only
xHat = zeros(size(x)); % You need to compute this
xHat = u(:, 1:k) * (u(:, 1:k)' * x); % keep the top k components, then map back to the original space


% -------------------------------------------------------- 
figure(3);
scatter(xHat(1, :), xHat(2, :));
title('xHat');


%%================================================================
%% Step 3: PCA Whitening
%  Compute xPCAWhite and plot the results.

epsilon = 1e-5; % small constant to avoid dividing by (near-)zero eigenvalues
% -------------------- YOUR CODE HERE -------------------- 
xPCAWhite = zeros(size(x)); % You need to compute this
xPCAWhite = diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x; % rescale each rotated dimension by 1/sqrt(lambda_i + epsilon)



% -------------------------------------------------------- 
figure(4);
scatter(xPCAWhite(1, :), xPCAWhite(2, :));
title('xPCAWhite');

%%================================================================
%% Step 4: ZCA Whitening
%  Compute xZCAWhite and plot the results.

% -------------------- YOUR CODE HERE -------------------- 
xZCAWhite = zeros(size(x)); % You need to compute this
xZCAWhite = u * diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x; % PCA whitening followed by a rotation back into the original coordinates

% -------------------------------------------------------- 
figure(5);
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('xZCAWhite');

%% Congratulations! When you have reached this point, you are done!
%  You can now move onto the next PCA exercise. :)
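As a quick sanity check (my addition, not part of the original exercise script), the covariance of either whitened version should be close to the identity matrix; running the following after the script makes that easy to verify:

% both whitened data sets should have (near-)identity covariance
covPCA = xPCAWhite * xPCAWhite' / m;
covZCA = xZCAWhite * xZCAWhite' / m;
disp(covPCA);   % expect approximately [1 0; 0 1]
disp(covZCA);   % expect approximately [1 0; 0 1]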

 

 

References:

     Exercise:PCA in 2D

Deep learning:十(PCA和whitening)