# Standard PCA code, finding the eigenvalues of a non-square matrix

8 views (last 30 days)
Neo on 22 Dec 2015
Commented: Star Strider on 1 Jan 2016
In a PCA algorithm, you need to find the eigenvalues of a covariance matrix in order to derive the first algebraic solution to PCA using linear algebra. Quick synopsis:
X is a data set (m x n).
m and n are not always equal: m is the number of measurement types, n is the number of samples.
We find some orthonormal matrix P with Y = PX such that the covariance matrix C (the matrix of variances between the data sets) = (normalization constant) * Y * Y' is diagonalized.
Now I need to find the eigenvalues of the covariance matrix, but in this code the matrix is not square, which cannot be the case if the code is to fulfill its purpose. Can someone who is familiar with the PCA algorithm tell me what I am misunderstanding or have wrong? The problem is in the very last line of the code; the rest is included for background.
PCA Code:
```matlab
[irow, icol] = size(I);
data = reshape(I',irow*icol,1);
% [sR ,sC ,eR ,eC] = deal(1, 3, 2, 4);
% Compute the sum over the region using the integral image.
% PCA1: Perform PCA using covariance.
% data - MxN matrix of input data
% (M dimensions, N trials)
% signals - MxN matrix of projected data
% PC - each column is a PC
% V - Mx1 matrix of variances
[M,N] = size(data);
% subtract off the mean for each dimension
mn = mean(data,2);
data = double(data)
data = data - repmat(mn,1,N);
% calculate the covariance matrix
covariance = 1 / (N-1) * (data) .* (data);
% find the eigenvectors and eigenvalues
[PC, V] = eig(covariance);
```
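For context, the shape problem in the code above can be seen directly from the dimensions involved. The sketch below is my own illustration with made-up stand-in images (not the original `I`): with a single reshaped image, `data` is an (irow*icol)-by-1 column vector, so `N = 1`, the mean subtraction zeroes everything out, and the elementwise `.*` squares each entry instead of forming a matrix product. A square covariance matrix only appears once several observations are stacked as columns and the matrix product `data * data'` is used:

```matlab
% Assumed setup: three tiny stand-in "images" of the same size
I1 = magic(4); I2 = magic(4)'; I3 = rand(4)*16;

% Each image becomes one column of the data matrix (M pixels x N images)
data = double([I1(:), I2(:), I3(:)]);
[M, N] = size(data);                    % here M = 16, N = 3

% Center each pixel (row) across the N images
data = data - repmat(mean(data, 2), 1, N);

% Matrix product, not elementwise '.*': yields an M x M square matrix
covariance = (1 / (N - 1)) * (data * data');

size(covariance)                        % 16 x 16 -- square, so eig() applies
[PC, V] = eig(covariance);
```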
##### 11 Comments
jgg on 1 Jan 2016
Edited: jgg on 1 Jan 2016
Again, I still don't think you understand what you're trying to do. I'm pretty sure that at this point: data = data - repmat(mn,1,N); you will have a matrix of zeroes.
Basically, you're trying to calculate the covariance of a single observation; this doesn't make sense.
What you want to do is something like this:
```matlab
[irow, icol] = size(I);
[irow2, icol2] = size(I2); %this needs to be the same as I
d = reshape(I',irow*icol,1);
d2 = reshape(I2',irow*icol,1);
data = [d, d2];
mn = mean(data,2);
[M,N] = size(data);
data = data - repmat(mn,1,N);
covariance = 1 / (M-1) * (data')*(data);
```
You can then calculate the eigenvectors from a transform of this, or from 1 / (N-1) * (data)*(data') (which is usually not a good idea for images, since that matrix is huge).
You probably want far more than two images, though; it's essentially impossible to isolate the principal components of a dataset with two observations.
I think that this lecture is a very clear illustration of the procedure if you want a reference.
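The "transform" mentioned above is sometimes called the snapshot trick. The sketch below is my own illustration with random stand-in data (the variable names are mine): eigenvectors of the small N x N matrix `data' * data` can be mapped back to eigenvectors of the large M x M matrix `data * data'`, since if (data'*data)*v = lambda*v, then (data*data')*(data*v) = lambda*(data*v):

```matlab
% Assumed data: M = 1000 "pixels", N = 5 "images", centered random values
M = 1000; N = 5;
data = randn(M, N);
data = data - repmat(mean(data, 2), 1, N);

% Solve the small N x N eigenproblem instead of the huge M x M one
smallCov = (1 / (N - 1)) * (data' * data);
[Vsmall, D] = eig(smallCov);

% Map each small eigenvector v back to data*v, an eigenvector of data*data'
PC = data * Vsmall;                         % columns: unnormalized PCs
PC = PC ./ repmat(sqrt(sum(PC.^2)), M, 1);  % normalize each column
```

With two 4-megapixel images this turns a 4e6-by-4e6 eigenproblem into a 2-by-2 one, which is why the small-matrix form is the usual choice for image data.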
Star Strider on 1 Jan 2016
This seems to be a related problem.

