Using SVD for Dimensionality Reduction
17 views (last 30 days)
Serra Aksoy on 29 Mar 2021
Answered: Mahesh Taparia on 2 Apr 2021
Hello everyone.
I have a matrix that has 300 rows (samples) and 5000 columns (features).
I need to reduce the number of columns for classification.
As far as I know, for the pca() function the number of samples should be greater than the number of features.
So I tried to use the singular value decomposition function with the code below.
% Singular value decomposition of X
[U, Sig, V] = svd(X);
sv = diag(Sig);                % vector of singular values
% Distribution of the singular values
figure;
sv = sv/sum(sv);               % normalize so the values sum to 1
stairs(cumsum(sv));
xlabel('singular values');
ylabel('cumulative sum');
![Cumulative sum of the normalized singular values](https://www.mathworks.com/matlabcentral/answers/uploaded_files/566769/image.png)
I have two questions.
1) As I understand from the figure above, I would have to take approximately the first 250 singular values to account for 95% of my data.
So should I take the first 250 singular values to create the new data for classification?
How can I see the variance of each principal component, like the explained output of pca(), to decide how many of them I should use?
2) After choosing the number of principal components, I need to create a new matrix for classification.
Can I do this with the code below? (for example, with the first two principal components)
new_matrix_for_classification = X*V(:,1:2);
Thanks in advance.
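Edit: to make the intent behind both questions concrete, here is a rough sketch of the pipeline as I currently understand it. The column-centering step and the 'econ' option are my own additions and may not be necessary, so please correct me if they are wrong:
Xc = X - mean(X, 1);             % center each feature (column)
[U, Sig, V] = svd(Xc, 'econ');   % economy-size SVD: Sig is 300-by-300 here
sv = diag(Sig);                  % singular values
var_explained = 100 * sv.^2 / sum(sv.^2);        % variance proportion per component, like pca()'s explained
k = find(cumsum(var_explained) >= 95, 1);        % smallest k that keeps at least 95% of the variance
new_matrix_for_classification = Xc * V(:, 1:k);  % 300-by-k projection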
0 Comments
Accepted Answer
Mahesh Taparia on 2 Apr 2021
Hi,
For the 2nd part, you can use the pca function to directly compute the representation of the input in the principal component space. For example, in your case, if you want the first 2 components:
[coeff,score,latent] = pca(X);
new_matrix_for_classification = score(:,1:2); %score is representation in new space
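As a side note on the 1st part: the latent output above is the variance of each principal component, so something like the untested sketch below should give percentages comparable to pca's explained output and let you choose the number of components:
explained = 100 * latent / sum(latent);       % percent of total variance per component
k = find(cumsum(explained) >= 95, 1);         % components needed to reach 95% variance
new_matrix_for_classification = score(:,1:k);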
Hope it will help!
0 Comments
More Answers (0)