Correct way of using kfoldLoss
I am trying to do cross-validation with a k-NN classifier. In the past I used cvpartition for this, but I recently found the kfoldLoss function, and using it seems much easier.
Below is my code, where I use fitcknn to classify breast cancer data (from NIPS) and then want to do 10-fold CV. My question: when I call kfoldLoss, does it run 10-fold CV, re-training and testing on the partitioned data for each fold, or does it just reuse the same trained fitcknn model 'Mdl' again and again? And if it does re-train k-NN for each partition, do I still need to call fitcknn on the complete data? That seems pointless.
Mdl = fitcknn(breast.sel, breast.labels, 'NumNeighbors', 3, 'KFold', 10);  % cross-validated model
kl = kfoldLoss(Mdl)  % misclassification rate averaged over the 10 folds
Answers (1)
Carl
on 10 Oct 2017
The documentation page below has a good explanation of how kFold* functions work on cross-validated models:
To answer your questions:
- kfoldLoss uses an already trained, cross-validated model; it does not re-train anything. When you pass 'KFold', 10 to fitcknn, it has already trained one model per fold, each on the other nine folds' data.
- The loss is then calculated on the held-out validation data for each fold of the cross-validated model.
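To see the per-fold behavior directly, kfoldLoss accepts a 'Mode' name-value pair that returns the loss for each fold separately instead of the average:

```matlab
% Average misclassification rate over all 10 folds (the default)
avgLoss = kfoldLoss(Mdl);

% One loss value per fold, each computed on that fold's held-out data
foldLosses = kfoldLoss(Mdl, 'Mode', 'individual');
```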
1 Comment
Ali Yar Khan
on 5 Feb 2020
I want to get the total number of test instances in each fold, as well as the number of misclassified and correctly classified instances, from the model. How can I do that?
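One way to get those counts is a sketch like the following, assuming Mdl is the cross-validated model from the question and the labels support == comparison (for cell arrays of character vectors, use strcmp instead). kfoldPredict returns the prediction made for each observation by the model that did not see it during training, and Mdl.Partition exposes the underlying cvpartition:

```matlab
pred = kfoldPredict(Mdl);   % out-of-fold prediction for every observation
cvp  = Mdl.Partition;       % the cvpartition object used for the folds

for k = 1:cvp.NumTestSets
    idx = test(cvp, k);                       % logical index of fold k's test set
    nTest    = sum(idx);                      % total test instances in this fold
    nCorrect = sum(pred(idx) == Mdl.Y(idx));  % correctly classified in this fold
    nWrong   = nTest - nCorrect;              % misclassified in this fold
    fprintf('Fold %d: %d test, %d correct, %d misclassified\n', ...
            k, nTest, nCorrect, nWrong);
end
```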