SVM Cross Validation Training
I am using K-fold cross-validation with K = 10.
I am supposed to run the 10 folds and take the average of the SVM performance.
How should I do this? Does running the cross-validation once generate a prediction for only one fold, or a complete 10-fold prediction?
1 Comment
Mohammad Sami
on 8 May 2020
According to the documentation, the reported value is the average over all folds:
https://www.mathworks.com/help/releases/R2020a/stats/select-data-and-validation-for-classification-problem.html
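To see that averaging directly, here is a minimal sketch (assuming the same ionosphere example data used in the answer below); the 'Mode','individual' option of "kfoldLoss" returns one loss per fold, and their mean should agree closely with the single value returned by the default 'average' mode:
load ionosphere
SVMModel   = fitcsvm(X, Y, 'Standardize', true, 'KernelFunction', 'RBF', 'KernelScale', 'auto');
CVSVMModel = crossval(SVMModel);                           % 10 folds by default
foldLosses = kfoldLoss(CVSVMModel, 'Mode', 'individual');  % one misclassification rate per fold (10x1 vector)
avgLoss    = kfoldLoss(CVSVMModel);                        % default 'average' mode: a single scalar
disp(foldLosses')                                          % per-fold losses
disp([mean(foldLosses) avgLoss])                           % the two averages should agree closely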
Answers (1)
Gayathri
on 3 Jan 2025
I understand that you need to perform K-fold cross-validation for an SVM model. For this purpose you can use the "crossval" function, and then the "kfoldLoss" function to obtain the classification loss of the cross-validated classification model. Please refer to the code below, which implements this.
load ionosphere
% Train an SVM classifier using the radial basis function (RBF) kernel
SVMModel = fitcsvm(X,Y,'Standardize',true,'KernelFunction','RBF','KernelScale','auto');
% Cross-validate the SVM classifier (10 folds by default)
CVSVMModel = crossval(SVMModel);
% Estimate the out-of-sample misclassification rate, averaged over the folds
classLoss = kfoldLoss(CVSVMModel)
"crossval" by default uses 10-fold cross-validation.
Please refer to the "Train and Cross-Validate SVM Classifier" example in the "fitcsvm" documentation.
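Regarding the original question of whether one run produces a single-fold or a complete 10-fold prediction: a single call to "crossval" trains all 10 fold models, and "kfoldPredict" then returns a held-out prediction for every observation, each made by the model trained without that observation's fold. A minimal sketch, again assuming the ionosphere data:
load ionosphere
SVMModel   = fitcsvm(X, Y, 'Standardize', true, 'KernelFunction', 'RBF', 'KernelScale', 'auto');
CVSVMModel = crossval(SVMModel);          % 10-fold by default
predLabels = kfoldPredict(CVSVMModel);    % one held-out prediction per observation, covering all 10 folds
cvAccuracy = mean(strcmp(predLabels, Y))  % cross-validated accuracy (complements the averaged misclassification rate)
confusionmat(Y, predLabels)               % confusion matrix pooled over all 10 folds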
Hope you find this information helpful!