kfoldLoss and regression machine learning models like fitrsvm

Hello,
I want to calculate the cross-validation loss of my different regression machine learning models to compare them with each other. Therefore I want to use kfoldLoss, but I'm getting an error.
My code looks as follows:
% split the data into training and test sets
[m,n] = size(Daten);
P = 0.8;
Training = Daten(1:round(P*m),:);
Testing = Daten(round(P*m)+1:end,:);
XTrain = Training(:,1:n-1);
YTrain = Training(:,n);
XTest = Testing(:,1:n-1);
YTest = Testing(:,n);
% hyperparameter optimization
rng default
c = cvpartition(YTrain,'KFold',10);
Mdl = fitrsvm(XTrain,YTrain,'KernelFunction','gaussian','OptimizeHyperparameters','Epsilon',...
    'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName',...
    'expected-improvement-plus','cvpartition',c));
L = kfoldLoss(Mdl)
I want to use this code structure for other regression functions such as fitrtree within the Bayesian optimization workflow. Why does kfoldLoss not work for this code?
Best regards, Dimitri

Answers (1)

Don Mathis on 7 Nov 2018
When you call fitrsvm with 'OptimizeHyperparameters', the result is a single SVM model, not a partitioned model with a kfoldLoss method. To get an estimate of the out-of-sample loss of your final model, you'll need to run the crossval function on it and then call kfoldLoss on the result of that:
pm = crossval(Mdl, 'cvpartition', c)
kfoldLoss(pm)
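To see why the original call errors, you can inspect the class of the fitted object (a quick check, assuming the Mdl and pm from the code above):
class(Mdl)   % 'RegressionSVM' -- has no kfoldLoss method
class(pm)    % 'RegressionPartitionedSVM' -- kfoldLoss works on this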
Alternatively, since there's no need to reuse the cvpartition you used for the optimization, you can let crossval create a new 10-fold partition:
pm = crossval(Mdl, 'KFold', 10)
kfoldLoss(pm)
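To compare several model types, the same fit/crossval/kfoldLoss pattern can be repeated for each regression function. Below is a minimal sketch, assuming XTrain and YTrain are defined as in the question; the variable names MdlSVM, MdlTree, LSVM, LTree and the 'auto' hyperparameter choice for fitrtree are illustrative, not from the original post:
% shared partition for the hyperparameter optimization (non-stratified, by number of observations)
rng default
c = cvpartition(size(XTrain,1),'KFold',10);

% SVM regression with hyperparameter optimization
MdlSVM = fitrsvm(XTrain,YTrain,'KernelFunction','gaussian',...
    'OptimizeHyperparameters','Epsilon',...
    'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName',...
    'expected-improvement-plus','CVPartition',c));
LSVM = kfoldLoss(crossval(MdlSVM,'KFold',10))   % cross-validate the returned model

% regression tree with hyperparameter optimization
MdlTree = fitrtree(XTrain,YTrain,'OptimizeHyperparameters','auto',...
    'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName',...
    'expected-improvement-plus','CVPartition',c));
LTree = kfoldLoss(crossval(MdlTree,'KFold',10))
The two kfoldLoss values (mean squared error by default) can then be compared directly to pick the better-performing model type.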

Release

R2018b
