Regarding Bayesian optimization for support vector regression

Hello,
I am using support vector regression (SVR) to develop a model, and to reduce the cross-validation loss on my data set I am using Bayesian optimization. I am using automatic hyperparameter tuning, minimizing the 5-fold and 10-fold cross-validation loss, and I have tried it with and without the optimization technique. I get a good model after optimizing the parameters, and when I use the tuned parameters to train on the complete data set, I get good results when I validate with the test data set. But when I check the corresponding cross-validation loss for this final trained model, it does not match the loss obtained during optimization: the difference between the two losses is very large for 5 folds. When I repeat this for 7, 9, and 10 folds I get lower CV losses, but these also do not match the losses obtained from the automatic hyperparameter optimization that minimizes the five-fold cross-validation loss.
Can anyone please help me with this? If required, I will share the code and functions I am using. How should I select the best model, and are the steps I followed right or wrong? The steps I followed to develop the model are given below.
1) I trained the model without partitioning the data and later used the same data set for validation. During optimization I also minimized the 5-fold and 10-fold cross-validation loss on the complete data set.
2) First I partitioned the data set into training and testing sets; I used the training set to develop the model, with and without hyperparameter optimization, and then used the testing set to validate the model. In this case, 5-fold / 10-fold cross-validation without optimization gave a high error. After Bayesian hyperparameter tuning, I trained the model on the complete training set with the tuned parameters and tested it with the testing data set, but when I cross-validated this final model I again got large errors. I followed both step 1 and step 2 for 2 to 10 folds, and in each case I got high losses after optimizing the respective model; however, comparing the models from the two sections above, the 7-, 9-, and 10-fold cross-validation losses were reduced when I used the tuned hyperparameters. (A sketch of the second workflow is given after this list.)
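For reference, here is a minimal MATLAB sketch of the second workflow. The variable names are placeholders (X is my predictor matrix, Y the response vector), and fitrsvm's built-in Bayesian optimization is assumed:

rng(1);                                       % for reproducibility
cvp = cvpartition(numel(Y), 'HoldOut', 0.2);  % 80/20 train/test split
Xtr = X(training(cvp), :);  Ytr = Y(training(cvp));
Xte = X(test(cvp), :);      Yte = Y(test(cvp));

% Bayesian optimization of the SVR hyperparameters, minimizing the
% 5-fold cross-validation loss on the training set only.
mdl = fitrsvm(Xtr, Ytr, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct('KFold', 5));

% fitrsvm returns the model refit on all of Xtr with the best
% hyperparameters; validate it on the held-out test set.
testMSE = loss(mdl, Xte, Yte);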

Answers (1)

Don Mathis on 4 Sep 2018
"...when i check the respective cross validation loss for this final trained model it is not matching with the loss obtained during optimization."
I suspect that this occurs because the objective function used during hyperparameter optimization in regression is Obj = log(1 + Loss). See the documentation here. The objective function reported during optimization will therefore be smaller than the loss. The software does this because it usually makes it easier for the Gaussian process models to fit the observed points.
To convert the objective function value back to the loss, use Loss = exp(Obj) - 1.
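For example, assuming the model was fit with 'OptimizeHyperparameters' (so it carries a BayesianOptimization object in its HyperparameterOptimizationResults property), the conversion is:

results = mdl.HyperparameterOptimizationResults;  % BayesianOptimization object
obj     = results.MinObjective;                   % reported as log(1 + CV loss)
cvLoss  = exp(obj) - 1;                           % the actual cross-validation loss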
Does that explain what you're observing?
2 Comments
sanjeev kumar T M on 9 Sep 2018
Thanks Don Mathis,
To find the best-fit model, I first used the optimization technique to find the hyperparameters, then used the obtained hyperparameters to train the model on the complete training set, and used the test set to validate the model. Finally, I cross-validated the final model again, but this time I did not use the same random partition I had used for tuning the hyperparameters; instead I used the same number of folds k (which selects a different random partition) as was used for minimizing the cross-validation loss, and computed the CV losses. Based on these losses I selected the model if the losses were low; otherwise I tried different numbers of folds. Is this the right method to select the best-fit model for our application? If I cross-validate the final model using the same data partition I used during optimization, I get the same losses I got during optimization (see the sketch below).
Please help us with this.
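In other words, fixing one cvpartition object and passing it both to the optimizer and to the final cross-validation makes the two losses directly comparable. A minimal sketch, with the same placeholder names Xtr and Ytr as above:

rng(1);                                     % reproducible optimization run
cvp = cvpartition(numel(Ytr), 'KFold', 5);  % one fixed 5-fold partition

mdl = fitrsvm(Xtr, Ytr, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct('CVPartition', cvp));

cvmdl  = crossval(mdl, 'CVPartition', cvp);  % same folds as the optimizer
cvLoss = kfoldLoss(cvmdl);                   % should match exp(MinObjective) - 1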
Don Mathis on 10 Sep 2018
I don't understand what you mean. It would help if you posted complete reproduction steps.
