Objective function in the Bayesian optimization algorithm used by fitrsvm and fitrgp
Hello,
What is the mathematical objective function in the Bayesian optimization algorithm? The explanation says that functions like fitrsvm try to minimize log(1 + cross-validation loss), but what is the actual mathematical formula?
Is it possible to change the objective function to just the MSE?
Thank you!
Dimitri
Accepted Answer
Don Mathis
on 13 May 2019
This page says that the loss defaults to MSE. So that is the loss used in the log(1 + cvloss) formula. The cross-validated loss is the loss computed over all of the held-out validation sets. The default when using optimization is 5-fold cross-validation.
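In other words, for a given set of hyperparameters the optimizer minimizes objective = log(1 + cvLoss), where cvLoss is the k-fold cross-validated loss (MSE by default for regression). A minimal sketch of what that quantity corresponds to for one fixed set of hyperparameters (the toy data and kernel settings below are illustrative, not the toolbox internals):

rng(0)
X = rand(100, 3);
Y = X*[1; -2; 0.5] + 0.1*randn(100, 1);              % toy regression data

mdl    = fitrsvm(X, Y, 'KernelFunction', 'gaussian'); % one fixed hyperparameter setting
cvmdl  = crossval(mdl, 'KFold', 5);                   % 5-fold partitioned model
cvloss = kfoldLoss(cvmdl);                            % cross-validated MSE
obj    = log1p(cvloss);                               % log(1 + cvloss), the reported objective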
There is no option to change the hyperparameter optimization objective function from log(1 + cvloss). You would need to edit the source code to do that. The source file is matlab\toolbox\stats\classreg\+classreg\+learning\+paramoptim\createObjFcn.m. Look for the call to the log1p function.
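As an alternative to editing toolbox source, one option is to call bayesopt directly with a custom objective that returns the raw cross-validated MSE, with no log1p transform. This is only a sketch under assumptions: the variable names, search ranges, and kernel settings below are illustrative, not the defaults that fitrsvm uses internally.

X = rand(100, 3);                                    % toy data, as in the sketch above
Y = X*[1; -2; 0.5] + 0.1*randn(100, 1);

box   = optimizableVariable('box',   [1e-3, 1e3], 'Transform', 'log');
sigma = optimizableVariable('sigma', [1e-3, 1e3], 'Transform', 'log');

% Objective: plain 5-fold cross-validated MSE (no log1p)
objFcn = @(p) kfoldLoss(crossval( ...
    fitrsvm(X, Y, 'KernelFunction', 'gaussian', ...
            'BoxConstraint', p.box, 'KernelScale', p.sigma), ...
    'KFold', 5));

results = bayesopt(objFcn, [box, sigma], 'MaxObjectiveEvaluations', 30);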
Comments
Don Mathis
on 14 May 2019 (edited 14 May 2019)
Because loss(Mdl,X,Y) is the loss of the final model on the full dataset, while MinObjective is the log of 1 plus the out-of-sample cross-validated loss. See the kfoldLoss method for documentation of that. If you used 5-fold cross-validation, the kfoldLoss is the combined loss of 5 different models, each evaluated on the fifth of the data it was not trained on. It is not the loss of the final model on the full dataset.
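A short sketch of that distinction, using illustrative data and settings (the particular values are not meaningful):

X = rand(100, 3);                                    % toy data
Y = X*[1; -2; 0.5] + 0.1*randn(100, 1);

Mdl       = fitrsvm(X, Y, 'KernelFunction', 'gaussian');
inSample  = loss(Mdl, X, Y);           % final model evaluated on the full dataset
cvMdl     = crossval(Mdl, 'KFold', 5);
outSample = kfoldLoss(cvMdl);          % 5 fold-models evaluated on their held-out folds
% The optimizer's MinObjective corresponds to log1p(outSample), not to inSample,
% so the two numbers will generally differ.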
antlhem
on 29 May 2021
Hi, could you take a look at my question? https://uk.mathworks.com/matlabcentral/answers/842800-why-matlab-svr-is-not-working-for-exponential-data-and-works-well-with-data-that-fluctuates?s_tid=prof_contriblnk