MATLAB Answers

How to extract hyperparameters during Bayesian optimization

Asked by Fab on 1 Mar 2019
Latest activity: commented on by Don Mathis on 4 Mar 2019
Hi all,
I am new to the bayesopt MATLAB function and was trying it out on a toy problem.
I realize that bayesopt uses the "ardmatern52" kernel function, which allows a different length scale for each hyperparameter, and I would like access to the kernel parameter estimates produced during the Bayesian optimization. In my understanding, these estimates are produced after each evaluation by the fitrgp function; however, they seem to get lost and become inaccessible once the call to bayesopt returns. Any idea how to access these estimates at the end of a Bayesian optimization?
Currently working with R2016b
Thanks,
Fab



1 Answer

Answer by Don Mathis on 1 Mar 2019
 Accepted Answer

There are many hidden properties in the BayesianOptimization object that is returned by bayesopt. One of them is ObjectiveFcnGP, which is the last Gaussian Process model that was fit to the observed function evaluation data. Here's an example of how to get the kernel parameters from that model (using R2018b):
Run bayesopt:
load ionosphere
rng default
num = optimizableVariable('n',[1,30],'Type','integer');
dst = optimizableVariable('dst',{'chebychev','euclidean','minkowski'},'Type','categorical');
c = cvpartition(351,'Kfold',5);
fun = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x.n,...
    'Distance',char(x.dst),'NSMethod','exhaustive'));
results = bayesopt(fun,[num,dst],'Verbose',0,...
    'AcquisitionFunctionName','expected-improvement-plus')
Get the final GP model:
gp = results.ObjectiveFcnGP
Get the kernel parameters from that:
gp.KernelInformation % look at kernel information
kparams = gp.KernelInformation.KernelParameters
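If you want to see which length scale belongs to which optimization variable, the kernel information also carries parameter names. A minimal sketch (assuming the usual fitrgp ARD parameterization, where KernelParameters holds one length scale per variable followed by the signal standard deviation):
names = gp.KernelInformation.KernelParameterNames;  % parameter names, in the same order as kparams
table(names, kparams)                               % pair each name with its estimated value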
You can see all the properties (hidden, private or otherwise) by doing this:
s = struct(results)
Then you can access what you want.
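For example, a minimal sketch building on that struct conversion (field names other than ObjectiveFcnGP are not shown above and may vary by release):
s = struct(results);                 % exposes hidden/private properties as struct fields
fieldnames(s)                        % list everything that is available
s.ObjectiveFcnGP.KernelInformation   % the same kernel information, reached through the struct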

  2 Comments

Thanks, that was very helpful. I did not know about hidden properties!
However, with your method I can only get the final estimate of the kernel parameters, in the variable kparams that you defined above.
How could one go about getting the estimate of each parameter at each iteration?
The 'OutputFcn' argument of bayesopt takes a function handle, and it is called after every iteration. Its argument is the BayesianOptimization object at that iteration. You could, for example, have it append the current kernel parameter estimates to a global variable each time it is called.
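A minimal sketch of that idea using bayesopt's 'OutputFcn' option. The names recordKernelParams and kernelHistory are just placeholders, and ObjectiveFcnGP is the same hidden property used in the answer above, so this assumes a release where that property exists:
global kernelHistory                  % will collect one kernel-parameter vector per iteration
kernelHistory = {};
results = bayesopt(fun,[num,dst],'Verbose',0,...
    'AcquisitionFunctionName','expected-improvement-plus',...
    'OutputFcn',@recordKernelParams);

function stop = recordKernelParams(results, state)
% Called by bayesopt after every iteration with the current
% BayesianOptimization object and a state string.
global kernelHistory
stop = false;                         % never request early termination
if strcmp(state,'iteration')
    gp = results.ObjectiveFcnGP;      % hidden property, as in the answer above
    if ~isempty(gp)
        kernelHistory{end+1} = gp.KernelInformation.KernelParameters;
    end
end
end
If you run this from the command prompt rather than a script, put recordKernelParams in its own file (recordKernelParams.m).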
