MATLAB Answers

How to extract hyperparameters during Bayesian optimization

Fab on 1 Mar 2019
Commented: Don Mathis on 4 Mar 2019
Hi all,
I am new to using the bayesopt MATLAB function and I was trying to test it on a toy problem.
I realize that bayesopt uses the 'ardmatern52' kernel function, which allows a different length scale for each optimizable variable, and I wanted access to the hyperparameter estimates produced during the Bayesian optimization. As I understand it, these estimates are produced after each evaluation by the fitrgp function; however, they seem to somehow get lost and become inaccessible when bayesopt is called. Any idea how to access these estimates at the end of a Bayesian optimization?
Currently working with R2016b



Accepted Answer

Don Mathis on 1 Mar 2019
There are many hidden properties in the BayesianOptimization object that is returned by bayesopt. One of them is ObjectiveFcnGP, which is the last Gaussian Process model that was fit to the observed function evaluation data. Here's an example of how to get the kernel parameters from that model (using R2018b):
Run bayesopt:
load ionosphere
rng default
num = optimizableVariable('n',[1,30],'Type','integer');
dst = optimizableVariable('dst',{'chebychev','euclidean','minkowski'},'Type','categorical');
c = cvpartition(351,'Kfold',5);
fun = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x.n,...
    'Distance',char(x.dst),'NSMethod','exhaustive'));
results = bayesopt(fun,[num,dst],'Verbose',0,...
    'AcquisitionFunctionName','expected-improvement-plus');
Get the final GP model:
gp = results.ObjectiveFcnGP
Get the kernel parameters from that:
gp.KernelInformation % look at kernel information
kparams = gp.KernelInformation.KernelParameters
You can see all the properties (hidden, private or otherwise) by doing this:
s = struct(results)
Then you can access what you want.
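For instance, a sketch of relating the kernel parameters back to the model (assuming the run above; for the 'ardmatern52' kernel, KernelParameters holds one length scale per predictor followed by the signal standard deviation — note a categorical variable may be encoded into more than one predictor, so the number of length scales need not equal the number of optimizableVariables):

```matlab
gp = results.ObjectiveFcnGP;                        % last fitted GP model
names = gp.KernelInformation.KernelParameterNames   % e.g. 'LengthScale1', ..., 'SigmaF'
kparams = gp.KernelInformation.KernelParameters;
lengthScales = kparams(1:end-1)                     % one ARD length scale per predictor
sigmaF = kparams(end)                               % signal standard deviation
```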


Fab on 2 Mar 2019
Thanks, that was very helpful. I did not know about hidden properties!
However, with your method I can only get the final estimate of the kernel parameters, in the kparams variable you defined above.
How could one go about getting the estimate of each parameter at each iteration?
Don Mathis on 4 Mar 2019
Take a look at the 'OutputFcn' name-value argument of bayesopt. It is a function handle, and it is called after every iteration; its arguments are the BayesianOptimization object at that iteration and a state string. You could, for example, have it append to a global variable each time it is called.
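A minimal sketch of that approach (the names recordKernelParams and KernelHistory are illustrative; this assumes an output function with the signature stop = fun(results,state), where state is 'initial', 'iteration', or 'done'):

```matlab
function demoKernelHistory
    % Record the GP kernel parameter estimates after every bayesopt iteration.
    global KernelHistory
    KernelHistory = {};                           % one cell per completed iteration
    load ionosphere
    c = cvpartition(351,'Kfold',5);
    num = optimizableVariable('n',[1,30],'Type','integer');
    fun = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x.n));
    bayesopt(fun,num,'Verbose',0,'OutputFcn',@recordKernelParams);
    celldisp(KernelHistory)                       % inspect the per-iteration estimates
end

function stop = recordKernelParams(results,state)
    % Called by bayesopt with the current BayesianOptimization object
    % and a state string ('initial', 'iteration', or 'done').
    global KernelHistory
    stop = false;                                 % never request early termination
    if strcmp(state,'iteration')
        gp = results.ObjectiveFcnGP;              % hidden property: latest GP model
        if ~isempty(gp)
            KernelHistory{end+1} = gp.KernelInformation.KernelParameters;
        end
    end
end
```

After the run, KernelHistory holds one kernel-parameter vector per iteration; the guard on state skips the initial call, before any GP has been fit.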

