fitrgp: Custom kernel Function

18 views (last 30 days)
Abhinav Gupta on 1 Mar 2022
Answered: Vatsal on 20 Oct 2023
I want to fit a Gaussian process to a dataset using a custom kernel. My kernel has one parameter 'beta' that I want to optimize. The problem is that 'beta' does not change at all during optimization; it stays at its initial value. The code I am using is as follows:
gprMdl = fitrgp(Xt, yt, 'KernelFunction', @(Xm,Xn,theta)CovKer(Xm,Xn,theta,Ds), 'KernelParameters', 100, ...
    'BasisFunction', 'none', 'OptimizeHyperparameters', 'auto', 'FitMethod', 'sd', ...
    'Standardize', 0, 'ActiveSetSize', 1000, 'HyperparameterOptimizationOptions', struct('UseParallel', true));
In the example above, the initial value of the kernel parameter is set to 100, and it stays at 100. I have tried varying the initial value, but without success. It may be that I am specifying the name-value pairs of 'fitrgp' incorrectly.
Please let me know what is wrong here.
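For context, the fitrgp documentation requires a custom kernel handle of the form kfcn(XN, XM, theta), where theta is a vector of *unconstrained* parameters and the return value is an N-by-M covariance matrix. A minimal sketch of such a kernel is below; the squared-exponential form, the log-parameterization of 'beta', and the role of Ds are assumptions for illustration, not the asker's actual CovKer:

```matlab
% Sketch of a custom kernel with one parameter 'beta' (illustrative only).
% fitrgp treats theta as unconstrained, so a common convention is to
% store log(beta) in theta and exponentiate inside the kernel; passing
% 'KernelParameters', 100 then means beta = exp(100), not beta = 100.
function K = CovKer(Xm, Xn, theta, Ds)
    beta = exp(theta(1));           % recover a positive beta from unconstrained theta
    D2   = pdist2(Xm, Xn).^2;       % pairwise squared Euclidean distances
    K    = exp(-D2 / (2*beta^2));   % e.g. a squared-exponential covariance
    % Ds (from the asker's workspace) presumably scales or masks entries;
    % its role depends on the model and is left out of this sketch.
end
```

If the kernel instead uses theta(1) directly as beta, make sure the initial 'KernelParameters' value is on that same scale; a mismatch between the parameterization inside the kernel and the initial value passed to fitrgp is a common reason a parameter appears frozen.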

Answers (1)

Vatsal on 20 Oct 2023
I understand that you are using the 'fitrgp' function and are having difficulty optimizing the 'beta' parameter of the custom kernel in your Gaussian process regression model. A few suggestions that may help:
  1. The initial value you set for the kernel parameter can affect the optimization. If it is too far from the optimal value, the optimizer may get stuck in a local optimum. It is worth trying different initial values to see if that helps.
  2. You can have control over the optimization process by setting options in the 'HyperparameterOptimizationOptions' name-value pair argument. For instance, you can increase the number of iterations or function evaluations, or even consider changing the optimizer to 'bayesopt'.
  3. It is crucial to ensure that your custom kernel function is correctly defined and differentiable, as the optimization algorithm relies on computing gradients.
  4. Another consideration is setting 'Standardize' to true. This can sometimes enhance the numerical stability of the optimization process.
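The suggestions above can be sketched as follows. The option values are illustrative, and the initial 'KernelParameters' is written as log(100) on the assumption that the custom kernel exponentiates theta internally; adjust to match your parameterization. Note also that, per the fitrgp documentation, 'OptimizeHyperparameters' drives the bayesopt search over named hyperparameters such as Sigma, while custom kernel parameters theta are fit by the internal gradient-based maximization of the marginal log likelihood during fitting, so a kernel whose gradient with respect to theta is zero (or that ignores theta) will leave theta unchanged:

```matlab
% Sketch: tightening the hyperparameter search (values are illustrative).
opts = struct('Optimizer', 'bayesopt', ...        % Bayesian optimization (the default)
              'MaxObjectiveEvaluations', 60, ...  % more evaluations than the default 30
              'UseParallel', true);

gprMdl = fitrgp(Xt, yt, ...
    'KernelFunction', @(Xm,Xn,theta) CovKer(Xm,Xn,theta,Ds), ...
    'KernelParameters', log(100), ...             % initial theta on the unconstrained scale
    'BasisFunction', 'none', 'FitMethod', 'sd', ...
    'Standardize', true, ...                      % can improve numerical stability
    'ActiveSetSize', 1000, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', opts);
```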
To learn more about the usage and syntax of 'fitrgp', you may refer to the MathWorks documentation for the function.
I hope this helps!



