How to force enable GPU usage in fitrgp
When I use the Regression Learner app and select the 'Use Parallel' option for training, I can see my NVIDIA GPU (compute capability 7.2) being used.
But when I generate a function from the app and run it from a script, the GPU is not used. Is there something I can set in the script so that it uses the GPU?
I tried gpuArray and tall arrays, and neither is supported by fitrgp.
regressionGP = fitrgp(X, Y, ...
    'BasisFunction', 'constant', ...
    'KernelFunction', 'exponential', ...
    'Standardize', true, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct( ...
        'Verbose', 1, ...
        'UseParallel', true));
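One note on reproducing the app's behavior from a script: the 'Use Parallel' option relies on a parallel pool, and a script only gets one if it opens one (or if automatic pool creation is enabled in the parallel preferences). A minimal sketch, assuming Parallel Computing Toolbox is installed and X and Y are in-memory numeric matrices:
% Open a pool of CPU workers if one is not already running; the
% 'UseParallel' hyperparameter optimization option then uses this pool.
if isempty(gcp('nocreate'))
    parpool;   % workers from the default cluster profile
end
% ... then call fitrgp with 'UseParallel',true exactly as above; the
% optimization iterations are distributed across the pool workers.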
3 Comments
Walter Roberson
on 8 Apr 2023
In MATLAB Answers, each user can communicate in whatever language they feel most comfortable with. If a reader has difficulty understanding, the reader can ask for clarification of particular parts, or move on to other questions.
There is no requirement that people post in English, and if they do post in English it is fine if they used a machine translation that might get words, capitalization, or contractions wrong compared to "perfect" English. We are here for MathWorks products, not for complaining about typographic mistakes.
Accepted Answer
Ive J
on 7 Apr 2023
fitrgp does not [yet] support GPU arrays. You can scroll down any function's doc page and check the "Extended Capabilities" section to see what input types it supports. 'UseParallel', as the name suggests, invokes parallel computation on the workers of a parallel pool; it does not move the training onto the GPU.
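To spell that out with a minimal sketch (assuming X and Y might arrive as gpuArray data from some earlier GPU-based preprocessing step): the supported route is to bring the data back to the CPU before calling fitrgp.
% fitrgp accepts only in-memory CPU arrays, so gather any GPU data first.
if isa(X, 'gpuArray'), X = gather(X); end
if isa(Y, 'gpuArray'), Y = gather(Y); end
mdl = fitrgp(X, Y, 'Standardize', true);   % trains on the CPU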
4 Comments
More Answers (0)