How to use a GPU for deep learning

8 views (last 30 days)
Alexey Kozhakin
Alexey Kozhakin on 1 Oct 2022
Answered: KSSV on 1 Oct 2022
I'm training a YOLOv4 detection model in MATLAB. I just got a computer with a graphics card, an NVIDIA GeForce RTX 3070 Ti, and I want to get the maximum out of it. Please help me: what do I need to write in my MATLAB code to perform training on the GPU?

Answers (1)

KSSV
KSSV on 1 Oct 2022
Check trainingOptions; it has a name-value option, 'ExecutionEnvironment', that specifies the hardware resource used for training.
Example:
options = trainingOptions('sgdm', ...
'Momentum',0.9, ...
'InitialLearnRate',initLearningRate, ...
'LearnRateSchedule','piecewise', ...
'LearnRateDropPeriod',learningDropPeriod, ...
'LearnRateDropFactor',learningRateFactor, ...
'L2Regularization',l2reg, ...
'MaxEpochs',maxEpochs, ...
'ValidationData',{inputVal, targetVal}, ...
'ValidationFrequency',50, ...
'ValidationPatience',10, ...
'Shuffle','every-epoch', ...
'MiniBatchSize',miniBatchSize, ...
'GradientThresholdMethod','l2norm', ...
'GradientThreshold',0.01, ...
'Plots','training-progress', ...
'ExecutionEnvironment','auto', ... % <------ check this. Keep it 'auto' so MATLAB picks the best device (it uses the GPU when one is available)
'Verbose',true);
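
If you want to be certain training runs on the GPU rather than leaving the choice to MATLAB, a minimal sketch along these lines should work. It assumes Parallel Computing Toolbox is installed (required for GPU training) and uses the Computer Vision Toolbox function trainYOLOv4ObjectDetector; the variables trainingData, detector, miniBatchSize, and maxEpochs are placeholders standing in for your own data and settings, not names from the original post:

% Verify MATLAB can see the RTX 3070 Ti (requires Parallel Computing Toolbox).
% This prints the device name, memory, and compute capability.
gpuDevice

% Force GPU execution instead of letting MATLAB decide.
% 'gpu' errors out if no supported GPU is found, so failures are explicit.
options = trainingOptions('sgdm', ...
'MiniBatchSize',miniBatchSize, ...
'MaxEpochs',maxEpochs, ...
'ExecutionEnvironment','gpu');

% Train the YOLO v4 detector with these options
% (trainYOLOv4ObjectDetector is in Computer Vision Toolbox, R2022a and later).
[detector,info] = trainYOLOv4ObjectDetector(trainingData,detector,options);

If the GPU runs out of memory during training, lowering 'MiniBatchSize' is usually the first setting to adjust.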

Release

R2022a
