How to use GPU for deep learning
I'm training a YOLOv4 detection model in MATLAB. I just got a computer with a graphics card, an NVIDIA GeForce RTX 3070 Ti, and I want to get the most out of it. What do I need to write in my MATLAB code to perform training on the GPU?
Answers (1)
KSSV
on 1 Oct 2022
Example:
options = trainingOptions('sgdm', ...
    'Momentum',0.9, ...
    'InitialLearnRate',initLearningRate, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',learningDropPeriod, ...
    'LearnRateDropFactor',learningRateFactor, ...
    'L2Regularization',l2reg, ...
    'MaxEpochs',maxEpochs, ...
    'ValidationData',{inputVal, targetVal}, ...
    'ValidationFrequency',50, ...
    'ValidationPatience',10, ...
    'Shuffle','every-epoch', ...
    'MiniBatchSize',miniBatchSize, ...
    'GradientThresholdMethod','l2norm', ...
    'GradientThreshold',0.01, ...
    'Plots','training-progress', ...
    'ExecutionEnvironment','auto', ... % <------ check this. Keep it 'auto' so MATLAB picks the GPU when one is available; use 'gpu' to force it
    'Verbose',true);
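Before relying on `'ExecutionEnvironment','auto'`, it is worth confirming that MATLAB actually detects the card. A minimal check, assuming the Parallel Computing Toolbox is installed (`gpuDevice` requires it; `canUseGPU` is available in base MATLAB from R2019b):

```matlab
% Verify MATLAB can see a supported GPU (e.g. the RTX 3070 Ti) before training
if canUseGPU
    g = gpuDevice;   % selects and returns the default GPU device
    fprintf('Using GPU: %s (%.1f GB memory)\n', g.Name, g.TotalMemory/1e9);
else
    warning('No supported GPU detected; training will fall back to the CPU.');
end
```

With `'ExecutionEnvironment','auto'` in the training options, the training function will then use that GPU automatically when one is present.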