Not using GPU for trainNetwork?
Hello,
I am looking to train a network using the trainNetwork command. I have set up the network, options, and data. I have installed the Parallel Computing Toolbox, and my GPU is an NVIDIA Quadro M1000M with compute capability 5.0, which should be sufficient per https://www.mathworks.com/help/parallel-computing/gpu-support-by-release.html. The MATLAB Deep Learning Onramp suggested that the GPU would be used automatically if I had the toolbox installed and my GPU was compatible. However, when I run trainNetwork() it does not use the GPU, and running gpuDeviceTable returns an empty table. Does this mean my GPU actually is not compatible, or is there some other way I can access it?
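One way to narrow this down is to check whether MATLAB can detect the GPU at all, independent of trainNetwork. A minimal diagnostic sketch (requires Parallel Computing Toolbox; the exact error message will vary by setup, but a driver or CUDA mismatch typically surfaces here):

```matlab
% Count the GPUs MATLAB can see at all.
gpuDeviceCount

% Try to select the default GPU; if this fails, the error message
% usually indicates a driver or CUDA compatibility problem.
try
    d = gpuDevice;
    fprintf('GPU: %s, compute capability %s\n', d.Name, d.ComputeCapability);
catch err
    disp(err.message)
end
```

If gpuDeviceCount returns 0, the issue is at the driver/CUDA level (or the GPU is genuinely unsupported by your MATLAB release), not with trainNetwork itself.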
Thank you
Accepted Answer
More Answers (1)
yanqi liu
on 23 Mar 2022
Yes, try running
>> gpuDevice
to check whether MATLAB can access your GPU. You can also set the execution environment explicitly in your training options with
'ExecutionEnvironment','gpu'
or
'ExecutionEnvironment','cpu'
to choose which device is used for training.
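As a concrete sketch of the suggestion above, 'ExecutionEnvironment' can be set directly in trainingOptions. With 'gpu', training errors out if no usable GPU is found instead of silently falling back, which makes the failure visible (the solver, epoch count, and the XTrain/YTrain/layers names below are placeholders, not from the original question):

```matlab
% Request the GPU explicitly; trainNetwork will raise an error
% if no compatible GPU is available, rather than using the CPU.
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment','gpu', ...
    'MaxEpochs',10, ...
    'Verbose',true);

% net = trainNetwork(XTrain, YTrain, layers, options);
```

The default value, 'auto', uses a GPU only if one is detected, which is why a driver problem can make trainNetwork quietly run on the CPU.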