Not using GPU for trainNetwork?
I am looking to train a network using the trainNetwork command. I have set up the network, options, and data. I have installed the Parallel Computing Toolbox, and my GPU is an NVIDIA Quadro M1000M with compute capability 5.0 (which should be sufficient per https://www.mathworks.com/help/parallel-computing/gpu-support-by-release.html). The MATLAB Deep Learning Onramp suggested the GPU would be used automatically if I had the toolbox and a compatible GPU. However, trainNetwork() does not use the GPU, and running gpuDeviceTable returns nothing. Does this mean my GPU is actually not compatible, or is there some other way I can access it?
Joss Knight on 24 Mar 2022
Nearly always in cases like this you just need to install your GPU drivers from NVIDIA: https://www.nvidia.co.uk/Download/driverResults.aspx/187247/en-uk
However with laptops there are sometimes also issues with ensuring that your system is allowing MATLAB to access the GPU, because power limitations often restrict GPU use to particular applications. You can try opening the NVIDIA Control Panel and making sure MATLAB is enabled under the "Manage 3D Settings -> Program Settings". Your system may have slightly different ways of managing this.
After you've done this, run gpuDevice and hopefully your device will appear.
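A quick way to verify the driver fix worked is to query the device from MATLAB. This is a minimal sketch using Parallel Computing Toolbox functions; the displayed values shown in the comments are what one would expect for this particular laptop, not guaranteed output:

```matlab
% List all NVIDIA GPUs MATLAB can detect; empty output usually means
% a missing or outdated driver, or the GPU is hidden from MATLAB.
gpuDeviceTable

% Select the default GPU; this errors if no usable device is found.
d = gpuDevice;
disp(d.Name)                % expected: 'Quadro M1000M'
disp(d.ComputeCapability)   % expected: '5.0', above the minimum required
```

If gpuDevice succeeds here, trainNetwork should pick the GPU up automatically.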
I should warn you that often the performance results on laptop GPUs (especially old ones like yours) are not that great compared to the CPU.
More Answers (1)
yanqi liu on 23 Mar 2022
Yes, sir. Maybe use gpuDevice to check that a GPU is available, or in the training options set 'ExecutionEnvironment' to choose the device type used for training.
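The answer above appears to refer to the 'ExecutionEnvironment' name-value argument of trainingOptions. A minimal sketch, in which the solver choice and epoch count are illustrative only:

```matlab
% Request GPU training explicitly instead of relying on 'auto'.
% Valid 'ExecutionEnvironment' values include 'auto' (default),
% 'cpu', 'gpu', 'multi-gpu', and 'parallel'.
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'gpu', ...  % errors if no usable GPU exists
    'MaxEpochs', 10);

% Then train as usual (layers and data as already defined):
% net = trainNetwork(XTrain, YTrain, layers, options);
```

Setting 'gpu' explicitly is useful for diagnosis: unlike 'auto', which silently falls back to the CPU, it raises an error when MATLAB cannot access a supported GPU.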