Is memory reduction not possible when training a neural network on a GPU?

When dealing with large datasets, the amount of temporary storage needed during neural network training can be reduced by adding the name-value pair 'reduction',N to the training command.
This is a neat feature and would be particularly useful when training nets on graphics cards, which generally have less memory than system RAM. But when I add the option to a training run on my GPU, the reduction seems to have no effect at all. It doesn't produce any warning or error message; it simply does not reduce the memory usage of my GPU.
Is this feature not available with GPU training, or do I need to do something different to make it work?
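For reference, the call I'm using looks roughly like this (the dataset and network below are just placeholders, not my actual data):
% Placeholder data and network, just to illustrate the call
[x, t] = simplefit_dataset;            % any example dataset; mine is much larger
net = feedforwardnet(10);
% Train on the GPU and request memory reduction
net = train(net, x, t, ...
    'useGPU', 'yes', ...               % run the calculations on the GPU
    'reduction', 4);                   % split temporary storage into N passes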
Thanks.

Answers (1)

Nick Hobbs on 5 Aug 2015
The documentation for the train function says that 'reduction' can help with memory only when the calculation is being done in MATLAB. It is possible that train is calling a MEX file, in which case 'reduction' will likely not make much difference. You can determine which is being used with 'showResources'. More information on 'showResources' can be found here.
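For example, a quick check might look like this (a sketch only; the network and data names are placeholders):
% Ask train to report which resource it uses for the calculation
net = feedforwardnet(10);
net = train(net, x, t, 'showResources', 'yes');                  % typically reports MATLAB or MEX on the CPU
net = train(net, x, t, 'useGPU', 'yes', 'showResources', 'yes'); % reports the GPU device when it is used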
  1 Comment
PetterS on 5 Aug 2015
Yes, I've seen that page. But when I do the training on my GPU, the resource is reported as neither MATLAB nor MEX; it is simply reported as "CUDA". I don't know whether CUDA calculations take place in MATLAB or in MEX, or whether they count as neither.



