Best way to integrate GPU use in my code?

6 views (last 30 days)
AlexRD
AlexRD on 18 May 2021
Commented: Infinite_king on 18 Apr 2024 at 9:06
I've started doing a lot of work on a neural net implementation I've built from scratch in MATLAB. Initially I switched from GPU to CPU-only, since the CPU version was easier to debug and write code for, and it let me focus on the GPU aspect later.
I am now on the GPU implementation part, but I'm struggling to get an optimized result. The GPU struggles a lot with multiple layers: processing time is often directly proportional to the number of layers. The CPU, on the other hand, doesn't really care about the number of layers (as long as the neuron counts aren't crazy high) but struggles a bit with the input layer, given the number of weights and biases there.
I've tried a hybrid approach, where the input layer and any convolutional layers are assigned to the GPU, and the GPU data is then fetched and processed by the CPU. But often the fetch time isn't worth the hassle.
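For context, the per-layer slowdown I'm describing looks like kernel-launch and host-device transfer overhead rather than arithmetic cost. A minimal sketch of what I mean by keeping the whole forward pass on the device (illustrative layer sizes, not the ones from my project): all weights and activations stay as gpuArray, and gather() is called exactly once at the end.

```matlab
% Sketch: fully-connected forward pass entirely on the GPU.
% Sizes are illustrative; the point is a single device-to-host copy.
sizes = [784 256 256 10];
W = cell(numel(sizes) - 1, 1);
for k = 1:numel(W)
    % Weights created directly on the device, no host round-trip
    W{k} = gpuArray.randn(sizes(k+1), sizes(k), 'single') * 0.01;
end
x = gpuArray.randn(sizes(1), 512, 'single');   % one batch of 512 inputs

a = x;
for k = 1:numel(W)
    a = max(W{k} * a, 0);                      % ReLU layer, stays on GPU
end
wait(gpuDevice);                               % sync before timing/fetching
y = gather(a);                                 % the only transfer to the CPU
```

Timed with `gputimeit`, the loop body itself is cheap; it's the per-layer `gather` in my hybrid version that dominates.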
Some feedback would be very welcome, and my project can be found here, fully documented: https://github.com/AlexRDX/Neural-Net
Or attached to this post. Any criticism at all is welcome.
Thank you for your time!

Answers (0)

Release

R2021a
