Using GPU, multiply a 3D matrix by a 2D matrix (slicewise)

Hi everyone,
I am trying to vectorize the code for my neural network so that it runs quickly on the GPU.
I need to multiply each slice of a 3D matrix X by a 2D matrix T.
X = 3D matrix.
T = 2D matrix.
Is there a good, fast way to do this on the GPU? I have seen it suggested to use repmat to build a 3D array by duplicating the 2D matrix many times, but that feels wasteful and inefficient.
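To be clear, the slicewise product I mean looks like this loop (the sizes here are just placeholders):
X = rand(10, 10, 4);          % 3D array, one slice per page
T = rand(10);                 % 2D matrix applied to every slice
Y = zeros(size(X));
for k = 1:size(X, 3)
    Y(:,:,k) = X(:,:,k) * T;  % one matrix product per slice
end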
Thanks!

Accepted Answer

Edric Ellis on 26 Jul 2016
In this case, you can use pagefun. For example:
X = rand(10, 10, 4, 'gpuArray');   % 3D array on the GPU, 4 pages of 10x10
T = rand(10, 'gpuArray');          % 2D matrix on the GPU
pagefun(@mtimes, X, T)             % multiply each page of X by T
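Here pagefun applies mtimes page by page, and the single-page input T is reused for every page of X, so there is no need to repmat it. A quick check of the result, continuing the example above:
Y = pagefun(@mtimes, X, T);
size(Y)                          % 10 10 4, one product per page
norm(Y(:,:,2) - X(:,:,2) * T)    % ~0, each page is an independent X(:,:,k)*T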
2 Comments
Brad Hesse on 27 Jul 2016
Oh my god! I am speechless! My entire forward/backward propagation algorithm worked on the very first try after completely rewriting it to be vectorized for GPU execution. This is at least a 40-50 fold speed improvement (my original code admittedly wasn't very well optimized for CPU execution).
I cannot believe how fast this is.
Thank you so much for your help, Edric. I had actually already tried using the pagefun function, but it failed, so I assumed it didn't work for a 3D matrix times a 2D matrix.


More Answers (0)
