Using GPU, multiply a 3D matrix by a 2D matrix (slicewise)
Brad Hesse
on 26 Jul 2016
Commented: Edric Ellis
on 27 Jul 2016
Hi everyone,
I am trying to vectorize the code for my neural network so that it executes quickly on the GPU.
I need to multiply each slice of a 3-D array X by a 2-D matrix T.
X = 3-D array.
T = 2-D matrix.
Is there a good, fast way to do this on the GPU? I have seen it suggested to use 'repmat' to expand the 2-D matrix into a 3-D array (duplicating it once per slice), but that feels wasteful and inefficient. For concreteness, the operation I want is written in loop form below.
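Here is the slicewise product as a plain loop (an illustrative sketch; Y and k are just placeholder names):
[m, n, p] = size(X);            % X is m-by-n-by-p
Y = zeros(m, size(T, 2), p);    % preallocate the result
for k = 1:p
    Y(:,:,k) = X(:,:,k) * T;    % multiply slice k of X by T
end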
Thanks!
0 Comments
Accepted Answer
Edric Ellis
on 26 Jul 2016
X = rand(10, 10, 4, 'gpuArray');   % example 3-D input: four 10-by-10 pages on the GPU
T = rand(10, 'gpuArray');          % a single 10-by-10 matrix on the GPU
Y = pagefun(@mtimes, X, T);        % multiplies each page X(:,:,k) by T
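Note that pagefun implicitly expands T across the pages of X (an input dimension of size 1 is virtually replicated to match the other inputs), so no repmat copy is ever materialized. As a quick sanity check against a direct product of one slice (illustrative only):
err = norm(gather(Y(:,:,3)) - gather(X(:,:,3)) * gather(T), 'fro')   % should be on the order of eps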
2 Comments
More Answers (0)