GPU support in Multilabel Example?

1 view (last 30 days)
Moritz Scherrmann on 19 Jun 2020
Hi all,
I am currently working with the following MATLAB example:
https://de.mathworks.com/help/deeplearning/ug/multilabel-text-classification-using-deep-learning.html
My problem is that even though the "canUseGPU" function returns true (I am working on a GeForce RTX 2080 Ti on my local machine),
the example does not seem to run on the GPU. However, the example states that it should run on the GPU (lines 181-183 in the live script):
% If training on a GPU, then convert data to gpuArray.
if (executionEnvironment == "auto" && canUseGPU) || executionEnvironment == "gpu"
    dlX = gpuArray(dlX);
end
When I run the code and start training, Task Manager shows no activity on the GPU.
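To double-check whether the conversion actually takes effect, one can inspect the underlying data of dlX right after the gpuArray call. This is only a minimal diagnostic sketch, assuming dlX is a dlarray at that point; extractdata strips the dlarray wrapper, so the class of the result shows whether the data lives on the GPU:

% Diagnostic sketch: check whether dlX is backed by a gpuArray.
% extractdata removes the dlarray wrapper and returns the raw data.
disp(class(extractdata(dlX)))  % 'gpuArray' if on the GPU, 'single'/'double' otherwise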
Furthermore, I saw that the following embedding function removes the gpuArray type from the data X (lines 387-397):
function Z = embedding(X, weights)
    % Reshape inputs into a vector.
    [N, T] = size(X, 2:3);
    X = reshape(X, N*T, 1);

    % Index into embedding matrix.
    Z = weights(:, X);

    % Reshape outputs by separating batch and sequence dimensions.
    Z = reshape(Z, [], N, T);
end
I assume this is what causes the computation to fall back to the CPU: if weights is a regular CPU array, then Z = weights(:, X) is a CPU array as well, even when X is a gpuArray.
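If that is the cause, then presumably the model parameters also need to be moved to the GPU, not just the data. A minimal sketch of what I have in mind, assuming parameters is the struct of dlarray weights used in the example (dlupdate applies a function to every learnable parameter in the struct):

% Sketch: move all learnable parameters to the GPU as well, so that
% indexing operations such as weights(:, X) stay on the GPU.
if (executionEnvironment == "auto" && canUseGPU) || executionEnvironment == "gpu"
    parameters = dlupdate(@gpuArray, parameters);
end

Would that be the right approach?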
Can anybody help me solve this issue?
Thank you very much!

Answers (0)
