GPU Memory for dlconv
Hi, why does MATLAB need 1 GB of GPU memory to train a network when the activation output of dlconv is 0.128 GB and the input is 0.01 GB? The number of learnable parameters is 16×5×5×3. I got this result by debugging the code with a breakpoint at the dlconv line: before executing dlconv I noted the GPU memory, then I stepped over (executed) dlconv and noted the GPU memory again. The difference in GPU memory is 1 GB. Is there a way to calculate how much GPU memory MATLAB will use during training? Best, David
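For illustration, a measurement along these lines might look as follows; the input size, filter layout, and bias here are assumptions for the sketch, not the actual network:
g = gpuDevice;
X = dlarray(gpuArray(rand(224,224,3,16,'single')),'SSCB');  % example input, sizes assumed
W = gpuArray(rand(5,5,3,16,'single'));                      % 16 filters of 5-by-5-by-3
b = 0;                                                      % scalar bias for simplicity
memBefore = g.AvailableMemory;
Y = dlconv(X,W,b);   % forward convolution
wait(g);             % make sure the GPU has finished before reading memory again
memAfter = g.AvailableMemory;
fprintf('GPU memory consumed by dlconv: %.2f GB\n',(memBefore - memAfter)/1e9);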
Answers (1)
R
on 14 Mar 2024
Hi David,
The GPU memory MATLAB uses during training is influenced by several factors: the size of the input data, the batch size, the number of learnable parameters, the memory needed for activations in the forward pass and for their gradients in the backward pass, and the specific operations the network performs.
Every function differs in the amount of working memory it needs to run, so there is no reliable way to estimate the requirement other than running the function and monitoring GPU memory from a separate process.
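As a sketch (assuming an NVIDIA GPU), memory can be watched either from the operating system with nvidia-smi or from within MATLAB between training iterations:
% From a separate terminal (outside MATLAB), refreshing every second:
%   nvidia-smi --query-gpu=memory.used --format=csv -l 1
% Or from within MATLAB, using the device object:
g = gpuDevice;
fprintf('GPU memory in use: %.2f GB\n',(g.TotalMemory - g.AvailableMemory)/1e9);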
The following MATLAB Answer illustrates a crude estimate of the GPU memory required to train a deep learning model:
Following that example may help you estimate the memory for dlconv.
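As a very rough, back-of-the-envelope sketch, you can add up the buffers you know about; the breakdown below uses the figures from the question and assumes single precision, and library workspace can add much more on top:
numParams     = 16*5*5*3;    % learnables from the question
bytesPerElem  = 4;           % single precision
paramMem      = numParams*bytesPerElem;
gradMem       = paramMem;    % gradients are the same size as the parameters
activationMem = 0.128e9;     % activation output reported in the question
inputMem      = 0.01e9;      % input reported in the question
lowerBound = paramMem + gradMem + activationMem + inputMem;
fprintf('Lower-bound estimate: %.3f GB\n',lowerBound/1e9);
% cuDNN workspace buffers and MATLAB's GPU memory caching are allocated on
% top of this, which is one likely reason the observed jump was about 1 GB.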