Parallel Job causing memory leak?
I have converted a script into a parallel job with a pretty simple outline. Roughly this:
---------------------
sched = findResource('scheduler', 'type', 'local');
set(sched, 'ClusterSize', 6);
for a = 1:nLoops
    job = createJob(sched);
    % create 6 tasks, start the scheduler with the script, etc.
    % create sub-directories for each
    % collect results
    % kill job
end
---------------------
I have a 4 core machine with multi-threading and 12 GB of memory. Every time I run this script it eats more and more memory until it crashes. Then I can't free the memory unless I restart the computer (even exiting MATLAB doesn't do it).
The script runs fine outside the loop on a single core; never any issue that way. I see others have recently run into something similar. Is there a known issue with the Parallel Computing Toolbox? Am I doing something wrong?
Thanks!
Chris
0 Comments
Answers (1)
Jason Ross
on 23 Aug 2011
When you kill the job, are you using destroy?
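If not, that may be where the memory is going: in the legacy distcomp API, each createJob keeps its job object and data around until the job is explicitly destroyed. A rough sketch of the loop with destroy at the end of each iteration (myTaskFcn, its inputs, and the single output are placeholders; the rest follows the outline in the question):
---------------------
sched = findResource('scheduler', 'type', 'local');
set(sched, 'ClusterSize', 6);
for a = 1:nLoops
    job = createJob(sched);
    % one task per worker (myTaskFcn and its inputs are placeholders)
    for t = 1:6
        createTask(job, @myTaskFcn, 1, {a, t});
    end
    submit(job);
    waitForState(job, 'finished');
    results = getAllOutputArguments(job);   % collect results
    % release the job and its data; without this, job objects and their
    % files accumulate on every pass through the loop
    destroy(job);
end
---------------------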
2 Comments
Jason Ross
on 23 Aug 2011
What's actually eating memory? If you look at Task Manager you'll see a few different MATLABs running. Do they continue to grow?
What version are you running?
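If it helps, you can also check from within MATLAB (Windows only) whether the worker processes persist and keep growing between iterations; this assumes the local workers show up as additional MATLAB.exe entries, which is how they normally appear:
---------------------
% List all MATLAB.exe processes with their memory usage;
% run between iterations to see whether the workers keep growing
system('tasklist /FI "IMAGENAME eq MATLAB.exe"');
---------------------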