
Logging parallel pool worker diaries to a single file

32 views (last 30 days)
I'm converting a procedural script into a parallel script. The existing script captures the command window output using diary, so each job's output was nicely lumped together. Now that I have tasks running in parallel, my log file is very disjointed. Not ideal.
Is there a way I can collect each worker's diary and send it to a DataQueue for storage in a single file? In pseudocode it would be:
parfor ...
    run task (including command window output)
    send diary content back to the client DataQueue for writing to the log file
end
Is there some method within my parfor loop of saying 'send(q, thisWorker.Diary)'?
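For reference, the basic worker-to-client DataQueue round trip looks like the hedged sketch below (q, logFile and appendToLog are placeholder names). It shows how send and afterEach can funnel arbitrary worker messages into one file on the client; what it does not provide is a way for a worker to read back its own diary text, which is what the question is really after.
% Hedged sketch: ship plain messages from workers into a single log file.
q = parallel.pool.DataQueue;
logFile = 'combined.log';                         % placeholder log file name
afterEach(q, @(msg) appendToLog(logFile, msg));   % runs on the client as data arrives
parfor i = 1:4
    send(q, sprintf('message from iteration %d', i));
end
% Hypothetical helper (end of script or its own file):
function appendToLog(logFile, msg)
    fid = fopen(logFile, 'a');
    fprintf(fid, '%s\n', msg);
    fclose(fid);
end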
  2 Comments
Ben Wetherill on 31 Jan 2018
Edited: Ben Wetherill on 31 Jan 2018
While cycling home I thought of a possible solution: each worker writes its own diary to a temporary file with a random name and passes that filename plus the job name back to the DataQueue. The client can then use this to append the file's content to the relevant overall log, deleting the temporary files as it goes.
Ben Wetherill on 1 Feb 2018
Edited: Ben Wetherill on 1 Feb 2018
Tried my idea above (see code below). It didn't work. The individual temp log files remained blank, so it seems that with parfor you can't get access to each worker's command window output. So I'm getting all the workers' log reports intermixed together in the main client window. Not helpful. :( Will try Edric's idea below.
parfor i = casesToRun
    % set up worker diary in a temporary file
    tempLogfile = [tempname '.log'];
    diary(tempLogfile);
    doStuff(...);
    % send temp diary file name and target log name to logQueue
    logData = {tempLogfile, logFileName};
    send(logQueue, logData);
end
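For context, the snippet doesn't show how logQueue is created or consumed on the client; a typical setup would be something like the hedged sketch below (appendWorkerLog is a hypothetical helper living at the end of the script or in its own file). The plumbing itself is fine; the failure described above is that diary does not capture worker output inside parfor, so the temporary files stay empty.
% Hedged sketch of the client-side plumbing implied by the snippet above.
logQueue = parallel.pool.DataQueue;
afterEach(logQueue, @appendWorkerLog);            % runs on the client per message

function appendWorkerLog(logData)
    % logData is the {tempLogfile, logFileName} pair sent by a worker.
    [tempLogfile, logFileName] = logData{:};
    fid = fopen(logFileName, 'a');
    fprintf(fid, '%s', fileread(tempLogfile));    % append the worker's report
    fclose(fid);
    delete(tempLogfile);                          % clean up the temporary file
end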


Accepted Answer

Edric Ellis on 1 Feb 2018
Instead of parfor, you could use parfeval to run stuff on the workers. This will require a bit of restructuring - you'll need to divide up your work into decent-sized chunks (this is something that parfor takes care of for you automatically). Then, you can use the Diary property of each parallel.Future instance.
For example:
% Request execution on a worker
f = parfeval(@() fprintf('worker diary text: %d\n', rand()), 0);
% Wait for the future to finish, then display its captured diary text
wait(f), f.Diary
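Scaling that up is straightforward; below is a hedged sketch (with placeholder names numChunks and combined.log, and a trivial stand-in task) that submits one future per chunk of work, then uses fetchNext so each Diary is read only after its future has finished, appending everything to a single log file.
% Hedged sketch: one future per chunk, diaries gathered into one log file.
p = gcp;                                          % current parallel pool
numChunks = 8;                                    % placeholder chunk count
for k = 1:numChunks
    f(k) = parfeval(p, @(n) fprintf('chunk %d done\n', n), 0, k);
end
fid = fopen('combined.log', 'a');                 % placeholder single log file
for k = 1:numChunks
    completedIdx = fetchNext(f);                  % index of whichever future finished next
    fprintf(fid, '%s', f(completedIdx).Diary);    % its diary is complete at this point
end
fclose(fid);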
  1 Comment
Ben Wetherill on 2 Feb 2018
Edited: Ben Wetherill on 2 Feb 2018
For interest, below is what I did. There seems to be an issue, though, with some of the logs not being saved. All cases are run and I have the results, but some of the diary logs are missing. I'll ask a separate question about this issue.
diary on;
for i = casesToRun
    % submit one future per test case
    f(i) = parfeval(p, @doStuff, 2, ...);
end
for i = casesToRun
    % collect results as the futures complete
    [completedIdx, status, xlRow] = fetchNext(f);
    writeWorkerLog(f(i).Diary, logFileName);
    xlswrite(TestCasesFile, status, 'Sheet1', xlRow);
end
diary off;
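One possible cause of the missing logs (a hedged guess, not confirmed in this thread): the collection loop reads f(i).Diary by loop index, while fetchNext completes futures in whatever order they finish, so a future that is still running can have its diary read before it is complete. A variant keyed on completedIdx would be:
for i = casesToRun
    [completedIdx, status, xlRow] = fetchNext(f);
    % read the diary of the future that actually just finished
    writeWorkerLog(f(completedIdx).Diary, logFileName);
    xlswrite(TestCasesFile, status, 'Sheet1', xlRow);
end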


More Answers (0)
