Too many .mat files when using Write Tall Table

I have a reasonably large tall table (16940800x5) that was created using vertcat on about 1600 columns of a matrix that I am stacking for better performance. However, when I try to write the new stacked tall table, it creates 1600 .mat files, which ends up crashing the program for some reason. Is there a way to reduce the number of snapshots that are created during the write process?

Accepted Answer

Rick Amos on 4 Mar 2019
As you've likely guessed, vertical concatenation of 1600 arrays results in at least 1600 files from tall/write. This happens because tall/vertcat is conservative: it assumes all input arguments are truly tall and avoids combining multiple inputs into the same partition (and therefore the same file). I'm afraid there isn't a direct way to reduce the number of files.
I am surprised this causes the program to crash. Would it be possible to hear some more details about this?
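For illustration, here is a minimal sketch of that behavior; the small in-memory arrays and the folder name 'exampleSnapshots' are just placeholders for this example:
tA = tall(rand(10, 5));          % two small in-memory tall arrays
tB = tall(rand(10, 5));
tC = vertcat(tA, tB);            % each vertcat input keeps its own partition
write('exampleSnapshots', tC);   % typically at least one output file per partition, so at least two here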
If the order of the data is not important and you are working with R2018b or later, there is an alternative: you can stack the data by interleaving rows with matlab.tall.transform:
tX = tall(..);   % the original wide tall array (placeholder)
tY = matlab.tall.transform(@reshapeToWidth5, tX);

function y = reshapeToWidth5(x)
% Stack consecutive 5-column groups of each block by interleaving rows,
% i.e. y = [x(1,1:5); x(1,6:10); ...; x(2,1:5); x(2,6:10); ...]
y = reshape(x', 5, [])';
end
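If you then write the transformed array, the number of output files follows the partitioning of tX rather than the 1600 concatenated pieces; the folder name 'stackedSnapshots' below is just an example:
write('stackedSnapshots', tY);   % one set of snapshot files per partition of tY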
1 Comment
Tylor Slay on 4 Mar 2019
I misspoke when I said it crashed the program; it actually threw an error stating that the directory location ran out of memory, which I didn't quite understand because I was working on a 4 TB hard drive, so that wasn't true. I had actually thought about reshaping the tall array, so thank you for your answer. I was unaware of matlab.tall.transform.


More Answers (0)
