Modifying large matrices without loading them completely into memory

4 views (last 30 days)
Hi,
I'm attempting to modify very large matrices (single precision, 50e3 x 50e3), which don't make sense to load fully into memory. What data-handling strategy would you recommend? Ideally, I would load a block of, say, 100x100, modify it, and write it back. My machine has an SSD connected via M.2, so disk access should be relatively fast (though of course nowhere near as fast as RAM). What do you suggest?
Thanks,
Moritz

Answers (2)

Stephen23
Stephen23 on 18 Jun 2015
Edited: Stephen23 on 18 Jun 2015
You should read TMW's own advice on working with big data:
In particular, you might find memmapfile to be of significant interest:
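A minimal sketch of the memory-mapping approach, assuming the matrix is stored as raw single-precision values in a flat binary file (the file name 'bigmat.bin' and the column-major 50e3 x 50e3 layout are assumptions for illustration):

```matlab
% Map the binary file as one large single-precision matrix.
% Only the pages actually touched are brought into RAM by the OS.
n = 50e3;
m = memmapfile('bigmat.bin', ...
    'Format', {'single', [n n], 'A'}, ...
    'Writable', true);

% Read a 100x100 block, modify it, and write it back in place.
rows = 1:100;
cols = 1:100;
block = m.Data.A(rows, cols);
m.Data.A(rows, cols) = 2 * block;   % example modification
```

Writes through m.Data.A go straight to the mapped file, so the block-by-block update pattern described in the question falls out naturally.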
  1 Comment
Walter Roberson
Walter Roberson on 18 Jun 2015
Or, instead of memmapfile, save the .mat file with -v7.3 and then use matfile objects to read in portions of the array.
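A sketch of the matfile route, assuming the array has been saved once in -v7.3 (HDF5-based) format; the file name 'bigmat.mat' and variable name 'A' are placeholders:

```matlab
% Open (or create) a v7.3 MAT-file for partial reads and writes.
mf = matfile('bigmat.mat', 'Writable', true);

% One-time preallocation on disk: assigning to the last element
% grows the variable to full size without holding it in RAM.
mf.A(50e3, 50e3) = single(0);

% Work on one 100x100 tile at a time.
tile = mf.A(1:100, 1:100);       % reads only this block from disk
mf.A(1:100, 1:100) = tile + 1;   % writes only this block back
```

Unlike memmapfile, this keeps the data in a standard MAT-file, at the cost of some HDF5 I/O overhead per block access.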

Sign in to comment.


Alessandro
Alessandro on 18 Jun 2015
Have you checked out the sparse command?
  1 Comment
Moritz
Moritz on 18 Jun 2015
Yes, I did. However, I believe that only helps if a considerable fraction of the elements are zero. In my case, fewer than 5% of the elements are zero.

Sign in to comment.

Categories

Find more on Large Files and Big Data in Help Center and File Exchange
