Strategy / process for working with large (300 x 300 x 10,000) matrices that don't fit in memory
Hey MATLAB community - could use your help understanding how to efficiently work with multiple large matrices.
Shape of the data
I have 10 matrices that measure 300 x 300 x 10,000 stored as single precision.
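For context, the raw footprint works out to roughly 3.35 GiB per array and about 33.5 GiB for all ten, well beyond the 16 GB of RAM mentioned below:

% Back-of-the-envelope memory footprint (single = 4 bytes per element)
bytesPerArray = 300 * 300 * 10000 * 4;
gibPerArray   = bytesPerArray / 2^30      % ~3.35 GiB per array
gibAllTen     = 10 * gibPerArray          % ~33.5 GiB for all ten arrays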
Current (probably incorrect) approach
I'm storing the 10 matrices in a structure S, where each matrix is assigned to a field (e.g., S.var1 is a 300 x 300 x 10,000 single-precision matrix).
I need to perform various operations on these matrices, ranging from simple (cropping and rotating) to complex (calculating correlations and covariance of certain points along all the dimensions).
Needless to say, I commonly run into memory errors on my system, and I've started to consider alternative approaches such as datastores, tall arrays, mapreduce, and distributed arrays.
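As a rough sketch of the datastore route (untested, and assuming each of the ten arrays were first re-saved to its own MAT-file under a variable named A in a hypothetical folder called arrays), something like this keeps only one ~3.4 GB array in memory at a time:

% Sketch: iterate over one array per MAT-file so only one array is in memory at once
% (the folder "arrays" and variable name "A" are hypothetical)
fds = fileDatastore('arrays/*.mat', 'ReadFcn', @(f) getfield(load(f), 'A'));
perArrayMean = [];
while hasdata(fds)
    A = read(fds);                          % load the next 300 x 300 x 10,000 array
    perArrayMean(end+1) = mean(A, 'all');   % example reduction; replace with real work
end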
It would be great if I could get anyone's guidance or perspective on the best strategy or process to work with these large matrices.
Thanks!
PS - for reference, my system has 16 GB of RAM and ~100 GB of free space on an SSD. I realize upgrading my computer may be the simplest option, but would like to avoid that if at all possible.
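One direction that seems compatible with these constraints is to re-save each array to its own -v7.3 MAT-file on the SSD and use matfile to read only the slices needed, so just a small slab is ever in RAM. A minimal sketch (file and variable names are hypothetical):

% Sketch: out-of-core access via matfile (-v7.3 files support partial loading)
A = zeros(300, 300, 10000, 'single');   % stand-in for S.var1 from the question
save('var1.mat', 'A', '-v7.3');         % write once to the SSD
clear A

m = matfile('var1.mat');                % nothing is read into memory yet
slab = m.A(:, :, 1:100);                % read only 100 slices (~36 MB)
slabCov = cov(reshape(slab, [], 100));  % e.g., covariance across those slices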
Answers (1)
Gaurav Garg
26 Feb 2021
Hi Sanford,
Along with the alternative approaches already mentioned, you can use the bigimage class to store information about a large TIFF image file and access the image data it contains without loading the whole file into memory. You can also use the blockproc function to process an image in distinct blocks.
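For instance, here is a minimal blockproc sketch on a single 300 x 300 slice (the block size and the rot90 operation are just placeholders); blockproc can also be given a TIFF file name directly, in which case only one block is read into memory at a time:

% Sketch: apply an operation block-by-block (requires Image Processing Toolbox)
slice = rand(300, 'single');                     % stand-in for one 300 x 300 slice
fun = @(block_struct) rot90(block_struct.data);  % placeholder per-block operation
out = blockproc(slice, [100 100], fun);          % processes nine 100 x 100 blocks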