
Maximum Array size limit

281 views (last 30 days)
lucksBi on 16 Mar 2017
Answered: Muniba Ashfaq on 11 Sep 2020
Hi, I am using R2016a and getting this error:
Requested 95319x84602 (60.1GB) array exceeds maximum array size preference. Creation of arrays greater than this limit may take a long time and cause MATLAB to become unresponsive. See array size limit or preference panel for more information.
I have already changed the maximum array size in Preferences to 10000 (the maximum allowed value). Is there any solution for this? Thanks in advance.
  2 Comments
KSSV on 16 Mar 2017
What is the necessity of loading such huge data at once?
lucksBi on 16 Mar 2017
I am working on a dataset of size 841372x3 and need to perform some calculations on it.


Answers (4)

Adam on 16 Mar 2017
Any solution for what? The fact that you want a 60 GB array in memory? Do you even have 60 GB of RAM? What do you want to do with the array? Even if you could find enough memory for it, any calculations done on it would likely require still more memory.
doc matfile
can be used to create arrays too large to fit into memory, though, if you are sure you really need an array this big.
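A minimal sketch of the matfile approach, assuming the array sizes from the question and an illustrative file name; the scaling operation is just a placeholder for whatever calculation you actually need:

```matlab
% Back the array with a v7.3 MAT-file instead of holding it in RAM.
m = matfile('bigdata.mat', 'Writable', true);

% Preallocate on disk by assigning to the last element.
m.A(95319, 84602) = 0;

% Process one block of rows at a time; only the block is in memory.
blockSize = 1000;
for r = 1:blockSize:95319
    rows  = r:min(r + blockSize - 1, 95319);
    block = m.A(rows, :);       % partial load from disk
    m.A(rows, :) = 2 * block;   % placeholder operation, written back
end
```

Partial loading and saving like this only works with version 7.3 MAT-files, which matfile creates by default for new files.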

Jan on 16 Mar 2017
Edited: Jan on 16 Mar 2017
You set the preference to 10000 what? MB or GB? As far as I remember, the size is set as a percentage of available RAM, so I'd expect something like 100%, not "10'000".
Concerning your comment: 841372x3 means a 20 MB array, while 95319x84602 is a 64 GB array. This is a remarkable difference.
Do you have 64 GB of free RAM, or better double that, which is required when you want to work with such arrays?
Please read the documentation concerning tall arrays.
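As a rough sanity check along the lines of Jan's arithmetic, the required memory follows directly from the element count, and on Windows the `memory` function reports what MATLAB could actually allocate right now:

```matlab
% Bytes needed for a full double-precision array of this size
% (8 bytes per element):
bytesNeeded = 95319 * 84602 * 8;
fprintf('Need about %.1f GiB\n', bytesNeeded / 2^30);   % ~60.1 GiB

% The memory function is available on Windows only:
if ispc
    m = memory;
    fprintf('Largest possible array: %.1f GiB\n', ...
        m.MaxPossibleArrayBytes / 2^30);
end
```

If the first number is far larger than the second, no preference setting will help; the data has to stay on disk (matfile, tall arrays) or be processed in blocks.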
  1 Comment
Royi Avital on 30 Mar 2017
Is there a command to set it (like with groot or something)?



eun joo yoon on 6 Feb 2020
I have a similar problem.
I want to deal with 500x500 m scale global data in MATLAB (an 80150x34036 array).
I converted the TIFF to ASCII in ArcMap.
But it will not open because of a lack of memory.
I wonder how other people deal with global data such as MODIS data.
  2 Comments
Rik on 6 Feb 2020
Have a read here and here. It will greatly improve your chances of getting an answer.
I'm unsure where to move your comment, so I will give you the opportunity to post it in the appropriate comment section or use it in your separate question. This answer will be deleted later today.
Steven Lord on 6 Feb 2020
If you want to work with Big Data (data that's larger than you can store in memory) consider using one or more of the tools listed on this documentation page. For ASCII data that large, setting up a datastore for the file or files and using that datastore to create a tall array would probably be my first attempt.
Rik, while this question doesn't seem directly related to the original question it's at least in the same general space. I'm not sure it should be deleted.
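A minimal sketch of the datastore-plus-tall approach Steven Lord describes; the file name and the column name `Var1` are placeholders (datastore names columns Var1, Var2, ... when a delimited text file has no header row):

```matlab
% Point a datastore at the large ASCII export; the data stays on disk.
ds = tabularTextDatastore('global_grid.txt');

% Wrap it in a tall array. Operations on tall arrays are deferred.
t = tall(ds);

% gather() triggers evaluation, streaming through the file in blocks
% that fit in memory; only the small result is returned.
avg = gather(mean(t.Var1));
```

Adjust the datastore's delimiter and format options to match how ArcMap wrote the ASCII file.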



Muniba Ashfaq on 11 Sep 2020
I was also getting this error. I deleted all unnecessary variables from the workspace to free up memory, keeping only the variables needed for my MATLAB code to run.
This worked amazingly.
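For reference, the workspace cleanup described above amounts to two standard commands; the variable names here are purely illustrative:

```matlab
% List workspace variables with their sizes to spot the large ones:
whos

% Remove the ones you no longer need (illustrative names):
clear rawData tempMatrix

% Or clear everything except the variables you must keep:
clearvars -except results config
```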
