Limiting MATLAB memory usage

177 views (last 30 days)
Michael on 9 Apr 2012
Commented: Anthony Barone on 26 Jul 2018
I am running MATLAB on 64-bit Win7. When I type "memory" I get:
Maximum possible array: 19851 MB (2.082e+010 bytes) *
Memory available for all arrays: 19851 MB (2.082e+010 bytes) *
Memory used by MATLAB: 535 MB (5.613e+008 bytes)
Physical Memory (RAM): 12286 MB (1.288e+010 bytes)
* Limited by System Memory (physical + swap file) available.
As you can see, this suggests MATLAB is given ~20 GB to create arrays while I have only ~12 GB of memory.
What happens is that in some cases while developing code, MATLAB tries to create arrays that are larger than 12 GB. This usually just freezes my system and is a big annoyance.
I would much rather prefer MATLAB just give me an "Out of memory" error if it exceeds like 4 or 8 GB of memory usage or something. I haven't been able to figure out how to do this. Is there a way to limit how much memory MATLAB can use? All the searches I do just seem to be answers about how to avoid the "Out of memory" errors...
Thanks.
  3 Comments
Anthony Barone on 26 Jul 2018
I hate to resurrect an old thread, but I've wanted to do the exact same thing for some time and finally have a decent solution (on Linux anyhow, though it might be possible on Windows using WSL):
Control Groups
In particular, create a control group that limits physical memory usage but not swap usage, and then launch MATLAB within that control group. This should cause MATLAB to start offloading information to swap BEFORE the system is out of memory, allowing the MATLAB code to continue without bringing the system to a screeching halt (since the system still has memory available that MATLAB isn't able to use).
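On a systemd-based Linux distribution, one concrete way to set this up is a transient cgroup scope. This is a sketch, not a tested recipe: the 8 GB threshold and the `matlab -desktop` launcher are assumptions, and `MemoryHigh`/`MemorySwapMax` require cgroup v2.

```bash
# Run MATLAB in a transient scope: throttle it once it passes 8 GB of
# physical RAM, but leave swap unlimited so it spills to disk instead
# of freezing the whole machine.
systemd-run --user --scope \
    -p MemoryHigh=8G \
    -p MemorySwapMax=infinity \
    matlab -desktop
```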

Sign in to comment.

Accepted Answer

Steven Lord on 21 Oct 2015
If you are using a sufficiently recent release of MATLAB (R2015a or later), take a look at the Array Size Limit item in the Workspace and Variables section of the MATLAB Preferences.
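Once that preference is enabled, an over-limit allocation raises an ordinary, catchable MATLAB error rather than grinding the machine into swap. A minimal sketch (the requested size is just an example chosen to exceed any realistic limit):

```matlab
try
    A = zeros(2e5);    % 2e5-by-2e5 doubles, ~320 GB: should trip the limit
catch ME
    fprintf('Allocation refused: %s\n', ME.message);
    A = [];            % fall back to something smaller
end
```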

More Answers (5)

Jason Ross on 9 Apr 2012
You could set the swap file size on the machine to a specified (small) amount. This will prevent Windows from trying to be "helpful" by growing the swap file. I'm guessing that your system has the default "Let system manage paging file size" setting enabled, which keeps growing the file as needed. In your case, it sounds like you would prefer that it stop doing this sooner rather than later.

Daniel Shub on 9 Apr 2012
I don't think you can do this the way you want. It is not the size of the arrays that causes your system to freeze, but rather the use of swap file space (virtual memory). Even if you limit MATLAB to X amount of memory, I don't think you can limit it to physical memory. The simple solution is to eliminate your swap file. On a single-user Windows system with 12 GB of memory you should never need swap.

Sam Walder on 21 Oct 2015
Edited: Sam Walder on 21 Oct 2015
I was having the same issue and came up with one relatively simple solution. It solves the problem for me, but treats the symptoms more than the cause.
Since the problem is that MATLAB will try to create an array larger than the available physical memory (which it can do, but which we would rather it did not) without raising any error or warning, we can simply wrap the function so that it performs this check for us.
The attached file (available on the File Exchange here: File Exchange) may be useful. It simply checks the physical memory available before letting MATLAB give it a go.
This will not stop you from breaking things, and it does not actually look for contiguous memory, but it is a lot safer.
It is not the prettiest way of solving the problem, but it has saved me some time because my computer doesn't crash as often.
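A minimal sketch of such a guard, for illustration only (this is not the File Exchange submission; the function name is made up, and calling `memory` with two outputs works only on Windows):

```matlab
function A = safeZeros(varargin)
%SAFEZEROS Allocate like ZEROS, but refuse to exceed physical RAM.
    needed = prod([varargin{:}]) * 8;       % doubles: 8 bytes per element
    [~, sys] = memory;                      % system view (Windows only)
    if needed > sys.PhysicalMemory.Available
        error('safeZeros:tooBig', ...
              'Requested %.1f GB exceeds available physical memory.', ...
              needed / 2^30);
    end
    A = zeros(varargin{:});
end
```

Like the approach described above, this checks free physical memory, not contiguous address space, so it is a safety net rather than a guarantee.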

Michael on 9 Apr 2012
Yes, I realized that was the issue, but I was hoping there was something internal to MATLAB I could change, like what triggers an "Out of memory" error.
For instance, if I type:
A = zeros(10000,10000,10000)
I immediately get an "Out of memory" error, as this is about 8 terabytes, which is what I want. There's no slowdown; there's just an immediate error. So I was thinking there was a check somewhere that looked at the size of what it was trying to allocate and errored out if it was too large.
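The arithmetic behind that immediate error: each double takes 8 bytes, so the requested array needs

```matlab
bytesNeeded = 10000 * 10000 * 10000 * 8   % 8e12 bytes, roughly 7.3 TiB
```

which is far beyond physical RAM plus any plausible swap, so MATLAB can reject it up front without attempting the allocation.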
The problem with setting the page file to something small is that it can negatively affect other parts of Windows (such as not recording errors properly).
Also, there probably aren't too many cases I would even want MATLAB to generate 12 GB worth of arrays, so I'd want to limit that regardless (but be able to make changes if I know it's going to be needed).
Looks like there are some ways to have Windows limit a process's memory usage, so I'll look into that...
Thanks, Mike
  6 Comments
Michael on 9 Apr 2012
It's not a crash, really. Windows just gets bogged down with allocating and swapping and becomes nearly unresponsive. Sometimes it will eventually be recoverable, but not always, and rebooting is usually faster.
Daniel Shub on 9 Apr 2012
So how about the unhelpful answer: switch to Linux, or do your development in a virtual machine (either Linux or Windows)? Reboots of a virtual machine will be much quicker.

Sign in to comment.


Malcolm Lidierth on 9 Apr 2012
If you are happy for the array to contain garbage to start with, one solution is to memory-map a pre-existing file and skip the stage that sets all elements to zero, one, or NaN.
The "vvar" class at http://www.mathworks.com/matlabcentral/fileexchange/34276-vvar-class-a-fast-virtual-variable-class-for-matlab does that and is ~100x faster than zeros for creating an 8 GB virtual array.
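For comparison, plain memmapfile can give a similar garbage-initialized, file-backed array. This is a sketch, not the vvar class: the scratch file name and dimensions are arbitrary.

```matlab
% Back a 1e9-element double array (8 GB) with a scratch file instead of
% zero-filling RAM; pages are only touched when elements are accessed.
fname = 'scratch.dat';                         % hypothetical scratch file
dims = [1000 1000 1000];                       % 1e9 doubles = 8 GB
fid = fopen(fname, 'w');
fseek(fid, prod(dims)*8 - 1, 'bof');           % extend the file sparsely
fwrite(fid, uint8(0));
fclose(fid);
m = memmapfile(fname, 'Format', {'double', dims, 'x'}, 'Writable', true);
m.Data.x(1, 1, 1) = 42;                        % faults in a single page
```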
