Limiting MATLAB memory usage
I am running MATLAB on 64-bit Win7. When I type "memory" I get:
Maximum possible array: 19851 MB (2.082e+010 bytes) *
Memory available for all arrays: 19851 MB (2.082e+010 bytes) *
Memory used by MATLAB: 535 MB (5.613e+008 bytes)
Physical Memory (RAM): 12286 MB (1.288e+010 bytes)
* Limited by System Memory (physical + swap file) available.
As you can see, this suggests MATLAB is given ~20 GB to create arrays while I have only ~12 GB of memory.
What happens is that in some cases while developing code, MATLAB tries to create arrays that are larger than 12 GB. This usually just freezes my system and is a big annoyance.
I would much prefer that MATLAB just give me an "Out of memory" error once it exceeds, say, 4 or 8 GB of memory usage. I haven't been able to figure out how to do this. Is there a way to limit how much memory MATLAB can use? All the searches I do just turn up answers about how to avoid "Out of memory" errors...
More Answers (5)
Jason Ross on 9 Apr 2012
You could set the swap file size on the machine to a specified (small) amount. This will prevent Windows from trying to be "helpful" by growing the swap file. I'm guessing that your system has the default "Let system manage page file size" setting enabled, which keeps growing the file as needed. In your case, it sounds like you would prefer that it stop doing this sooner rather than later.
Daniel Shub on 9 Apr 2012
I don't think you can do this the way you want. It is not the size of the arrays that causes your system to freeze, but rather the use of swap file space (virtual memory). Even if you limit MATLAB to using only X amount of memory, I don't think you can limit it to physical memory. The simple solution is to eliminate your swap file. On a single-user Windows system with 12 GB of memory, you should never need swap.
Sam Walder on 21 Oct 2015
Edited: Sam Walder on 21 Oct 2015
I was having the same issue and have come up with one relatively simple solution. It solves the problem for me, but it treats the symptoms rather than the cause.
Since the problem seems to be that MATLAB will try to create an array larger than the available physical memory (which it can do, but we would rather it did not) without raising any error or warning, we can simply wrap the allocation function so that it performs this check for us.
The attached file (available on the File Exchange here: File Exchange) may be useful. It simply checks the available physical memory before letting MATLAB give the allocation a go.
This will not stop you from breaking things, and it does not actually check for contiguous memory, but it is a lot safer.
It is not the prettiest way of solving the problem, but it has saved me some time by not crashing my computer as often.
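The idea can be sketched roughly as follows, assuming Windows (where the `memory` function is available). The wrapper name `safezeros` is hypothetical, not the name used in the File Exchange submission:

```matlab
function A = safezeros(varargin)
% SAFEZEROS  Sketch of a guarded ZEROS: refuse allocations that would
% exceed available physical RAM. Windows-only (uses MEMORY).
dims = [varargin{:}];
if isscalar(dims)
    dims = [dims dims];              % zeros(n) means an n-by-n matrix
end
bytesNeeded = prod(dims) * 8;        % a double takes 8 bytes per element
[~, sys] = memory;                   % query OS-level memory statistics
if bytesNeeded > sys.PhysicalMemory.Available
    error('safezeros:OutOfMemory', ...
        'Requested %.1f GB exceeds available physical RAM.', ...
        bytesNeeded / 2^30);
end
A = zeros(varargin{:});
end
```

Calling `safezeros(1e6, 1e4)` on a 12 GB machine would then error out immediately instead of pushing the system into swap.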
Malcolm Lidierth on 9 Apr 2012
If you are happy for the array to contain garbage to start with, one solution is to memory map to a pre-existing file and skip the stage that sets all elements to zero, one, or NaN.
The "vvar" class at http://www.mathworks.com/matlabcentral/fileexchange/34276-vvar-class-a-fast-virtual-variable-class-for-matlab does that and is ~100x faster than zeros for creating an 8 GB virtual array.
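The same trick can be done directly with MATLAB's built-in memmapfile, sketched here with a placeholder filename (`scratch.dat`) and a 1 GB array for illustration; the initial contents are whatever bytes happen to be in the file:

```matlab
% Map a pre-existing scratch file as an array of doubles instead of
% calling zeros(), skipping the element-initialization pass entirely.
n = 2^27;                                % 2^27 doubles = 1 GB
f = fopen('scratch.dat', 'w');
fseek(f, n*8 - 1, 'bof');                % extend the file to full size
fwrite(f, 0, 'uint8');                   % write one byte at the end
fclose(f);
m = memmapfile('scratch.dat', ...
               'Format', {'double', [n 1], 'x'}, ...
               'Writable', true);
m.Data.x(1) = 42;                        % read/write like a normal array
```

Because the mapping is backed by the file rather than by committed RAM, pages are only brought in as they are touched, which is what makes creation so much faster than zeros.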