Save "big" object to file fails
14 views (last 30 days)
Hi,
I'm working on an OOP project. There is an object called "database" containing a "big" cell array (nested, with mixed contents).
In this database I store the contents of some files. Up to now, with about 2000 files in the database, the object could be saved properly with "save", producing a 20 MB file. But after I added another 1000 files, the saving process stops after some time and produces a rudimentary 1 KB .mat file (no error message or anything else).
I tried the "pack" command, but then Matlab crashed. Of course I could post the log here if desired. I'm using Windows XP SP3 and Matlab 7.5.0 (R2007b), and I have tried saving the file on several file systems (FAT/NTFS).
Is this a common issue? I couldn't find anything similar out there...
Greetings
0 Comments
Answers (5)
Andrea Gentilini
7 May 2012
Try going to File -> Preferences -> General -> MAT-Files and selecting the option "MATLAB Version 7.3 or later". This allows you to save variables that exceed 2 GB.
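For reference, the same thing can be done per call instead of via the preference; a minimal sketch, assuming the object to save lives in a variable called database:
save('database.mat', 'database', '-v7.3');   % v7.3 is HDF5-based and supports variables larger than 2 GB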
Jan
22 Nov 2011
If Matlab crashes inside the pack command, you have a serious problem with memory management. Do you use user-defined MEX functions?
By the way: although you can create a database using Matlab, dedicated database programs will do this much better.
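If memory really is the bottleneck, the Windows-only memory command gives a quick indication of how much MATLAB can still allocate; for example:
[userview, systemview] = memory;   % Windows only: memory statistics for the MATLAB process
userview.MaxPossibleArrayBytes     % largest contiguous array MATLAB could currently allocate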
0 Comments
Vincent
24 Nov 2011
2 Comments
Peter O
30 Nov 2011
Hi Vincent, I'm getting the same problem here today. A roughly 300 MB dataset won't save, but the 52 MB version _sometimes_ will. R2011a here. I think the issue, for me, is that we have 7 MB 'profile' spaces on the network for programs' temporary work and it's hitting that wall. I'll let you know if I find anything.
Martin Kahn
1 Jul 2018
Hi guys,
Given that this question still gets some views: I just had an issue that sounds very similar (with Matlab 2018a and Windows 10). When trying to save with "save('filename.mat','myFile')" I just got a 1 KB file. I don't really know why, but this fixed it: "save('filename.mat','myFile','-v7.3')". I guess this is what Andrea suggested? Sorry if it's not helpful...
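By the way, one quick way to check that the file really contains the variable (and is not just an empty 1 KB stub) is something like this, using the file name from the example above:
whos('-file', 'filename.mat')   % lists the variables actually stored in the file, with their sizes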
1 Comment
Riccardo Scorretti
23 Sep 2021
Edited: Riccardo Scorretti, 23 Sep 2021
Hi there.
Unfortunately I'm experiencing the same problem (Matlab 2020b, Linux Fedora F34). As can be observed in the picture below, as soon as serialization is triggered the amount of used memory nearly doubles:
[figure: memory-usage plot during save, showing used memory nearly doubling]
It looks as if Matlab makes a temporary copy of the data to be saved (with option -v7.3, of course), and in some circumstances this ends in an out-of-memory error.
In my case I was trying to save the whole workspace, which contains many huge variables. I suggest working around the problem by saving each huge variable separately in its own file, so as to lower the peak temporary memory that is apparently needed to serialize the data.
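A minimal sketch of that per-variable approach (the backup_ file-name prefix and the size threshold are only examples):
vars = whos;                                    % information about the current workspace variables
for k = 1:numel(vars)
    if vars(k).bytes > 100e6                    % only split off the huge variables (> ~100 MB here)
        save(['backup_' vars(k).name '.mat'], vars(k).name, '-v7.3');   % one file per variable
    end
end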