Memory issues using xlsread and dataset('XLSFile', ...)
I have a huge file in XLSX format. While xlsread runs without memory issues, the command A = dataset('XLSFile', filename) throws the error "Not enough storage is available to complete this operation." Why does this happen? Any broad guidelines to keep in mind while handling large data would be very helpful.
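Roughly, the two calls I am comparing look like this (filename stands in for the path to my actual XLSX file):

% Reading the same large XLSX file two ways; filename is a placeholder.
filename = 'largefile.xlsx';
[num, txt, raw] = xlsread(filename);   % completes without memory errors
A = dataset('XLSFile', filename);      % fails with "Not enough storage is available ..."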
Thank You
1 Comment
Peter Perkins
on 8 Mar 2011
Vandhana, it would help to know more specifically what your data are, and the error trace that gets spit out to the command window, if there is one.
Answers (3)
Walter Roberson
on 6 Mar 2011
I speculate that dataset() reads the data and then converts it into a different organization, requiring intermediate storage roughly equal to the size of the data.
I do not know this for certain, as I do not have that toolbox. From a code-design point of view, though, it is more economical to write the XLSX-reading code once and convert its output, rather than duplicate the XLSX-reading code in every routine that converts the data just to avoid the risk of exceeding intermediate storage.
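If intermediate storage is indeed the bottleneck, one thing to try is something along these lines (untested, since as noted I do not have the toolbox; it assumes the data are purely numeric and uses the documented dataset({matrix, names...}) construction, with placeholder variable names):

% Untested sketch: read the numeric block once with xlsread,
% wrap it in a dataset yourself, and free the intermediate copy.
num = xlsread(filename);                       % single read of the XLSX file
ds  = dataset({num, 'var1', 'var2', 'var3'});  % one name per column (placeholders)
clear num                                      % release the intermediate matrix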
0 Comments
Matt Tearle
on 7 Mar 2011
Do you have a simple way to write the file out in a different format (e.g. text, such as CSV)? What are the columns of your file (i.e. how many, and of what type)?
If you can work with text, you might be able to batch process the file and/or use non-double types to reduce the memory requirements.
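For example, something along these lines reads a CSV in chunks and stores the result in single precision (the file name, format string, and chunk size are placeholders to adjust for your real data):

% Sketch: batch-read a 3-column numeric CSV, storing values as single.
fid = fopen('bigdata.csv', 'r');
chunkSize = 100000;                    % rows per batch
data = single([]);                     % accumulate in single precision
while ~feof(fid)
    block = textscan(fid, '%f32%f32%f32', chunkSize, 'Delimiter', ',');
    data = [data; [block{:}]];         %#ok<AGROW> grow in large chunks
end
fclose(fid);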
0 Comments
Nick Haddad
on 3 Oct 2014
This issue is a known bug in MATLAB and has been addressed in the following bug report:
The bug report includes a workaround that you can install for MATLAB R2013a through R2014b.
0 Comments