textscan: Instantaneous out of memory error when accessing very large file (only with newest Matlab versions)
8 views (last 30 days)
Simon Stehle
on 7 May 2021
Answered: Steven Lord
on 13 May 2021
I am working with a very large dataset (500 GB in total) that is split into more than a thousand individual .txt files, each covering a specific geographic area (160 columns/characteristics per file, potentially more than a million rows, and a mixture of string and numeric variables). For files covering large areas, a single .txt file can be as large as 16 GB. To cope with the large amount of data, I proceed as follows for each of the files:
- access the respective .txt file (with "fopen")
- within a while-loop import 250,000 rows using "textscan"
- process data and export smaller dataset (append if not first loop iteration)
- repeat steps above until end of the .txt file is reached (while ~feof)
The code that imports the data looks like this:
fileID = fopen(filename) ; % identify the file; "filename" is the path of the current .txt file
while ~feof(fileID) % import blocks of 250,000 rows until the end of the current .txt file is reached
    % Import data; "format_varlist" specifies the format of and the columns to
    % import. The delimiter is "|".
    Data = textscan(fileID, strcat(char(format_varlist),'\r\n'), 250000, 'Delimiter','|', ...
        'HeaderLines', double(first_iteration==1), 'EndOfLine','\r\n', 'EmptyValue',-1) ;
end
fclose(fileID) ; % close the file when done
Doing so allows me to effectively reduce the size of my dataset such that I can conveniently work with the full dataset later.
My problem is the following: with R2019a the code works perfectly well for ALL files, including the very large .txt ones. With R2021a (and, if I recall correctly, R2020a as well), the code works fine until it reaches a file that is too large, at which point it stops (instantaneously!) with an "out of memory" error:
Out of memory.
I suspect that the newer "textscan" recognizes that the file to be accessed would be too large to load in full (which it is), but does not take into account that I only want 250k lines at a time.
I looked at the "readtable" command, but as far as I know it does not allow importing smaller blocks of data one at a time (only for spreadsheets).
Is there a workaround/fix for my issue? As I work (and have worked) frequently with this kind of code, I would otherwise be stuck with the 2019 release forever. Thank you very much in advance for your help.
3 Comments
dpb
on 7 May 2021
And textscan is fully built in, so not even the preliminaries can be inspected to see what it might do in that regard.
If it indeed won't work at all, I'd say that qualifies as a bug and is in total violation of documented behavior.
Accepted Answer
Walter Roberson
on 12 May 2021
Specify the encoding on your fopen() so that the I/O library does not have to read through the entire file to determine the encoding. The default now is to detect it automatically, but that can require reading the entire file to disprove the hypothesis that the file might contain UTF-8.
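A minimal sketch of what that could look like in the loop from the question (assuming the files are plain ASCII/UTF-8 text; substitute whichever encoding your files actually use):
fileID = fopen(filename, 'r', 'n', 'UTF-8') ; % explicit encoding, so no auto-detection pass over the file
while ~feof(fileID)
    Data = textscan(fileID, strcat(char(format_varlist),'\r\n'), 250000, 'Delimiter','|', ...
        'HeaderLines', double(first_iteration==1), 'EndOfLine','\r\n', 'EmptyValue',-1) ;
    % ... process Data as before ...
end
fclose(fileID) ;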
6 Comments
Walter Roberson
on 12 May 2021
R2020a rather than R2019b.
File Encoding: Save MATLAB code files (.m) and other plain text files as UTF-8 encoded files by default
[...] When opening existing files, the Editor and other functions like type or fopen automatically determine the current encoding.
dpb
on 12 May 2021
Edited: dpb
on 13 May 2021
Huh. But nowhere in the documentation does it even hint that this feature can cause an out-of-memory error. That should definitely be highlighted alongside the description above, and fopen ought to report the specific cause and fix instead of just dumping the standard "out of memory" error message.
More Answers (2)
Shiva Kalyan Diwakaruni
on 12 May 2021
Hi,
Refer to the Memory Usage documentation, specifically to the sections:
1. Strategies for Efficient Use of Memory
2. Resolving "Out of Memory" Errors
Concepts:
1. Memory Allocation
2. Memory Management Functions
There are additional resources in the documentation for resolving "Out of Memory" errors.
Hope it helps.
0 Comments
Steven Lord
on 13 May 2021
I recommend you look at the functionality in MATLAB to process large files and big data. The approach you've described sounds like you could use a tall array backed by a tabularTextDatastore.
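A hedged sketch of that approach (the folder path, delimiter, and block size below are illustrative assumptions, not the asker's actual settings):
ds = tabularTextDatastore('D:\mydata\*.txt', 'Delimiter', '|', 'ReadSize', 250000) ;
% Option 1: read the datastore block by block, much like the textscan loop
while hasdata(ds)
    T = read(ds) ; % next block of up to 250,000 rows, returned as a table
    % ... process and append the reduced data ...
end
% Option 2: wrap the datastore in a tall array and let MATLAB evaluate in chunks
t = tall(tabularTextDatastore('D:\mydata\*.txt', 'Delimiter', '|')) ;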
0 Comments
See Also
Categories
Learn more about Large Files and Big Data in Help Center and File Exchange