out of memory error while reading binary file

Hello Everyone,
I am reading a binary file whose size is 644 MB. For reading the binary file I use the following command
[data,count] = fread(fileId, '*uint8');
When I start MATLAB and type memory in the command window, I get the following response.
Maximum possible array: 1150 MB (1.205e+09 bytes)
but when I read my binary file I get an out of memory error. I think this is because fread creates a column vector data whose size exceeds the maximum possible array size.
Can someone help me read in a big binary file without hitting this memory error? The contents of the binary file are sensor data read through a LAN port.

Answers (2)

Jan
Jan on 9 Aug 2013
Edited: Jan on 9 Aug 2013
It looks like FREAD allocates the memory in chunks with a growing buffer. This can exhaust 1 GB of free RAM easily for 640 MB of data. Try this:
Info = dir(FileName);
FID = fopen(FileName, 'r');
[data, count] = fread(FID, [1, Info.bytes], '*uint8');
fclose(FID);
Then FREAD knows the size of the required input buffer.
On the other hand, it would be easy for FREAD to check the file size itself when the dimensions are not specified. If the above method really works, send an enhancement request to TMW and ask for proper pre-allocation in FREAD.
A general problem remains: you have only a very small amount of free RAM. Using a 64-bit version and installing more RAM is the best solution.

6 Comments

Hello Jan,
I am trying to find the size of my file using the dir command, but it returns an empty structure.
Sorry Jan, please ignore the above comment; I was making a mistake while using the dir command.
You can remove your comments if you like.
Jan, there was one more thing I wanted to ask: what if my file size increases to 1 GB or 1.5 GB? Then I cannot use the above method, since my maximum permissible array size would be 1150 MB and the file would be larger than that. How can I read the binary file then? I was reading Loren's blog post about using memmapfile for navigating through a large binary file at the link below,
but I am not sure how I can find the number of times I have to repeat the reading process.
When you check the available memory and get 1150 MB, this might be different tomorrow or even in a few milliseconds, due to other processes running on the computer. There is no trustworthy way to guarantee the availability of a block of memory other than reserving it and checking for success.
When your program runs out of memory, installing more RAM under a 64-bit system is the only reliable way.
You can read the file in chunks even without memory mapping: Simply call FREAD for a certain block. This might even be faster than importing the complete file at once.
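That chunked approach can be sketched like this (FileName and the chunk size are placeholders, and the processing step is left as a comment). The number of iterations follows directly from the file size reported by dir:

```matlab
% Read a large binary file in fixed-size chunks instead of all at once.
chunkSize = 64 * 1024 * 1024;            % 64 MB per read; tune to your free RAM
Info = dir(FileName);                    % Info.bytes is the file size
nChunks = ceil(Info.bytes / chunkSize);  % how many FREAD calls are needed

FID = fopen(FileName, 'r');
for k = 1:nChunks
    % fread simply returns fewer bytes on the last, partial chunk
    data = fread(FID, [1, chunkSize], '*uint8');
    % ... process data here, e.g. accumulate statistics or write results ...
end
fclose(FID);
```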
Hey Jan,
the pre-allocation problem is still there (MATLAB R2019b).
Thank you for your solution. It works quite well with fread(fileId, [1, Info.bytes], '*uint8');


Rodney Thomson
Rodney Thomson on 9 Aug 2013
Change your format specifier to 'uint8=>uint8'
This says: "my data is uint8 AND I want to store it in a uint8 array"
Otherwise MATLAB just assumes it should read a uint8 value and put it in a double, which is 8 times the size, or about 5 GB in the case of your array!
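A quick way to see the storage difference Rodney describes (the file name is hypothetical; whos reports the bytes actually used by each variable):

```matlab
FID = fopen('sensor.bin', 'r');   % hypothetical file name

dataDouble = fread(FID, 100, 'uint8');          % stored as double: 8 bytes/element
frewind(FID);                                    % rewind to read the same bytes again
dataUint8  = fread(FID, 100, 'uint8=>uint8');   % stored as uint8: 1 byte/element

whos dataDouble dataUint8   % compare the Bytes column: 800 vs 100
fclose(FID);
```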

3 Comments

In the reading method I specify this by writing it as '*uint8'. I also tried the format specifier 'uint8=>uint8', but it still gives me the same error. I was also reading that I can use memmapfile for reading the binary file, but I am confused about how to use it.
*uint8 is equivalent to uint8=>uint8.
Sorry, I didn't notice the *!
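For the memmapfile question raised above, a minimal sketch (the file name is a placeholder): memmapfile only pages in the parts of the file you actually index, so it never allocates one huge array up front:

```matlab
% Map the binary file into memory without reading it all at once.
m = memmapfile('sensor.bin', 'Format', 'uint8');  % hypothetical file name

firstBlock = m.Data(1:1024*1024);   % only this 1 MB slice is read into RAM
totalBytes = numel(m.Data);         % total number of bytes in the file
```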


Asked: 9 Aug 2013

Commented: 18 Apr 2020
