Increase the speed of the program

5 views (last 30 days)
Pierre Antoine on 6 May 2011
I'm working on a big data file and I have to extract some lines to create a matrix. I have to do this a lot of times. In the following program, I do it 6 times and it takes 10 s. The problem is that I have to do it 3000 times, and maybe more!
Here is the code:
clear all;
close all;
clc;

filename = 'blg2zyx.htr';
NumSegments = dlmread(filename, '', 'B7..B7');   % number of segments (cell B7)
NumFrames   = dlmread(filename, '', 'B8..B8');   % number of frames (cell B8)

% Base positions: one row per segment
Data_Base_position = dlmread(filename, '', ...
    strcat('B', int2str(NumSegments + 20), '..H', int2str(NumSegments*2 - 1 + 20)));

data = zeros(6*NumSegments, 7);   % preallocate
t = 1;
for j = 0:1:5
    for b = 0:3002:3002*NumSegments - 1
        % one dlmread call per extracted row
        row = NumSegments*2 + 20 + 4 + b + j;
        data(t,:) = dlmread(filename, '', ...
            strcat('B', int2str(row), '..H', int2str(row)));
        t = t + 1;
    end
end
data = vertcat(Data_Base_position, data);
fclose('all');
Can someone tell me why it is slow, and how to increase the speed?
  1 Comment
Oleg Komarov on 6 May 2011
Format the code with the '{}code' button. If you show us what the file looks like and what you are trying to accomplish, then we can think of a bulk import -> reshape process.

Sign in to comment.

Answers (6)

Knut on 6 May 2011
Whenever I see a "for" statement in MATLAB, I ask myself whether that operation can be vectorized. Is it possible to use dlmread in such a way that it generates your matrix directly, instead of reading elements one at a time?
Also, if you do:
profile on
some code
profile report
you will get some profiling info that can help tell you which parts of the code are consuming the most cycles.
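As a sketch of the vectorized idea: if the rows you need are contiguous, dlmread can read the whole block in one call (the range 'B24..H29' below is only illustrative, not taken from the actual file):

```matlab
% Read a whole block of rows in a single dlmread call instead of
% one call per row. Adjust the range to the rows you actually need.
filename = 'blg2zyx.htr';
block = dlmread(filename, '', 'B24..H29');   % 6 rows x 7 columns in one pass
```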

Jarrod Rivituso on 6 May 2011
I may be missing something, but a general thought...
Try replacing all those calls to dlmread with the textscan function.
I think dlmread uses fopen + textscan + fclose each time you call it, and of course does some error checking in there.
You could call fopen once, and then use textscan and frewind in your for loop as necessary, and then call fclose once afterwards.
Also, depending on your input file format, you may be able to have textscan go through the file piece by piece without rewinding at all.
Just a thought. Hope this helps!
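A minimal sketch of that idea, assuming the data lines hold one text label followed by seven numeric columns and a 20-line header (both are assumptions about the file layout, not facts from the thread):

```matlab
% Open once, scan the whole file, close once -- instead of one
% fopen/textscan/fclose cycle per dlmread call.
fid = fopen('blg2zyx.htr', 'r');
C = textscan(fid, '%s %f %f %f %f %f %f %f', 'HeaderLines', 20);  % assumed layout
fclose(fid);
numbers = [C{2:8}];   % gather the seven numeric columns into one matrix
```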

Pierre Antoine on 6 May 2011
OK, thanks. I've used profile on ... profile report. It's really useful, and I've seen that it's the dlmread function that takes all the time. But I can't use the textscan function, because the dlmread function uses textscan itself. So I don't know what to do.
Is there another function better than these two, or can the structure of my program be improved?
Thank you for your answers.
Pierre Antoine
  1 Comment
Jarrod Rivituso on 6 May 2011
Could you please format the code? I know it seems silly, but it makes it way easier for everyone to read :)
Also, I really agree with Oleg's main comment. Currently it looks like you are doing lots of little file-import tasks; it would probably be much more efficient to load everything in at once and then manipulate it from there.

Sign in to comment.


Oleg Komarov on 6 May 2011
You're importing the following positions:
...
'B3086..H3086'
'B6088..H6088'
'B9090..H9090'
...
You can do the same by importing everything at once and selecting rows with logical indexing, or is there a reason you can't (memory problems, something else)?
It will be much faster, and don't underestimate the power of textscan.
EDIT
You can check memory consumption with:
memory
on Windows platforms, or use an external memory performance tracker.
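As a sketch of the bulk-import-then-index approach (the range, offset, and stride below mirror the positions listed above, but they are illustrative and not verified against the real file):

```matlab
% One pass over the file, then row selection by logical indexing.
raw = dlmread('blg2zyx.htr', '', 'B1..H81000');  % import everything at once
idx = false(size(raw, 1), 1);
idx(3086:3002:end) = true;                       % every 3002nd row, starting at 3086
data = raw(idx, :);                              % the rows previously read one by one
```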

Pierre Antoine on 6 May 2011
Yes, it looks like a good solution. It takes 100 s to import the file (6 MB), and I get a table (81000 x 8). So now I can manipulate my data without using textscan or dlmread. Maybe that's the right solution. Thank you for your help. I'll check whether it works well on Monday.

Pierre Antoine on 12 May 2011
Now I can say that it really is the right solution! Thank you very much. I just want to know what the memory limit is before an overload. I mean, can I load a big file? And what is the biggest matrix I can manipulate in MATLAB?
  1 Comment
Sean de Wolski on 12 May 2011
32-bit or 64-bit? It depends on how much memory your computer has and whether you're on a 32-bit or 64-bit OS. I frequently use multiple 9 GB matrices on my 64-bit OS.

Sign in to comment.
