How to average more than 50 3D matrices using nanmean

Hi, I am trying to average a lot of 3D matrices using nanmean. I have tried using cat, but my 3D matrices are huge (351x400x400), which uses a lot of memory. Is there a better way to do this?
  7 Comments
Adam Danz on 14 Nov 2019
Edited: Adam Danz on 14 Nov 2019
Hmmmm... concatenating 50 arrays that each have more than 56 million elements isn't going to happen.
Off the bat I can think of a couple of ideas.
1) Using 2 loops, you can loop through each file and partially load each 351 x 400 slice, so at any one time you only hold 50 of those matrices (~7M data points). If that's still too large, you could partially load each 351x1 column instead. Then you do element-wise averaging and accumulate the values as you proceed through the loops. That's 50 x 400 iterations, which isn't a big deal (see the sketch after this list).
2) You can reorganize your data as tall arrays, which are designed for data too large to fit in memory.
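A minimal sketch of idea 1, assuming each file stores its volume as 'a' in a folder 'yourFolder' and was saved as a version 7.3 MAT-file (matfile only reads part of a variable from disk for v7.3 files; otherwise it falls back to loading the whole variable):
result = zeros(351,400,400);
files = dir(fullfile('yourFolder','*.mat'));
for k = 1:400                                      % loop over slices
    Summation = zeros(351,400);                    % running sum for this slice
    NCounter  = zeros(351,400);                    % count of non-NaN values
    for i = 1:numel(files)                         % loop over files
        m = matfile(fullfile('yourFolder',files(i).name));
        slice = m.a(:,:,k);                        % read only one 351x400 slice
        nanmap = isnan(slice);
        slice(nanmap) = 0;                         % NaNs contribute nothing to the sum
        Summation = Summation + slice;
        NCounter  = NCounter + ~nanmap;
    end
    result(:,:,k) = Summation./NCounter;           % NaN where every file was NaN
end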


Accepted Answer

Matt J on 14 Nov 2019
Edited: Matt J on 15 Nov 2019
Here's what I would do, I suppose. It assumes each of your .mat files stores the volume under the name 'a'.
Summation = 0;                                     % running element-wise sum
NCounter  = 0;                                     % running count of non-NaN values
files = dir(fullfile('yourFolder','*.mat'));
for i = 1:numel(files)
    S = load(fullfile('yourFolder',files(i).name));
    map = isnan(S.a);                              % locate NaNs in this volume
    S.a(map) = 0;                                  % zero them so they don't affect the sum
    Summation = Summation + S.a;
    NCounter  = NCounter + (~map);                 % count only the non-NaN contributions
end
result = Summation./NCounter;                      % NaN wherever no file had data
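As a quick sanity check on small, made-up arrays (not the real data), this sum-and-count approach matches nanmean along a concatenation dimension, including positions that are NaN in every input (nanmean is from the Statistics and Machine Learning Toolbox):
A = rand(3,4,5);  A(rand(size(A)) < 0.3) = NaN;
B = rand(3,4,5);  B(rand(size(B)) < 0.3) = NaN;
mapA = isnan(A);  mapB = isnan(B);
A(mapA) = 0;      B(mapB) = 0;
approx    = (A + B)./(~mapA + ~mapB);       % per-element sum divided by non-NaN count
reference = nanmean(cat(4,A,B), 4);         % average over the stacked dimension
isequaln(approx, reference)                 % true; isequaln treats NaN == NaN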
  2 Comments
raheem mian on 14 Nov 2019
Edited: raheem mian on 14 Nov 2019
I like this method too! Thanks. This method seems faster.


More Answers (0)


