Griddata Memory Usage for small arrays

I am using vq = griddata(x,y,z,v,xq,yq,zq) to interpolate scattered data onto a rectangular grid (note: the sample values are complex). Typical array sizes for x, y, etc. are on the order of 500 x 800. The nearest-neighbor and v4 methods work, but take close to 10 minutes to run. While waiting, I notice that my memory usage exceeds 26 GB (32 GB installed) and CPU/disk usage is pegged. The line of code is part of an App Designer app; I tried moving it into a separate function in case being inside a method was making it sluggish, but that did not affect the speed.
What can I do to troubleshoot this issue?
  1 Comment
Walter Roberson
Walter Roberson on 1 Jun 2022
Have you experimented with scatteredInterpolant? What is the size of the query space?
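A minimal sketch of the scatteredInterpolant idea Walter raises, assuming the question's variable names and that the complex sample values are handled by interpolating real and imaginary parts separately (the method choices here are illustrative):

```matlab
% scatteredInterpolant builds the triangulation once and can be reused for
% many query grids, unlike griddata, which redoes that work on every call.
% Interpolate real and imaginary parts separately since v is complex.
Fre = scatteredInterpolant(x(:), y(:), z(:), real(v(:)), 'linear', 'none');
Fim = scatteredInterpolant(x(:), y(:), z(:), imag(v(:)), 'linear', 'none');
vq  = Fre(xq, yq, zq) + 1i * Fim(xq, yq, zq);
```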


Accepted Answer

Prateekshya
Prateekshya on 7 Sep 2023
Edited: Prateekshya on 7 Sep 2023
Hi Rod,
As I understand it, you want to speed up code that relies on the "griddata" function. When a "griddata" interpolation over a large data set is slow and memory-hungry, and you have already checked the data size and complexity and considered reducing the data set, there are several steps you can take to troubleshoot and optimize:
  • The "griddata" function supports several interpolation methods, such as "linear", "cubic", "natural", and "nearest". Try different methods to see whether any of them performs better on your specific data set. For example:
vq = griddata(x,y,z,v,xq,yq,zq,"nearest");
  • If you have access to the Parallel Computing Toolbox and your code uses a for loop, you can parallelize the "griddata" interpolation with "parfor" or "spmd" constructs. See the Parallel Computing Toolbox documentation to learn more about these constructs.
  • Use MATLAB's profiling tools, such as the Profiler or the "profile" command, to identify the parts of your code that take the most time and consume the most memory. For more information, see: https://in.mathworks.com/help/matlab/matlab_prog/profiling-for-improving-performance.html
  • If your output grid does not require high precision, you can reduce the resolution of the output grid to speed up the interpolation. This can be done by reducing the number of grid points in the "xq", "yq", and "zq" arrays.
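The last suggestion can be sketched as follows, assuming the question's variable names; the grid sizes and the "nearest" method are purely illustrative:

```matlab
% Build a coarser query grid before calling griddata. Halving the number
% of points per dimension cuts the query count by 8x in 3-D, which reduces
% both run time and the memory held by the interpolation.
[xq, yq, zq] = meshgrid(linspace(min(x(:)), max(x(:)), 250), ...
                        linspace(min(y(:)), max(y(:)), 400), ...
                        linspace(min(z(:)), max(z(:)), 50));
vq = griddata(x, y, z, v, xq, yq, zq, "nearest");
```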
Hope this helps!
  3 Comments
Bruno Luong
Bruno Luong on 7 Sep 2023
Divide the query points into chunks and use parfor on each chunk?
It's not ideal, since the shared preprocessing part of the algorithm must be redone for every chunk, unlike with scatteredInterpolant.
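A rough sketch of this chunked-parfor idea, assuming the question's variable names (the chunk count is illustrative):

```matlab
% Split the query points into chunks and interpolate each on a worker.
% Note: each griddata call rebuilds the triangulation of (x,y,z), which is
% the duplicated preprocessing cost mentioned above.
q = [xq(:), yq(:), zq(:)];
nChunks = 8;
edges = round(linspace(0, size(q,1), nChunks + 1));
out = cell(1, nChunks);
parfor k = 1:nChunks
    idx = edges(k)+1 : edges(k+1);
    out{k} = griddata(x, y, z, v, q(idx,1), q(idx,2), q(idx,3), "nearest");
end
vq = reshape(vertcat(out{:}), size(xq));
```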
Rod Lopez
Rod Lopez on 7 Sep 2023
I ended up changing my code entirely, moving away from App Designer and using a pair of nested for-loops with interp1. This reduced run times significantly; my largest data sets now take about 5 seconds. Using griddata was a lot cleaner, but I was unable to get it or scatteredInterpolant to work for me. On a side note, I did end up using parfor in a separate function that calls the function discussed here thousands of times in rapid succession, and it paid off, cutting run times from minutes to ~30 seconds for the most complex runs.
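A hedged sketch of the two-pass interp1 approach Rod describes, assuming the scattered data is actually organized row-by-row with a common y-coordinate per row (all names here are illustrative, not Rod's actual code). Interpolate each row onto the target x-vector, then interpolate the result column-by-column onto the target y-vector:

```matlab
% Pass 1: interpolate each row of V onto the common query x-vector xqv.
tmp = zeros(size(V,1), numel(xqv));
for r = 1:size(V,1)
    tmp(r,:) = interp1(X(r,:), V(r,:), xqv, "linear");
end
% Pass 2: interpolate each column of the intermediate result onto yqv,
% assuming the y-coordinate is the same for every column.
vq = zeros(numel(yqv), numel(xqv));
for c = 1:numel(xqv)
    vq(:,c) = interp1(Y(:,1), tmp(:,c), yqv, "linear");
end
```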


More Answers (0)

Categories

Find more on Parallel Computing Fundamentals in Help Center and File Exchange

Release

R2020b

