Solving a PDE using the GPU
Hello,
I am currently solving a nonlinear, time-dependent PDE numerically with the Euler method. When I run my code on the CPU everything seems to work fine, but when I use the GPU (converting all my variables with gpuArray) the solution changes significantly and at some point "explodes" (I compared the CPU and GPU solutions at the same time step and they were totally different). Has anyone seen something like this before? Thank you.
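For reference, here is a minimal sketch of the kind of CPU-versus-GPU comparison being described. The PDE, grid, and time step below are assumptions for illustration, not the original code: a 2-D heat equation stepped with explicit Euler, run once on the CPU and once on a gpuArray, with the results compared at the end.

% Minimal sketch (assumed problem): explicit Euler for the 2-D heat equation
% u_t = nu*(u_xx + u_yy), run on the CPU and on the GPU, then compared.
N = 256;  nu = 0.1;  h = 1/N;  dt = 0.2*h^2/nu;  nSteps = 1000;
u_cpu = rand(N);                  % arbitrary initial condition
u_gpu = gpuArray(u_cpu);          % same data, moved to the GPU

% Periodic 5-point Laplacian via circshift (works for CPU and GPU arrays alike)
lap = @(u) (circshift(u,[1 0]) + circshift(u,[-1 0]) + ...
            circshift(u,[0 1]) + circshift(u,[0 -1]) - 4*u) / h^2;

for n = 1:nSteps
    u_cpu = u_cpu + dt*nu*lap(u_cpu);
    u_gpu = u_gpu + dt*nu*lap(u_gpu);
end

maxDiff = max(max(abs(u_cpu - gather(u_gpu))));
fprintf('max |CPU - GPU| after %d steps: %g\n', nSteps, maxDiff);

For a stable scheme like this the difference should stay at round-off level; a difference that grows from step to step usually points at the scheme feeding round-off back into itself rather than at gpuArray per se.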
7 Comments
Joss Knight
14 Mar 2016
That bug workaround is for the FFT, not for FFT2. And it only does anything for vector inputs of certain lengths, so it will literally be doing nothing in your case. All you've done is make the FFT2 implementation more like the CPU one - so less efficient, but the results will be closer.
Looks like all we're talking about here is numerical accuracy. You haven't actually shown what it is you are iterating on, but it's a fair assumption that you are continually calling ifft2(fft2(X)). That will inevitably cause slight numerical offsets to grow, which is why you need to insert real(X) into the loop to remove the extraneous imaginary part.
I can't exactly explain how the CPU's IFFT manages to mirror its FFT so perfectly that the result comes out with an exactly zero imaginary part - no doubt it just falls out of the arithmetic. The GPU FFT, however, is computed in parallel and won't provide the same perfect mirroring property.
In short, it is perfectly valid for you to remove the imaginary part when you know the result is supposed to be real.
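In code terms, that suggestion amounts to something like the following (a sketch only; the update step and the loop structure are placeholders for whatever the actual scheme is):

% Inside the time-stepping loop: the solution is known to be real, so
% strip the O(eps) imaginary residue left by the GPU's parallel FFT
% before it feeds back into the next iteration.
Xhat = fft2(X);
% ... apply the Euler update in Fourier space here ...
X = ifft2(Xhat);
X = real(X);    % discard the spurious imaginary part

Without the real() call, the tiny imaginary component produced each step is re-transformed on the next iteration and can grow over time, which matches the "explosion" described above.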
Answers (0)