Mean of selected range of a matrix based on a range of values from another matrix
Hello everyone,
I have a MAT-file (attached) containing four variables: month, sa, ta, and sig. I want to get the mean and standard deviation of sa and ta for each month, restricted to a specific range of sig (say, sig between 27.4 and 27.5).
So the intended output is one mean and standard deviation of sa and ta per month over that sig range:
Thank you!
2 Comments
Shivam Gothi
on 10 Oct 2024
What I understand is that you want to find the mean and standard deviation of only those values of "ta" and "sa" for which "sig" is within the range 27.4 - 27.5, and that the "sig_range" differs from month to month.
Is my understanding of the question correct?
Accepted Answer
Voss
on 10 Oct 2024
load('data_my.mat')  % provides the variables month, sa, ta, sig
T = table(month,sa,ta,sig);
% keep only rows with sig in the range 27.4 - 27.5 (inclusive of both ends)
idx = sig >= 27.4 & sig <= 27.5;
G = groupsummary(T(idx,:),'month',{'mean','std'},{'sa','ta'})
% for reference, the same statistics over all sig values
G_all = groupsummary(T,'month',{'mean','std'},{'sa','ta'})
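For readers who prefer not to use tables, the same per-month statistics can be computed with a plain loop. This is a minimal sketch assuming month, sa, ta, and sig are equal-length column vectors loaded from the attached data_my.mat; months with no rows in the sig range will produce NaN entries.

load('data_my.mat')                       % month, sa, ta, sig as column vectors
idx = sig >= 27.4 & sig <= 27.5;          % rows with sig in the requested range
m = unique(month);                        % months present in the data
stats = zeros(numel(m), 4);               % columns: [mean_sa std_sa mean_ta std_ta]
for k = 1:numel(m)
    rows = idx & month == m(k);           % in-range rows for this month
    stats(k,:) = [mean(sa(rows)) std(sa(rows)) mean(ta(rows)) std(ta(rows))];
end

groupsummary does the same grouping and aggregation in one call, so it is the more idiomatic choice; the loop is only meant to make the underlying logic explicit.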
4 Comments
More Answers (0)