# Confidence and prediction bands using lsqcurvefit

7 views (last 30 days)
Faizan Lali on 18 Mar 2023
Commented: Matt J on 19 Mar 2023
I am trying to fit the model, first with nlinfit and then with lsqcurvefit, but in both cases I get a non-significant estimate for one of the parameters: its confidence interval contains zero. My model has two constraints:
1. I cannot change the model or transform any of the data.
2. The parameters cannot be less than zero.
To enforce the 2nd point I used lsqcurvefit with bound limits on the parameters:
lb = [0.001,0.1];
ub = [1,1.5];
but unfortunately the 2nd parameter never moves away from its lower bound. In other words, lsqcurvefit always returns beta2 equal to the corresponding lower-bound value.
I also want to plot confidence and prediction bands for the estimates. I did this with nlinfit, but I am not sure how to do it with lsqcurvefit.
Here is my code; I don't know where I am making a mistake. The data file is attached.
x1=data(:,2);
x2=data(:,3);
yobs=data(:,4);
xm=[x1 x2];
%% Initial parameter guesses
% C1=0.32;
% C2=0.27;
C1=1.31;
C2=2.1585;
beta0(1)=C1; %initial guess yo
beta0(2)=C2; %initial guess kr
p=length(beta0); %p = # parameters
%% define the function that will be used for the forward problem
fnameFOR = @Project_func;
ypred_b=fnameFOR(beta0,x1,x2);
%% lsqcurvefit returns parameters, residuals, and the Jacobian (sensitivity coefficient matrix),
% but how do I find the covariance matrix and the mean square error??
fnameINV=@Project_funcINV;
lb = [0.001,0.1];
ub = [1,1.5];
[beta,resnorm,residual,exitflag,output,lambda,jacobian] = lsqcurvefit(fnameINV,beta0,xm,yobs,lb,ub);
beta
% Confidence intervals using nlparci
conf = nlparci(beta,residual,'jacobian',full(jacobian)); %lsqcurvefit returns a sparse Jacobian
% Confidence interval calculated manually
lastwarn(''); %Clear warning memory
alpha = 0.05; %significance level
df = length(residual) - numel(beta); %degrees of freedom
crit = tinv(1-alpha/2,df); %critical value
covm = inv(jacobian'*jacobian) * (resnorm/df); %covariance matrix; resnorm/df is the mean square error (SSE/df, not var(residual))
[~, warnId] = lastwarn; %detect if inv threw 'nearly singular' warning.
covmIdx = sub2ind(size(covm),1:size(covm,1),1:size(covm,2)); %indices of the diag of covm
CI = nan(numel(beta),2);
if ~strcmp(warnId, 'MATLAB:nearlySingularMatrix')
CI(:,1) = beta - crit * sqrt(covm(covmIdx));
CI(:,2) = beta + crit * sqrt(covm(covmIdx));
end
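The manual calculation above can be written more compactly. This is a sketch of the standard linearized-Gaussian approximation, and it assumes the fit is effectively unconstrained at the solution (no active bounds); `jacobian`, `residual`, `resnorm`, and `beta` are the lsqcurvefit outputs from the code above.

```matlab
% Sketch: standard-error confidence intervals from lsqcurvefit outputs,
% assuming no bound is active and the linearized approximation holds.
J    = full(jacobian);            % lsqcurvefit returns a sparse Jacobian
n    = numel(residual);
p    = numel(beta);
mse  = resnorm/(n - p);           % mean square error = SSE/df
covB = mse * inv(J'*J);           % approximate parameter covariance
se   = sqrt(diag(covB));          % parameter standard errors
crit = tinv(1 - 0.05/2, n - p);   % two-sided 95% critical value
CI   = beta(:) + crit*se.*[-1 1]  % each row: [lower upper]
```

This reproduces what nlparci does internally, so it is mainly useful as a cross-check of the nlparci call above.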
%% Confidence and Prediction bands
% How to calculate CBs and PBs ?? I have included the code below which I am using
% in nlinfit estimation.
%% Confidence and prediction intervals for the dependent variable using nlinfit (This part is just for reference)
%nonlinear regression confidence intervals-- 'on' means simultaneous
%bounds; 'off' is for nonsimultaneous bounds; use 'curve' for the
%regression line and 'observation' for the prediction interval
%(resids and J here are the residuals and Jacobian returned by nlinfit)
[ypred, delta] = nlpredci(fnameINV,xm,beta,resids,J,0.05,'on','curve'); %confidence band for regression line
[ypred, deltaob] = nlpredci(fnameINV,xm,beta,resids,J,0.05,'on','observation'); %prediction band for individual points
yspan = range(ypred) % total span of ypred
relrmse = rmse/yspan % ratio of rmse vs. yspan (rmse computed from the nlinfit residuals)
%simultaneous confidence bands for regression line
CBu=ypred+delta;
CBl=ypred-delta;
%simultaneous prediction bands for new observations
PBu=ypred+deltaob;
PBl=ypred-deltaob;
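nlpredci does not actually require nlinfit: it only needs the model function, the fitted coefficients, the residuals, and the Jacobian, all of which lsqcurvefit provides. A sketch using the lsqcurvefit outputs from earlier (with the caveat, discussed in the accepted answer, that these linearized bands are unreliable when a parameter sits on a bound):

```matlab
% Sketch: confidence and prediction bands from lsqcurvefit outputs.
% Unreliable when a parameter is stuck at a bound (see accepted answer).
J = full(jacobian);   % nlpredci expects a full matrix
[ypred, delta] = nlpredci(fnameINV,xm,beta,residual,'Jacobian',J, ...
    'Alpha',0.05,'SimOpt','on','PredOpt','curve');        % confidence band
[~, deltaob]   = nlpredci(fnameINV,xm,beta,residual,'Jacobian',J, ...
    'Alpha',0.05,'SimOpt','on','PredOpt','observation');  % prediction band
% Plot against the first predictor, sorted so the lines draw cleanly
[xs,k] = sort(x1);
plot(xs,ypred(k),'b-', xs,ypred(k)+delta(k),'r--', xs,ypred(k)-delta(k),'r--', ...
     xs,ypred(k)+deltaob(k),'g:', xs,ypred(k)-deltaob(k),'g:', x1,yobs,'ko');
legend('fit','CB upper','CB lower','PB upper','PB lower','data');
```

The name-value form ('Jacobian', 'Alpha', 'SimOpt', 'PredOpt') is equivalent to the positional syntax used in the nlinfit reference code above.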
%% Functions
% Forward Function
function y=Project_func(beta0,x1,x2)
%tm=[x1 x2];
c2s=@(x)-2.40874-(39.748*(1+x).^-2.856);
y=100./(1+exp((-beta0(1).*2.*c2s(x1))+(beta0(2).*c2s(x1).*log10(100.*x2))));
end
% Inverse Function
function y=Project_funcINV(beta0,xm)
c2s=@(x)-2.40874-39.748*(1+x).^-2.856;
y=100./(1+exp(-beta0(1).*2.*c2s(xm(:,1))+(beta0(2).*c2s(xm(:,1)).*log10(100.*xm(:,2)))));
end

### Accepted Answer

Matt J on 18 Mar 2023
Edited: Matt J on 18 Mar 2023
Covariance methods are not valid when you have parameter bound constraints. Since it's not a time-consuming fit, I would just use some sort of Monte Carlo resampling to determine the variability of the beta estimates:
x1=data(:,2);
x2=data(:,3);
yobs=data(:,4);
xm=[x1 x2];
%% Initial parameter guesses
% C1=0.32;
% C2=0.27;
C1=1.31;
C2=2.1585;
beta0(1)=C1; %initial guess yo
beta0(2)=C2; %initial guess kr
p=length(beta0); %p = # parameters
%% lsqcurvefit returns parameters, residuals, and the Jacobian (sensitivity coefficient matrix),
% but how do I find the covariance matrix and the mean square error??
fnameINV=@Project_funcINV;
lb = [0.001,0.1];
ub = [1,1.5];
[beta,resnorm,residual,exitflag] = lsqcurvefit(fnameINV,beta0,xm,yobs,lb,ub);
Local minimum possible. lsqcurvefit stopped because the final change in the sum of squares relative to its initial value is less than the value of the function tolerance.
sig=std(residual);
ypred=fnameINV(beta,xm);
BETAS=monteCarlo(fnameINV,beta,xm,ypred,lb,ub,sig); %Monte Carlo samples
std(BETAS)
ans = 1×2
0.0619 0.0820
% Inverse Function
function y=Project_funcINV(beta0,xm)
c2s=@(x)-2.40874-39.748*(1+x).^-2.856;
y=100./(1+exp(-beta0(1).*2.*c2s(xm(:,1))+(beta0(2).*c2s(xm(:,1)).*log10(100.*xm(:,2)))));
end
function BETAS=monteCarlo(fnameINV,beta,xm,ypred,lb,ub,sig)
N=100; %number of random samples
BETAS=nan(N,numel(beta));
opts=optimoptions('lsqcurvefit','Display','none');
for i=1:N
yobs=ypred+sig*randn(size(ypred));
BETAS(i,:) = lsqcurvefit(fnameINV,beta,xm,yobs,lb,ub,opts);
end
end
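The BETAS samples from monteCarlo can also answer the original question about bands: take percentiles of the sampled parameters for confidence intervals, and evaluate the model at each sampled beta to get bands for the curve. A sketch, assuming `BETAS`, `fnameINV`, `xm`, and `sig` from the answer's script are in scope:

```matlab
% Sketch: percentile intervals and bands from the Monte Carlo samples (95%).
betaCI = prctile(BETAS,[2.5 97.5])           % 2x2: one column per parameter
% Evaluate the model at every sampled beta to band the fitted curve
YP = nan(size(xm,1),size(BETAS,1));
for i = 1:size(BETAS,1)
    YP(:,i) = fnameINV(BETAS(i,:),xm);
end
CB = prctile(YP,[2.5 97.5],2);               % confidence band for the curve
PB = prctile(YP + sig*randn(size(YP)),[2.5 97.5],2); % crude prediction band
```

Unlike the covariance-based intervals, these remain meaningful when a parameter sits on a bound, which is the point of the resampling approach.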
##### 12 Comments (10 older comments hidden)
Faizan Lali on 19 Mar 2023
No. What do you mean by ground truth?
Matt J on 19 Mar 2023
Calculating rmse and relerror requires that you know the true value of beta.
