Negative D2 score on training data after lassoglm fit

13 views (last 30 days)
T0m07 on 17 Nov 2024
Answered: Jaimin on 25 Nov 2024 at 6:44
How can the deviance from a null model (i.e. betas all equal zero) be lower than the deviance from the full model? Surely lassoglm should choose betas all zero in this case?
From the code below, my d2Train is -0.0808.
[B, FitInfo] = lassoglm(table2array(indat.params.trainDataX), indat.params.trainDataY(:, minInd), 'poisson', 'Lambda', indat.combTable.bestLambdas(minInd), 'Alpha', indat.combTable.bestAlphas(minInd));
predCountsTrain = calculateRates(table2array(indat.params.trainDataX),B,FitInfo.Intercept)+eps;
predDevianceTrain = calculateDeviance(indat.params.trainDataY(:, minInd),predCountsTrain);
nullCountsTrain = calculateRates(table2array(indat.params.trainDataX),zeros(size(B)),FitInfo.Intercept)+eps;
nullDevianceTrain = calculateDeviance(indat.params.trainDataY(:, minInd),nullCountsTrain);
d2Train = 1 - (predDevianceTrain ./ nullDevianceTrain);
function rates = calculateRates(x,y,int)
rates = exp((x * y) + int);
end
function dev = calculateDeviance(observed,predicted)
% Poisson deviance: 2*sum(o.*log(o./p) - (o - p)); nansum drops the
% NaN terms that arise when observed == 0 (whose limiting value is 0).
scaledLogRatio = log(observed./predicted).*observed;
rawDifference = observed-predicted;
diffOfTerms = scaledLogRatio - rawDifference;
dev = nansum(diffOfTerms)*2;
end

Answers (1)

Jaimin on 25 Nov 2024 at 6:44
A negative D² value indicates that the full model's deviance is unexpectedly higher than the null model's, which should not normally happen on training data. Kindly refer to the quick checks and fixes below:
Verify Calculations: Ensure calculateRates and calculateDeviance are correctly implemented. Add a small constant (eps) to avoid division by zero or taking the log of zero.
function rates = calculateRates(x, y, int)
rates = exp((x * y) + int);
end
function dev = calculateDeviance(observed, predicted)
% Avoid division by zero or log of zero by adding a small constant
observed = observed + eps;
predicted = predicted + eps;
% Calculate scaled log ratio and raw difference
scaledLogRatio = log(observed ./ predicted) .* observed;
rawDifference = observed - predicted;
% Deviance calculation
diffOfTerms = scaledLogRatio - rawDifference;
dev = nansum(diffOfTerms) * 2;
end
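One quick sanity check (a sketch, assuming the variables from the question are in scope; the names x and y below are illustrative) is to compare the hand-computed deviance against the deviance lassoglm itself reports, and to refit the null intercept as the log of the mean response instead of reusing the full model's intercept, which was optimized jointly with the nonzero betas:

```matlab
% Sketch: sanity-check the deviance calculation against lassoglm's own
% reported value (assumes B, FitInfo, and the training data from the
% question are in scope).
x = table2array(indat.params.trainDataX);
y = indat.params.trainDataY(:, minInd);

% FitInfo.Deviance is the deviance lassoglm reports for the fit; it
% should closely match the hand-computed value.
disp(FitInfo.Deviance);
disp(calculateDeviance(y, calculateRates(x, B, FitInfo.Intercept)));

% For the null model, refit the intercept as log(mean(y)) rather than
% reusing FitInfo.Intercept with zeroed betas.
nullIntercept = log(mean(y));
nullCounts = exp(nullIntercept) * ones(size(y));
nullDeviance = calculateDeviance(y, nullCounts);
```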
Model Overfitting: Check whether the model is overfitting or over-penalized. Adjust the Lambda and Alpha values passed to lassoglm (https://www.mathworks.com/help/stats/lassoglm.html).
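To see whether the penalty is driving the fit too hard, it can help to fit a whole Lambda path and inspect how the deviance varies (a sketch; x and y stand in for the training predictors and response):

```matlab
% Sketch: fit a path of Lambda values and inspect the deviance trend.
[Bpath, FitPath] = lassoglm(x, y, 'poisson', 'NumLambda', 25, 'Alpha', 0.5);

% FitPath.Lambda and FitPath.Deviance have one entry per Lambda;
% a strongly penalized fit (large Lambda) approaches the null model.
semilogx(FitPath.Lambda, FitPath.Deviance);
xlabel('Lambda'); ylabel('Deviance');
```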
Data Issues: Inspect the data for anomalies or outliers that might affect predictions.
Poisson Assumptions: Ensure the Poisson model assumptions hold (mean ≈ variance). If not, consider alternatives like the negative binomial model.
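A quick way to screen for overdispersion is to compare the sample mean and variance of the counts (a sketch, not a formal test; y stands in for the response column):

```matlab
% Sketch: for a Poisson response the mean and variance should be roughly
% equal; a variance much larger than the mean suggests overdispersion,
% in which case a negative binomial model may fit better.
dispersionRatio = var(y) / mean(y);
if dispersionRatio > 2   % illustrative threshold, not a formal cutoff
    warning('Counts look overdispersed (var/mean = %.2f).', dispersionRatio);
end
```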
Cross-validation: Use cross-validation to validate model performance and prevent overfitting.
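lassoglm can perform the cross-validation itself via the 'CV' name-value pair; a sketch of selecting Lambda at the minimum cross-validated deviance (x and y are illustrative stand-ins for the training data):

```matlab
% Sketch: 10-fold cross-validation over a Lambda path.
[Bcv, FitCV] = lassoglm(x, y, 'poisson', 'NumLambda', 25, 'CV', 10);

% Index of the Lambda with minimum cross-validated deviance, plus the
% more conservative one-standard-error choice.
idxMin = FitCV.IndexMinDeviance;
idx1SE = FitCV.Index1SE;
bestB = Bcv(:, idxMin);
bestIntercept = FitCV.Intercept(idxMin);
```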
By addressing these areas, you should improve model performance and resolve the deviance issue.
I hope this will be helpful.
