a LogitBoost implementation
Code for the so-called AOSO-LogitBoost, an up-to-date (and probably state-of-the-art) implementation of Friedman's LogitBoost for multi-class classification.
Once you decide that LogitBoost is suitable for your classification problem, try this AOSO-LogitBoost, which typically achieves lower classification error and faster convergence than the original LogitBoost.
The code is written in C++ and wrapped in a MATLAB class with an easy interface. The following is an example.
%-------------------------------------------------------------
%% prepare train/test data.
% 3-class classification. Features are 2 dimensional.
% 6 training examples and 3 testing examples.
Xtr = [...
0.1, 0.2;
0.2, 0.3;
0.6, 0.3;
0.7, 0.2;
0.1, 0.4;
0.2, 0.6...
];
Xtr = Xtr';
Xtr = single(Xtr);
% Xtr should be 2X6, single
Ytr = [...
0.0;
0.0;
1.0;
1.0;
2.0;
2.0;
];
Ytr = Ytr';
Ytr = single(Ytr);
% Ytr should be 1X6, single
% K = 3 classes(0,1,2)
Xte = [...
0.1, 0.2;
0.6, 0.3;
0.2, 0.6...
];
Xte = Xte';
Xte = single(Xte);
Yte = [...
0;
1;
2;
];
Yte = Yte';
Yte = single(Yte);
%% parameters
T = 2; % #iterations
v = 0.1; % shrinkage factor
J = 4; % #terminal nodes
nodesize = 1; % node size. 1 is suggested
catmask = uint8([0,0]); % one flag per feature (2 features here); all features are NOT categorical
% Currently only numerical data are supported.
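% (General boosting guidance, not specific to this package: a smaller
% shrinkage v usually calls for a larger T, and J controls the size,
% i.e. the complexity, of each tree.)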
%% train
hboost = AOSOLogitBoost(); % handle
hboost = train(hboost,...
Xtr,Ytr,...
'T', T,...
'v', v,...
'J',J,...
'node_size',nodesize,...
'var_cat_mask',catmask);
%% predict
F = predict(hboost, Xte);
% The output F is a #classes X #test-examples matrix.
% F(k,j) denotes the confidence to predict the k-th class for the j-th test example.
% Just pick the maximum component of F(:,j) as your prediction for the j-th test example.
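% If class probabilities are preferred over raw scores, a softmax over
% each column of F is the usual multi-class LogitBoost link. This is a
% sketch under the assumption that F is on the additive-logistic scale;
% the package itself does not document this:
P = bsxfun(@rdivide, exp(F), sum(exp(F), 1)); % #classes X #test-examples, columns sum to 1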
%% error and error rate
[~,yy] = max(F);
yy = yy - 1; % convert 1-based index to 0-based class label
err_rate = sum(yy~=Yte)/length(Yte)
%-------------------------------------------------------------
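As a quick sanity check, the same train/predict interface shown above can be reused on the training set to obtain the training error. This is a minimal sketch that relies only on the variables and calls from the example; nothing new is assumed about the package.
%-------------------------------------------------------------
Ftr = predict(hboost, Xtr); % #classes X #train-examples
[~, yytr] = max(Ftr);
yytr = yytr - 1; % back to 0-based class labels
train_err_rate = sum(yytr~=Ytr)/length(Ytr)
%-------------------------------------------------------------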
Those who are interested in the algorithm's details are referred to the paper:
Peng Sun, Mark D. Reid, Jie Zhou. "AOSO-LogitBoost: Adaptive One-Vs-One LogitBoost for Multi-Class Problems", International Conference on Machine Learning (ICML 2012).
Cite as
sun peng (2025). a LogitBoost implementation (https://it.mathworks.com/matlabcentral/fileexchange/38653-a-logitboost-implementation), MATLAB Central File Exchange. Retrieved .
| Version | Published | Release Notes |
|---|---|---|
| 1.5.0.0 |  | A bug that might arise on the x64 platform is fixed. |
| 1.3.0.0 |  | Modification to Title/Summary/Description text to make them clear. |
| 1.2.0.0 |  | Minor modification to description text. |
| 1.1.0.0 |  | Minor modification to description text. |
| 1.0.0.0 |  |  |
