CompactClassificationXGBoost

Compact classification XGBoost model

Since R2026a

    Description

    Pretrained XGBoost model for binary or multiclass classification. Use the compact classification XGBoost model for making predictions (classifications) of new data.

    Creation

    Create a CompactClassificationXGBoost object by importing a pretrained binary or multiclass classification XGBoost model using importModelFromXGBoost.

    Properties


    ClassNames

    This property is read-only.

    List of the elements in Y with duplicates removed, returned as a logical vector, numeric vector, or cell array of character vectors. ClassNames has the same data type as the data in the argument Y.

    Data Types: double | logical | cell

    Cost

    This property is read-only.

    Misclassification costs, returned as a square numeric matrix. Cost has K rows and columns, where K is the number of classes.

    Cost(i,j) is the cost of classifying a point into class j if its true class is i. For XGBoost models, Cost(i,j) is 1 if i ~= j and 0 if i = j, so all misclassifications have equal cost. The order of the rows and columns of Cost corresponds to the order of the classes in ClassNames.

    Data Types: double
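    As a minimal sketch of this structure (not code from this page), a K-by-K zero-one misclassification cost matrix with zeros on the diagonal and ones elsewhere can be built for a hypothetical K = 3 problem as:

    ```matlab
    % Zero-one misclassification cost matrix: correct classifications
    % (the diagonal) cost 0, every misclassification costs 1.
    K = 3;
    Cost = ones(K) - eye(K)
    % Cost =
    %      0     1     1
    %      1     0     1
    %      1     1     0
    ```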

    ExpandedPredictorNames

    This property is read-only.

    Expanded predictor names, returned as a cell array of character vectors.

    For XGBoost models, ExpandedPredictorNames is the same as PredictorNames.

    Data Types: cell

    ImportedModelParameters

    This property is read-only.

    Parameters of the imported model, returned as a structure with seven fields. The following table lists the ImportedModelParameters fields and their values.

    Field Name           Value
    BaseScore            Initial prediction score of all samples
    Objective            Learning objective
    Booster              Type of weak learner used in gradient boosting; only 'Tree' is supported
    NumBoostingRounds    Number of boosting rounds, where each round involves fitting a new decision tree to the residuals
    HasParallelTrees     1 if the imported model has parallel trees, 0 otherwise
    IsBinary             1 if the imported model has a binary objective, 0 otherwise
    NumClasses           Number of classes in the imported model

    NumClasses

    This property is read-only.

    Number of classes in the imported model, returned as a positive integer.

    Data Types: double

    NumTrained

    This property is read-only.

    Number of trained weak learners in the ensemble, returned as a positive integer.

    Data Types: double

    PredictorNames

    This property is read-only.

    Predictor names, specified as a cell array of character vectors. The order of the entries in PredictorNames is the same as in the training data.

    Data Types: cell

    Prior

    This property is read-only.

    Prior probabilities for each class, returned as a K-element numeric vector, where K is the number of unique classes in the response. The order of the elements of Prior corresponds to the order of the classes in ClassNames. For XGBoost models, each element of Prior is 1/K.

    Data Types: double

    ResponseName

    This property is read-only.

    Name of the response variable, returned as 'Y'.

    Data Types: char

    ScoreTransform

    This property is read-only.

    Function for transforming scores, specified as "logit" for binary classification or "softmax" for multiclass classification.

    Data Types: char | string
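    These two transforms correspond to the standard sigmoid and softmax functions. A minimal sketch (the raw scores below are made-up values, not output from any model on this page):

    ```matlab
    % "logit" transform for binary classification: the sigmoid function
    % maps a raw ensemble score to the probability of the positive class.
    rawScore  = 0.8;                      % hypothetical raw score
    pPositive = 1./(1 + exp(-rawScore));

    % "softmax" transform for multiclass classification: normalized
    % exponentials of the per-class raw scores, which sum to 1.
    rawScores = [0.5 -1.2 2.0];                   % hypothetical raw scores
    expScores = exp(rawScores - max(rawScores));  % subtract max for stability
    pClasses  = expScores ./ sum(expScores);
    ```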

    Trained

    This property is read-only.

    Trained weak learners, returned as a cell array. The entries of the cell array contain the corresponding compact models as CompactRegressionTree objects.

    Data Types: cell

    TrainedWeights

    This property is read-only.

    Trained weak learner weights, returned as a numeric vector. TrainedWeights has NumTrained elements, where NumTrained is the number of weak learners in the ensemble. The ensemble computes the predicted response by aggregating weighted predictions from its learners. For XGBoost models, TrainedWeights is a vector of ones, signifying that each learner has equal weight.

    Data Types: double
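    As a sketch of the aggregation described above (the variable names and values are assumptions for illustration, not part of this page): with all weights equal to 1, the weighted aggregation reduces to a plain sum of the individual tree outputs.

    ```matlab
    % Hypothetical per-tree outputs and the all-ones TrainedWeights vector.
    treeScores = [-0.0744 0.0300 -0.0466];   % made-up outputs of 3 trees
    weights    = ones(1, numel(treeScores)); % TrainedWeights: all ones
    % Weighted aggregation; with unit weights this is just sum(treeScores).
    % How BaseScore enters the result depends on the learning objective.
    rawScore = sum(weights .* treeScores);
    ```

    The raw score would then be passed through the model's ScoreTransform to obtain class probabilities.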

    Object Functions

    compareHoldout          Compare accuracies of two classification models using new data
    edge                    Classification edge for XGBoost classification model
    gather                  Gather properties of Statistics and Machine Learning Toolbox object from GPU
    lime                    Local interpretable model-agnostic explanations (LIME)
    loss                    Classification error for XGBoost model
    margin                  Classification margins for XGBoost classification model
    partialDependence       Compute partial dependence
    plotPartialDependence   Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots
    predict                 Predict labels using classification XGBoost model
    predictorImportance     Estimates of predictor importance for XGBoost model
    shapley                 Shapley values

    Examples


    Import a pretrained XGBoost classification model trained using the ionosphere dataset. The pretrained model is provided with this example.

    The model was trained in Python and saved as a JSON file using model.save_model('trainedXGBoostModel.json').

    load ionosphere
    modelfile = "trainedXGBoostModel.json";
    Mdl = importModelFromXGBoost(modelfile)
    Mdl = 
      CompactClassificationXGBoost
                   ResponseName: 'Y'
                     ClassNames: [0 1]
                 ScoreTransform: 'logit'
                     NumTrained: 30
        ImportedModelParameters: [1×1 struct]
    
    
    
    

    The model is imported as a CompactClassificationXGBoost model object.

    Use dot notation to view the imported model parameters.

    Mdl.ImportedModelParameters
    ans = struct with fields:
                BaseScore: 0.6374
                Objective: 'binary:logistic'
                  Booster: 'Tree'
        NumBoostingRounds: 30
         HasParallelTrees: 1
                 IsBinary: 1
               NumClasses: 2
    
    

    The parameters indicate that this is a binary classification model trained using the 'Tree' booster with the 'binary:logistic' objective function.

    View one of the internal trees in the Mdl.Trained cell array.

    view(Mdl.Trained{5})
    Decision tree for regression
     1  if x5<0.04198 then node 2 elseif x5>=0.04198 then node 3 else 0.04198
     2  fit = -0.0744494
     3  if x27<1 then node 4 elseif x27>=1 then node 5 else 1
     4  if x3<0.4375 then node 6 elseif x3>=0.4375 then node 7 else 0.4375
     5  if x8<0.00197 then node 8 elseif x8>=0.00197 then node 9 else 0.00197
     6  if x20<0.05147 then node 10 elseif x20>=0.05147 then node 11 else 0.05147
     7  if x10<-0.35818 then node 12 elseif x10>=-0.35818 then node 13 else -0.35818
     8  if x16<0.00286 then node 14 elseif x16>=0.00286 then node 15 else 0.00286
     9  if x8<0.71596 then node 16 elseif x8>=0.71596 then node 17 else 0.71596
    10  if x23<0.29495 then node 18 elseif x23>=0.29495 then node 19 else 0.29495
    11  fit = -0.0396349
    12  fit = -0.00421697
    13  if x16<-0.57486 then node 20 elseif x16>=-0.57486 then node 21 else -0.57486
    14  fit = -0.0630237
    15  fit = -0.0258257
    16  fit = 0.029989
    17  fit = -0.0466195
    18  fit = 0.032412
    19  fit = -0.00369187
    20  fit = -0.00245641
    21  fit = 0.0459532
    

    Tips

    For CompactClassificationXGBoost, the Trained property of Mdl stores an Mdl.ImportedModelParameters.NumClasses-by-Mdl.ImportedModelParameters.NumBoostingRounds cell array of compact regression tree models. For a textual or graphical display of the tree in row x and column y of the cell array, enter view(Mdl.Trained{x,y}).

    Extended Capabilities


    Version History

    Introduced in R2026a