

edge

Classification edge


Syntax

E = edge(ens,tbl,ResponseVarName)
E = edge(ens,tbl,Y)
E = edge(ens,X,Y)
E = edge(___,Name,Value)


Description

E = edge(ens,tbl,ResponseVarName) returns the classification edge for ens with data tbl and classification tbl.ResponseVarName.

E = edge(ens,tbl,Y) returns the classification edge for ens with data tbl and classification Y.

E = edge(ens,X,Y) returns the classification edge for ens with data X and classification Y.

E = edge(___,Name,Value) computes the edge with additional options specified by one or more Name,Value pair arguments, using any of the previous syntaxes.


If the predictor data X or the predictor variables in tbl contain any missing values, the edge function can return NaN. For more details, see edge can return NaN for predictor data with missing values.

Input Arguments


ens — Classification ensemble

A classification ensemble constructed with fitcensemble, or a compact classification ensemble constructed with compact.


tbl — Sample data

Sample data, specified as a table. Each row of tbl corresponds to one observation, and each column corresponds to one predictor variable. tbl must contain all of the predictors used to train the model. Multicolumn variables and cell arrays other than cell arrays of character vectors are not allowed.

If you trained ens using sample data contained in a table, then the input data for this method must also be in a table.


ResponseVarName — Response variable name

Response variable name, specified as the name of a variable in tbl.

You must specify ResponseVarName as a character vector or string scalar. For example, if the response variable Y is stored as tbl.Y, then specify it as 'Y'. Otherwise, the software treats all columns of tbl, including Y, as predictors when training the model.
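As a sketch of the table workflow (the response variable name Response is our choice for illustration; the ionosphere data set ships with Statistics and Machine Learning Toolbox):

load ionosphere                       % X: predictors, Y: class labels
tbl = array2table(X);                 % store predictors as table variables
tbl.Response = Y;                     % add the labels as a table variable
ens = fitcensemble(tbl,'Response','Method','AdaBoostM1');
E = edge(ens,tbl,'Response')          % classification edge on tbl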


X — Predictor data

A matrix where each row represents an observation, and each column represents a predictor. The number of columns in X must equal the number of predictors in ens.

If you trained ens using sample data contained in a matrix, then the input data for this method must also be in a matrix.


Y — Class labels

Class labels of the observations in tbl or X. Y must have the same data type as the class labels used to train ens, and its number of elements must equal the number of rows of tbl or X.

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.


Learners — Indices of weak learners in the ensemble, ranging from 1 to ens.NumTrained. edge uses only these learners to calculate the edge.

Default: 1:NumTrained
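For instance, assuming an ensemble trained on the ionosphere data (as in the example below), you might compute the edge of only the first ten weak learners:

load ionosphere
ens = fitcensemble(X,Y,'Method','AdaBoostM1');   % 100 boosted trees by default
E10 = edge(ens,X,Y,'Learners',1:10)              % use weak learners 1 through 10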


Mode — Meaning of the output E:

  • 'ensemble' — E is a scalar value, the edge for the entire ensemble.

  • 'individual' — E is a vector with one element per trained learner.

  • 'cumulative' — E is a vector in which element J is obtained by using learners 1:J from the input list of learners.

Default: 'ensemble'
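A sketch comparing the three modes on the ionosphere data (the plot is one way to inspect how the edge evolves as learners accumulate):

load ionosphere
ens = fitcensemble(X,Y,'Method','AdaBoostM1');
Eens = edge(ens,X,Y,'Mode','ensemble')     % scalar: edge of the whole ensemble
Eind = edge(ens,X,Y,'Mode','individual');  % vector: one edge per weak learner
Ecum = edge(ens,X,Y,'Mode','cumulative');  % vector: element J uses learners 1:J
plot(Ecum)                                 % edge as learners accumulate
xlabel('Number of weak learners')
ylabel('Cumulative edge')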


UseObsForLearner — A logical matrix of size N-by-T, where:

  • N is the number of rows of X.

  • T is the number of weak learners in ens.

When UseObsForLearner(i,j) is true, learner j is used in predicting the class of row i of X.

Default: true(N,T)
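For illustration, a sketch in which each learner is applied to a random half of the observations (the random mask is our choice; any logical N-by-T matrix works):

load ionosphere
ens = fitcensemble(X,Y,'Method','AdaBoostM1');
N = size(X,1);                             % number of observations
T = ens.NumTrained;                        % number of weak learners
useObs = rand(N,T) > 0.5;                  % each learner sees a random half
E = edge(ens,X,Y,'UseObsForLearner',useObs)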


UseParallel — Indication to perform inference in parallel, specified as false (compute serially) or true (compute in parallel). Parallel computation requires Parallel Computing Toolbox™. Parallel inference can be faster than serial inference, especially for large data sets. Parallel computation is supported only for tree learners.

Default: false
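A minimal sketch, assuming a parallel pool is available and the ensemble uses tree learners:

load ionosphere
ens = fitcensemble(X,Y,'Method','AdaBoostM1');   % tree weak learners
E = edge(ens,X,Y,'UseParallel',true)             % requires Parallel Computing Toolbox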


Weights — Observation weights, specified as a numeric vector of length size(X,1). If you supply weights, edge computes the weighted classification edge.

Default: ones(size(X,1),1)
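For example, a sketch that counts one class twice as heavily (the ionosphere labels are 'g' and 'b'; the weighting scheme here is arbitrary):

load ionosphere
ens = fitcensemble(X,Y,'Method','AdaBoostM1');
w = ones(size(X,1),1);
w(strcmp(Y,'g')) = 2;               % weight class 'g' observations twice
Ew = edge(ens,X,Y,'Weights',w)      % weighted classification edge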

Output Arguments


E — Classification edge, returned as a scalar or vector depending on the setting of the Mode name-value argument. The classification edge is the weighted average of the classification margins.
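The relationship to the margin function can be checked directly; with the default uniform weights, the edge reduces to the plain mean of the margins:

load ionosphere
ens = fitcensemble(X,Y,'Method','AdaBoostM1');
m = margin(ens,X,Y);                        % per-observation classification margins
E = edge(ens,X,Y);                          % scalar ensemble edge
agree = abs(E - mean(m)) < 1e-10            % edge equals the mean margin here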


Examples

Find the classification edge for some of the data used to train a boosted ensemble classifier.

Load the ionosphere data set.

load ionosphere

Train an ensemble of 100 boosted classification trees using AdaBoostM1.

t = templateTree('MaxNumSplits',1); % Weak learner template tree object
ens = fitcensemble(X,Y,'Method','AdaBoostM1','Learners',t);

Find the classification edge for the last few rows.

E = edge(ens,X(end-10:end,:),Y(end-10:end))
E = 8.3310

More About


Extended Capabilities

Version History


See Also