functionLayer

Function layer

Since R2021b

    Description

    A function layer applies a specified function to the layer input.

    If Deep Learning Toolbox™ does not provide the layer that you need for your task, then you can define new layers by creating function layers using functionLayer. Function layers only support operations that do not require additional properties, learnable parameters, or states. For layers that require this functionality, define the layer as a custom layer. For more information, see Define Custom Deep Learning Layers.

    Creation

    Description

    layer = functionLayer(fun) creates a function layer and sets the PredictFcn property.

    layer = functionLayer(fun,Name=Value) sets optional properties using one or more name-value arguments. For example, functionLayer(fun,NumInputs=2,NumOutputs=3) specifies that the layer has two inputs and three outputs. You can specify multiple name-value arguments.

    Properties

    Function

    PredictFcn

    This property is read-only.

    Function to apply to layer input, specified as a function handle.

    The specified function must have the syntax [Y1,...,YM] = fun(X1,...,XN), where the inputs and outputs are dlarray objects, and M and N correspond to the NumOutputs and NumInputs properties, respectively.

    The inputs X1, …, XN correspond to the layer inputs with names given by InputNames. The outputs Y1, …, YM correspond to the layer outputs with names given by OutputNames.

    If the specified function is not accessible when you create the layer, you must specify the NumInputs and NumOutputs properties.
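    For example, a minimal sketch of a two-input function layer that adds its inputs elementwise (the layer name "add" is illustrative):

    % The anonymous function declares two inputs, so NumInputs is inferred
    % as 2 and the input names default to {'in1','in2'}.
    layer = functionLayer(@(X1,X2) X1 + X2,Name="add");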

    Since R2024a, the inputs and outputs of the predict function can be complex-valued. If the layer outputs complex-valued data, then when you use the layer in a neural network, you must ensure that the subsequent layers or loss function support complex-valued input.

    Before R2024a: The inputs and outputs of the predict function must not be complex. If the predict function of the layer involves complex numbers, convert all outputs to real values before returning them.
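    For example, a minimal sketch of a layer whose computation produces complex intermediate values but returns a real-valued output (the choice of fft is illustrative):

    % fft produces complex values; abs converts the output back to real values.
    layer = functionLayer(@(X) abs(fft(X)));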

    For a list of functions that support dlarray input, see List of Functions with dlarray Support.

    Tip

    When using the layer, you must ensure that the specified function is accessible. For example, to ensure that the layer can be reused in multiple live scripts, save the function in its own separate file.

    Data Types: function_handle

    Formattable

    This property is read-only.

    Flag indicating whether the layer function operates on formatted dlarray objects, specified as 0 (false) or 1 (true).

    Data Types: logical

    Acceleratable

    This property is read-only.

    Flag indicating whether the layer function supports acceleration using dlaccelerate, specified as 0 (false) or 1 (true).

    Tip

    Setting Acceleratable to 1 (true) can significantly improve the performance of training and inference (prediction) using a dlnetwork.

    Most simple functions support acceleration using dlaccelerate. For more information, see Deep Learning Function Acceleration for Custom Training Loops.

    Data Types: logical

    Layer

    Name

    Layer name, specified as a character vector or a string scalar. For Layer array input, the trainnet and dlnetwork functions automatically assign names to layers with the name "".

    The FunctionLayer object stores this property as a character vector.

    Data Types: char | string

    Description

    This property is read-only.

    One-line description of the layer, specified as a string scalar or a character vector. This description appears when the layer is displayed in a Layer array.

    If you do not specify a layer description, then the software displays the layer operation.

    Data Types: char | string

    NumInputs

    This property is read-only.

    Number of inputs, specified as a positive integer.

    The layer must have a fixed number of inputs. If PredictFcn supports a variable number of input arguments using varargin, then you must specify the number of layer inputs using NumInputs.

    If you do not specify NumInputs, then the software sets NumInputs to nargin(PredictFcn).
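    For example, a minimal sketch of a layer whose function uses varargin, so NumInputs must be set explicitly (the value 2 is illustrative):

    % Concatenate a variable number of inputs along the first dimension.
    % Because the function uses varargin, specify NumInputs explicitly.
    layer = functionLayer(@(varargin) cat(1,varargin{:}),NumInputs=2);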

    Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64

    InputNames

    This property is read-only.

    Input names of the layer, specified as a string array or a cell array of character vectors.

    If you do not specify InputNames and NumInputs is 1, then the software sets InputNames to {'in'}. If you do not specify InputNames and NumInputs is greater than 1, then the software sets InputNames to {'in1',...,'inN'}, where N is the number of inputs.

    Data Types: string | cell

    NumOutputs

    This property is read-only.

    Number of outputs of the layer, specified as a positive integer.

    The layer must have a fixed number of outputs. If PredictFcn supports a variable number of output arguments, then you must specify the number of layer outputs using NumOutputs.

    If you do not specify NumOutputs, then the software sets NumOutputs to nargout(PredictFcn).

    Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64

    OutputNames

    This property is read-only.

    Output names of the layer, specified as a string array or a cell array of character vectors.

    If you do not specify OutputNames and NumOutputs is 1, then the software sets OutputNames to {'out'}. If you do not specify OutputNames and NumOutputs is greater than 1, then the software sets OutputNames to {'out1',...,'outM'}, where M is the number of outputs.

    Data Types: string | cell
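    For example, a minimal sketch of a layer with two named outputs (the function and the names "mean" and "centered" are illustrative). Because the anonymous function does not declare a fixed number of outputs, NumOutputs is set explicitly:

    % Return the mean over the first dimension and the mean-centered input
    % as two separate layer outputs.
    layer = functionLayer(@(X) deal(mean(X,1),X - mean(X,1)), ...
        NumOutputs=2,OutputNames=["mean" "centered"]);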

    Examples

    Create a function layer object that applies the softsign operation to the input. The softsign operation is given by the function f(x) = x/(1 + |x|).

    layer = functionLayer(@(X) X./(1 + abs(X)))
    layer = 
      FunctionLayer with properties:
    
                 Name: ''
           PredictFcn: @(X)X./(1+abs(X))
          Formattable: 0
        Acceleratable: 0
    
       Learnable Parameters
        No properties.
    
       State Parameters
        No properties.
    
    Use properties method to see a list of all properties.
    
    

    Include a softsign layer, specified as a function layer, in a layer array. Specify that the layer has the description "softsign".

    layers = [
        imageInputLayer([28 28 1])
        convolution2dLayer(5,20)
        functionLayer(@(X) X./(1 + abs(X)),Description="softsign")
        maxPooling2dLayer(2,Stride=2)
        fullyConnectedLayer(10)
        softmaxLayer]
    layers = 
      6x1 Layer array with layers:
    
         1   ''   Image Input       28x28x1 images with 'zerocenter' normalization
         2   ''   2-D Convolution   20 5x5 convolutions with stride [1  1] and padding [0  0  0  0]
         3   ''   Function          softsign
         4   ''   2-D Max Pooling   2x2 max pooling with stride [2  2] and padding [0  0  0  0]
         5   ''   Fully Connected   10 fully connected layer
         6   ''   Softmax           softmax
    

    Create a function layer that reformats input data with the format "CB" (channel, batch) to have the format "SBC" (spatial, batch, channel). To specify that the layer operates on formatted data, set the Formattable option to true. To specify that the layer function supports acceleration using dlaccelerate, set the Acceleratable option to true.

    layer = functionLayer(@(X) dlarray(X,"SBC"),Formattable=true,Acceleratable=true)
    layer = 
      FunctionLayer with properties:
    
                 Name: ''
           PredictFcn: @(X)dlarray(X,"SBC")
          Formattable: 1
        Acceleratable: 1
    
       Learnable Parameters
        No properties.
    
       State Parameters
        No properties.
    
    Use properties method to see a list of all properties.
    
    

    Include a function layer that reformats the input to have the format "SBC" in a layer array. Set the layer description to "channel to spatial".

    layers = [
        featureInputLayer(10)
        functionLayer(@(X) dlarray(X,"SBC"),Formattable=true,Acceleratable=true,Description="channel to spatial")
        convolution1dLayer(3,16)]
    layers = 
      3x1 Layer array with layers:
    
         1   ''   Feature Input     10 features
         2   ''   Function          channel to spatial
         3   ''   1-D Convolution   16 3 convolutions with stride 1 and padding [0  0]
    

    In this network, the 1-D convolution layer convolves over the "S" (spatial) dimension of its input data. This operation is equivalent to convolving over the "C" (channel) dimension of the network input data.

    Convert the layer array to a dlnetwork object and pass a random array of data with the format "CB".

    dlnet = dlnetwork(layers);
    
    X = rand(10,64);
    dlX = dlarray(X,"CB");
    
    dlY = forward(dlnet,dlX);

    View the size and format of the output data.

    size(dlY)
    ans = 1×3
    
         8    16    64
    
    
    dims(dlY)
    ans = 
    'SCB'
    

    This example shows how to import the layers from a pretrained Keras network, replace the unsupported layers with function layers, and assemble the layers into a network ready for prediction.

    Import Keras Network

    Import the layers from a Keras network model. The network in "digitsNet.h5" classifies images of digits.

    filename = "digitsNet.h5";
    layers = importKerasLayers(filename,ImportWeights=true)
    Warning: 'importKerasLayers' is not recommended and will be removed in a future release. To import TensorFlow-Keras models, save using the SavedModel format and use importNetworkFromTensorFlow function.
    
    Warning: Unable to import layer. Keras layer 'Activation' with the specified settings is not supported. The problem was: Activation type 'softsign' is not supported.
    
    Warning: Unable to import layer. Keras layer 'Activation' with the specified settings is not supported. The problem was: Activation type 'softsign' is not supported.
    
    Warning: Unable to import some Keras layers, because they are not supported by the Deep Learning Toolbox. They have been replaced by placeholder layers. To find these layers, call the function findPlaceholderLayers on the returned object.
    
    layers = 
      13x1 Layer array with layers:
    
         1   'ImageInputLayer'               Image Input             28x28x1 images
         2   'conv2d'                        2-D Convolution         8 3x3x1 convolutions with stride [1  1] and padding [0  0  0  0]
         3   'conv2d_softsign'               Activation              Placeholder for 'Activation' Keras layer
         4   'max_pooling2d'                 2-D Max Pooling         2x2 max pooling with stride [2  2] and padding [0  0  0  0]
         5   'conv2d_1'                      2-D Convolution         16 3x3x8 convolutions with stride [1  1] and padding [0  0  0  0]
         6   'conv2d_1_softsign'             Activation              Placeholder for 'Activation' Keras layer
         7   'max_pooling2d_1'               2-D Max Pooling         2x2 max pooling with stride [2  2] and padding [0  0  0  0]
         8   'flatten'                       Keras Flatten           Flatten activations into 1-D assuming C-style (row-major) order
         9   'dense'                         Fully Connected         100 fully connected layer
        10   'dense_relu'                    ReLU                    ReLU
        11   'dense_1'                       Fully Connected         10 fully connected layer
        12   'dense_1_softmax'               Softmax                 softmax
        13   'ClassificationLayer_dense_1'   Classification Output   crossentropyex
    

    The Keras network contains some layers that are not supported by Deep Learning Toolbox. The importKerasLayers function displays a warning and replaces the unsupported layers with placeholder layers.

    Replace Placeholder Layers

    To replace the placeholder layers, first identify the names of the layers to replace. Find the placeholder layers using the findPlaceholderLayers function.

    placeholderLayers = findPlaceholderLayers(layers)
    placeholderLayers = 
      2x1 PlaceholderLayer array with layers:
    
         1   'conv2d_softsign'     Activation   Placeholder for 'Activation' Keras layer
         2   'conv2d_1_softsign'   Activation   Placeholder for 'Activation' Keras layer
    

    Replace the placeholder layers with function layers that apply the softsign function, listed at the end of the example.

    Create a function layer that uses the softsign function, attached to this example as a supporting file. To access this function, open this example as a live script. Set the layer description to "softsign".

    layer = functionLayer(@softsign,Description="softsign");

    Replace the layers using the replaceLayer function. To use the replaceLayer function, first convert the layer array to a layer graph.

    lgraph = layerGraph(layers);
    lgraph = replaceLayer(lgraph,"conv2d_softsign",layer);
    lgraph = replaceLayer(lgraph,"conv2d_1_softsign",layer);

    Specify Class Names

    If the imported classification layer does not contain the classes, then you must specify the classes before prediction. If you do not specify the classes, then the software automatically sets the classes to 1, 2, ..., N, where N is the number of classes.

    Find the index of the classification layer by viewing the Layers property of the layer graph.

    lgraph.Layers
    ans = 
      13x1 Layer array with layers:
    
         1   'ImageInputLayer'               Image Input             28x28x1 images
         2   'conv2d'                        2-D Convolution         8 3x3x1 convolutions with stride [1  1] and padding [0  0  0  0]
         3   'layer'                         Function                softsign
         4   'max_pooling2d'                 2-D Max Pooling         2x2 max pooling with stride [2  2] and padding [0  0  0  0]
         5   'conv2d_1'                      2-D Convolution         16 3x3x8 convolutions with stride [1  1] and padding [0  0  0  0]
         6   'layer_1'                       Function                softsign
         7   'max_pooling2d_1'               2-D Max Pooling         2x2 max pooling with stride [2  2] and padding [0  0  0  0]
         8   'flatten'                       Keras Flatten           Flatten activations into 1-D assuming C-style (row-major) order
         9   'dense'                         Fully Connected         100 fully connected layer
        10   'dense_relu'                    ReLU                    ReLU
        11   'dense_1'                       Fully Connected         10 fully connected layer
        12   'dense_1_softmax'               Softmax                 softmax
        13   'ClassificationLayer_dense_1'   Classification Output   crossentropyex
    

    The classification layer has the name 'ClassificationLayer_dense_1'. View the classification layer and check the Classes property.

    cLayer = lgraph.Layers(end)
    cLayer = 
      ClassificationOutputLayer with properties:
    
                Name: 'ClassificationLayer_dense_1'
             Classes: 'auto'
        ClassWeights: 'none'
          OutputSize: 'auto'
    
       Hyperparameters
        LossFunction: 'crossentropyex'
    
    

    Because the Classes property of the layer is "auto", you must specify the classes manually. Set the classes to 0, 1, ..., 9, and then replace the imported classification layer with the new one.

    cLayer.Classes = string(0:9);
    lgraph = replaceLayer(lgraph,"ClassificationLayer_dense_1",cLayer);

    Assemble Network

    Assemble the layer graph using assembleNetwork. The function returns a DAGNetwork object that is ready to use for prediction.

    net = assembleNetwork(lgraph)
    net = 
      DAGNetwork with properties:
    
             Layers: [13x1 nnet.cnn.layer.Layer]
        Connections: [12x2 table]
         InputNames: {'ImageInputLayer'}
        OutputNames: {'ClassificationLayer_dense_1'}
    
    

    Test Network

    Make predictions with the network using a test data set.

    [XTest,YTest] = digitTest4DArrayData;
    YPred = classify(net,XTest);

    View the accuracy.

    mean(YPred == YTest)
    ans = 0.9900
    

    Visualize the predictions in a confusion matrix.

    confusionchart(YTest,YPred)
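    Supporting Function

    The softsign supporting function is not included in this page. A minimal sketch, consistent with the anonymous softsign function used earlier in this example, is:

    function Y = softsign(X)
    % softsign   Apply the softsign operation elementwise.
    Y = X ./ (1 + abs(X));
    end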


    Version History

    Introduced in R2021b
