eluLayer

Exponential linear unit (ELU) layer

Since R2019a

Description

An ELU activation layer performs the identity operation on positive inputs and an exponential nonlinearity on negative inputs.

The layer performs the following operation:

f(x) = x                  if x ≥ 0
f(x) = α(exp(x) - 1)      if x < 0

The default value of α is 1. Specify a value of α for the layer by setting the Alpha property.
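
As a minimal sketch of the operation above (plain MATLAB, independent of the layer object), the ELU function can be evaluated elementwise:

% Evaluate the ELU operation elementwise with the default alpha = 1.
alpha = 1;
x = [-3 -1 0 1 3];
y = x;
y(x < 0) = alpha*(exp(x(x < 0)) - 1)
% y ≈ [-0.9502 -0.6321 0 1.0000 3.0000]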

Creation

Description

layer = eluLayer creates an ELU layer.

layer = eluLayer(alpha) creates an ELU layer and specifies the Alpha property.

layer = eluLayer(___,'Name',Name) additionally sets the optional Name property using any of the previous syntaxes. For example, eluLayer('Name','elu1') creates an ELU layer with the name 'elu1'.
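
For example, a brief usage sketch combining both syntaxes (the value 0.5 and the name 'elu_half' are only illustrative choices):

layer = eluLayer(0.5,'Name','elu_half');   % ELU layer with Alpha = 0.5 named 'elu_half'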

Properties

ELU

Alpha

Nonlinearity parameter α, specified as a numeric scalar. The minimum value of the output of the ELU layer equals -α, and the slope at negative inputs approaching 0 is α.
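
As an illustrative check in plain MATLAB (using only the operation defined above, not the layer itself), the output is bounded below by -α:

alpha = 2;
x = -10:0.1:10;
y = x;
y(x < 0) = alpha*(exp(x(x < 0)) - 1);
min(y)   % approximately -2, that is, -alpha, for large negative inputs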

Layer

Name

Layer name, specified as a character vector or a string scalar. For Layer array input, the trainNetwork, assembleNetwork, layerGraph, and dlnetwork functions automatically assign names to layers with the name ''.

Data Types: char | string

NumInputs

This property is read-only.

Number of inputs of the layer. This layer accepts a single input only.

Data Types: double

InputNames

This property is read-only.

Input names of the layer. This layer accepts a single input only.

Data Types: cell

NumOutputs

This property is read-only.

Number of outputs of the layer. This layer has a single output only.

Data Types: double

OutputNames

This property is read-only.

Output names of the layer. This layer has a single output only.

Data Types: cell

Examples

Create ELU Layer

Create an exponential linear unit (ELU) layer with the name 'elu1' and a default value of 1 for the nonlinearity parameter Alpha.

layer = eluLayer('Name','elu1')
layer = 
  ELULayer with properties:

     Name: 'elu1'
    Alpha: 1

   Learnable Parameters
    No properties.

   State Parameters
    No properties.

  Show all properties

Include ELU Layer in Layer Array

Include an ELU layer in a Layer array.

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    batchNormalizationLayer
    eluLayer
    
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32)
    batchNormalizationLayer
    eluLayer
    
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
layers = 
  11x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   2-D Convolution         16 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   Batch Normalization     Batch normalization
     4   ''   ELU                     ELU with Alpha 1
     5   ''   2-D Max Pooling         2x2 max pooling with stride [2  2] and padding [0  0  0  0]
     6   ''   2-D Convolution         32 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     7   ''   Batch Normalization     Batch normalization
     8   ''   ELU                     ELU with Alpha 1
     9   ''   Fully Connected         10 fully connected layer
    10   ''   Softmax                 softmax
    11   ''   Classification Output   crossentropyex

References

[1] Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter. "Fast and accurate deep network learning by exponential linear units (ELUs)." arXiv preprint arXiv:1511.07289 (2015).

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Version History

Introduced in R2019a