
eluLayer

Exponential linear unit (ELU) layer

Description

An ELU activation layer performs the identity operation on positive inputs and an exponential nonlinearity on negative inputs [1].

The layer performs the following operation:

f(x) = \begin{cases} x, & x \ge 0 \\ \alpha\,(\exp(x) - 1), & x < 0 \end{cases}

The default value of α is 1. Specify a value of α for the layer by setting the Alpha property.
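
The piecewise operation can be checked directly at the command line. This is a minimal illustrative sketch of the formula above, not part of the eluLayer API; the variable names are arbitrary.

% Evaluate the ELU formula directly (illustrative sketch)
alpha = 1;                                  % default nonlinearity parameter
x = -3:3;                                   % sample inputs
f = x .* (x >= 0) + alpha * (exp(x) - 1) .* (x < 0)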

Creation

Syntax

layer = eluLayer
layer = eluLayer(alpha)
layer = eluLayer(___,'Name',Name)

Description

layer = eluLayer creates an ELU layer.

layer = eluLayer(alpha) creates an ELU layer and specifies the Alpha property.

layer = eluLayer(___,'Name',Name) additionally sets the optional Name property using any of the previous syntaxes. For example, eluLayer('Name','elu1') creates an ELU layer with the name 'elu1'.
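
For example, assuming you want a nonlinearity parameter of 2, the following call creates an ELU layer with Alpha set to 2 and the illustrative name 'elu_2':

layer = eluLayer(2,'Name','elu_2');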

Properties

ELU

Alpha — Nonlinearity parameter
1 (default) | numeric scalar

Nonlinearity parameter α, specified as a numeric scalar. The minimum value of the output of the ELU layer equals −α, and the slope at negative inputs approaching 0 is α.
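
Both facts can be verified numerically. This is an illustrative sketch of the formula, not a call into the layer itself:

% Numeric check (sketch): the output is bounded below by -alpha,
% and the slope just below zero approaches alpha.
alpha = 1;
elu = @(x) x.*(x >= 0) + alpha*(exp(x) - 1).*(x < 0);
elu(-100)                        % approximately -1 = -alpha
(elu(0) - elu(-1e-6)) / 1e-6     % approximately 1 = alpha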

Layer

Name — Layer name
'' (default) | character vector | string scalar

Layer name, specified as a character vector or a string scalar. To include a layer in a layer graph, you must specify a nonempty unique layer name. If you train a series network with the layer and Name is set to '', then the software automatically assigns a name to the layer at training time.

Data Types: char | string

NumInputs — Number of inputs
1

Number of inputs of the layer. This layer accepts a single input only.

Data Types: double

InputNames — Input names
{'in'}

Input names of the layer. This layer accepts a single input only.

Data Types: cell

NumOutputs — Number of outputs
1

Number of outputs of the layer. This layer has a single output only.

Data Types: double

OutputNames — Output names
{'out'}

Output names of the layer. This layer has a single output only.

Data Types: cell

Examples

Create ELU Layer

Create an exponential linear unit (ELU) layer with the name 'elu1' and a default value of 1 for the nonlinearity parameter Alpha.

layer = eluLayer('Name','elu1')
layer = 
  ELULayer with properties:

     Name: 'elu1'
    Alpha: 1

Include ELU Layer in Layer Array

Include an ELU layer in a Layer array.

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    batchNormalizationLayer
    eluLayer
    
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32)
    batchNormalizationLayer
    eluLayer
    
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
layers = 
  11x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution             16 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   Batch Normalization     Batch normalization
     4   ''   ELU                     ELU with Alpha 1
     5   ''   Max Pooling             2x2 max pooling with stride [2  2] and padding [0  0  0  0]
     6   ''   Convolution             32 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     7   ''   Batch Normalization     Batch normalization
     8   ''   ELU                     ELU with Alpha 1
     9   ''   Fully Connected         10 fully connected layer
    10   ''   Softmax                 softmax
    11   ''   Classification Output   crossentropyex
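
To see the array in use, the network above can be trained end to end. This sketch assumes the digit data set that ships with Deep Learning Toolbox; the training options shown are illustrative, not prescribed.

% Train the network above on the bundled digit data set (sketch)
[XTrain,YTrain] = digitTrain4DArrayData;
options = trainingOptions('sgdm','MaxEpochs',4,'Verbose',false);
net = trainNetwork(XTrain,YTrain,layers,options);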

References

[1] Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter. "Fast and accurate deep network learning by exponential linear units (ELUs)." arXiv preprint arXiv:1511.07289 (2015).

Introduced in R2019a