complexReluLayer

Complex rectified linear unit (ReLU) layer

Since R2025a

    Description

    A complex ReLU layer performs a threshold operation on the real and imaginary parts of the input independently, setting any value less than zero to zero.
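    The operation can be illustrated with a short NumPy sketch. This is an illustration of the math only, not the Deep Learning Toolbox implementation; the function name `complex_relu` is a placeholder.

```python
import numpy as np

def complex_relu(z):
    """Apply ReLU separately to the real and imaginary parts of z."""
    return np.maximum(z.real, 0) + 1j * np.maximum(z.imag, 0)

# Negative real or imaginary components are thresholded to zero.
z = np.array([1 + 2j, -1 + 2j, 1 - 2j, -1 - 2j])
print(complex_relu(z))  # [1.+2.j 0.+2.j 1.+0.j 0.+0.j]
```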

    Creation

    Description

    layer = complexReluLayer creates a complex ReLU layer.

    layer = complexReluLayer(Name=name) also sets the optional Name property using a name-value argument.

    Properties

    Name

    Layer name, specified as a character vector or a string scalar. For Layer array input, the trainnet and dlnetwork functions automatically assign names to unnamed layers.

    The ComplexReLULayer object stores this property as a character vector.

    Data Types: char | string

    NumInputs

    This property is read-only.

    Number of inputs to the layer, stored as 1. This layer accepts a single input only.

    Data Types: double

    InputNames

    This property is read-only.

    Input names, stored as {'in'}. This layer accepts a single input only.

    Data Types: cell

    NumOutputs

    This property is read-only.

    Number of outputs from the layer, stored as 1. This layer has a single output only.

    Data Types: double

    OutputNames

    This property is read-only.

    Output names, stored as {'out'}. This layer has a single output only.

    Data Types: cell

    Examples

    Create Complex ReLU Layer

    Create a complex ReLU layer.

    layer = complexReluLayer
    layer = 
      ComplexReLULayer with properties:
    
        Name: ''
    
       Learnable Parameters
        No properties.
    
       State Parameters
        No properties.
    
    

    Algorithms

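    Written out explicitly, the thresholding described above applies the standard ReLU to each part of a complex input z elementwise (notation reconstructed from the layer description, not taken from the product documentation):

```latex
\operatorname{CReLU}(z) = \max\bigl(0, \operatorname{Re}(z)\bigr) + i\,\max\bigl(0, \operatorname{Im}(z)\bigr)
```

    For real-valued input, the imaginary part is zero and the operation reduces to the ordinary ReLU.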
    Version History

    Introduced in R2025a