# crossChannelNormalizationLayer

Channel-wise local response normalization layer

## Description

A channel-wise local response (cross-channel) normalization layer normalizes each element of its input using the activations in a window of adjacent channels at the same spatial location (local response normalization).

## Creation

### Syntax

```
layer = crossChannelNormalizationLayer(windowChannelSize)
layer = crossChannelNormalizationLayer(windowChannelSize,Name,Value)
```

### Description

`layer = crossChannelNormalizationLayer(windowChannelSize)` creates a channel-wise local response normalization layer and sets the `WindowChannelSize` property.


`layer = crossChannelNormalizationLayer(windowChannelSize,Name,Value)` sets the optional `Alpha`, `Beta`, `K`, and `Name` properties using name-value pairs. For example, `crossChannelNormalizationLayer(5,'K',1)` creates a local response normalization layer for channel-wise normalization with a window size of 5 and a K hyperparameter of 1. You can specify multiple name-value pairs. Enclose each property name in single quotes.

## Properties


### Cross-Channel Normalization

**`WindowChannelSize`**

Size of the channel window, which controls the number of channels that are used for the normalization of each element, specified as a positive integer.

If `WindowChannelSize` is even, then the window is asymmetric. The software looks at the previous `floor((w-1)/2)` channels and the following `floor(w/2)` channels. For example, if `WindowChannelSize` is 4, then the layer normalizes each element by its neighbor in the previous channel and by its neighbors in the next two channels.
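The window-placement rule above can be sketched as follows. This is a Python illustration, not part of the MATLAB API; clipping the window at the channel boundaries is an assumption about edge handling.

```python
def window_bounds(channel, window_channel_size, num_channels):
    """Inclusive range of channel indices used to normalize `channel`.

    Implements the rule described above: floor((w-1)/2) previous channels
    and floor(w/2) following channels, clipped to valid channel indices
    (boundary clipping is an assumption, not documented behavior).
    """
    w = window_channel_size
    lo = max(0, channel - (w - 1) // 2)           # previous channels
    hi = min(num_channels - 1, channel + w // 2)  # following channels
    return lo, hi
```

For `window_channel_size = 4`, an interior channel `c` gets the inclusive range `(c - 1, c + 2)`, matching the asymmetric example above.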

Example: `5`

**`Alpha`**

α hyperparameter in the normalization (the multiplier term), specified as a numeric scalar.

Example: `0.0002`

**`Beta`**

β hyperparameter in the normalization (the exponent), specified as a numeric scalar. The value of `Beta` must be greater than or equal to 0.01.

Example: `0.8`

**`K`**

K hyperparameter in the normalization (the additive constant), specified as a numeric scalar. The value of `K` must be greater than or equal to 1e-05.

Example: `2.5`
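Putting the hyperparameters together: in the local response normalization scheme of Krizhevsky et al., each element is divided by `(K + Alpha*ss/windowChannelSize)^Beta`, where `ss` is the sum of squares of the elements in the channel window. The following NumPy sketch illustrates the computation; it is not MATLAB code, and the division of `ss` by the window size and the default `K = 2` are assumptions modeled on the layer's defaults.

```python
import numpy as np

def cross_channel_normalize(x, window_channel_size, alpha=1e-4, beta=0.75, k=2.0):
    """Normalize each element of x by its channel neighborhood.

    x has shape (height, width, channels). Each element is divided by
    (k + alpha * ss / window_channel_size) ** beta, where ss is the sum of
    squares over the channel window around that element. The defaults
    (alpha=1e-4, beta=0.75, k=2) are assumptions mirroring the layer's
    default hyperparameters.
    """
    w = window_channel_size
    num_channels = x.shape[-1]
    out = np.empty_like(x, dtype=float)
    for c in range(num_channels):
        lo = max(0, c - (w - 1) // 2)           # previous channels
        hi = min(num_channels, c + w // 2 + 1)  # following channels (exclusive)
        ss = np.sum(x[..., lo:hi] ** 2, axis=-1)
        out[..., c] = x[..., c] / (k + alpha * ss / w) ** beta
    return out
```

Because `Alpha` is small and `K` dominates the denominator for modest activations, the layer is close to the identity for small inputs and mainly damps large activations that are correlated across neighboring channels.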

### Layer

**`Name`**

Layer name, specified as a character vector or a string scalar. For `Layer` array input, the `trainNetwork`, `assembleNetwork`, `layerGraph`, and `dlnetwork` functions automatically assign names to layers with the name `''`.

Data Types: `char` | `string`

**`NumInputs`**

Number of inputs of the layer. This layer accepts a single input only.

Data Types: `double`

**`InputNames`**

Input names of the layer. This layer accepts a single input only.

Data Types: `cell`

**`NumOutputs`**

Number of outputs of the layer. This layer has a single output only.

Data Types: `double`

**`OutputNames`**

Output names of the layer. This layer has a single output only.

Data Types: `cell`

## Examples


Create a local response normalization layer for channel-wise normalization, where a window of five channels normalizes each element, and the additive constant for the normalizer *K* is 1.

```
layer = crossChannelNormalizationLayer(5,'K',1)
```
```
layer = 
  CrossChannelNormalizationLayer with properties:

    Name: ''

   Hyperparameters
    WindowChannelSize: 5
                Alpha: 1.0000e-04
                 Beta: 0.7500
                    K: 1
```

Include a local response normalization layer in a `Layer` array.

```
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    reluLayer
    crossChannelNormalizationLayer(3)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
```
```
layers = 
  7x1 Layer array with layers:

     1   ''   Image Input                   28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution                   20 5x5 convolutions with stride [1 1] and padding [0 0 0 0]
     3   ''   ReLU                          ReLU
     4   ''   Cross Channel Normalization   cross channel normalization with 3 channels per element
     5   ''   Fully Connected               10 fully connected layer
     6   ''   Softmax                       softmax
     7   ''   Classification Output         crossentropyex
```

## Limitations

• This layer does not support 3-D image inputs or vector sequence inputs.



## Version History

Introduced in R2016a