Hello. A custom layer needs to return an array with the same data type and storage type as its input. Somehow your data is no longer on the GPU and is no longer a dlarray, which also means your network will not train unless you have implemented a backward function. Please show us the rest of your layer code, including whether you are using the Formattable mixin, whether you have implemented a backward function, and whether you are training with trainnet or trainNetwork. Thanks.
The quick fix is to make the output a single gpuArray dlarray, i.e. dlarray(gpuArray(single(x)), "CB"). That will silence this particular error, but it will likely surface other errors, and hard-coding gpuArray prevents your layer from working for CPU training, so I doubt it is the right solution.
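A more robust pattern is to cast the result back to the input's type rather than hard-coding gpuArray. Here is a sketch of what the predict method might look like, assuming your layer uses the nnet.layer.Formattable mixin (so dims(X) carries the dimension labels); myCoreComputation is a placeholder for whatever your layer actually computes:

```matlab
function Z = predict(layer, X)
    % Unwrap the dlarray; Xdata keeps the input's storage type
    % (it stays a gpuArray if X was on the GPU).
    Xdata = extractdata(X);

    % Placeholder for your layer's actual computation.
    Y = myCoreComputation(Xdata);

    % Cast back to the input's class and storage type, then rewrap
    % as a dlarray with the same dimension labels as the input.
    % cast(...,"like",Xdata) keeps single/double and CPU/GPU matching.
    Z = dlarray(cast(Y, "like", Xdata), dims(X));
end
```

Note that extractdata breaks the automatic differentiation trace, so written this way the layer still needs a backward function; if myCoreComputation consists only of dlarray-supported operations, it is better to operate on X directly and let automatic differentiation handle the gradients.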