I'm working on training neural networks without backpropagation / automatic differentiation, using locally derived analytic forms of the update rules. Because this allows a direct formula to be derived for each update rule, it removes a lot of the overhead that automatic differentiation otherwise requires.
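For concreteness, here is a minimal sketch of the kind of analytic update I mean: a single fully connected layer with MSE loss, where the gradient has a closed form and can be computed directly on gpuArray data (the sizes and variable names are illustrative only):

```matlab
% Analytic gradient step for one fully connected layer with MSE loss,
% computed on plain gpuArray data, with no dlarray or dlgradient needed.
X  = rand(784, 256, 'single', 'gpuArray');        % inputs, one column per sample
T  = rand(10, 256, 'single', 'gpuArray');         % targets
W  = randn(10, 784, 'single', 'gpuArray') * 0.01;
b  = zeros(10, 1, 'single', 'gpuArray');
lr = 1e-2;                                        % learning rate

Y  = W*X + b;            % forward pass (b expands across columns)
E  = Y - T;              % residuals
dW = E*X' / size(X, 2);  % closed-form dL/dW for L = 0.5*mean(||Y - T||^2)
db = mean(E, 2);         % closed-form dL/db
W  = W - lr*dW;          % plain gradient-descent step
b  = b - lr*db;
```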
However, MATLAB's neural network functionality is currently built entirely around backpropagation and automatic differentiation, e.g. the dlgradient function and the requirement that everything be a dlarray during training.
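For reference, the current pattern looks roughly like this (a minimal sketch; the layer choice and sizes are arbitrary):

```matlab
% dlgradient only works on a dlarray trace inside dlfeval, so the data
% and the parameters must all be wrapped as dlarray.
X = dlarray(rand(784, 32, 'single', 'gpuArray'), 'CB');
T = rand(10, 32, 'single', 'gpuArray');
W = dlarray(randn(10, 784, 'single', 'gpuArray') * 0.01);
b = dlarray(zeros(10, 1, 'single', 'gpuArray'));
[loss, dW] = dlfeval(@modelLoss, W, b, X, T);

function [loss, dW] = modelLoss(W, b, X, T)
    Y    = fullyconnect(X, W, b);  % forward-only primitive, requires dlarray
    loss = mse(Y, T);
    dW   = dlgradient(loss, W);    % reverse-mode automatic differentiation
end
```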
I have two main requests, specifically for the functions that perform a single operation within a single layer of a neural network, such as "dlconv", "fullyconnect", "maxpool", "avgpool", "relu", etc.:
- These functions should also accept plain gpuArray data instead of requiring everything to be a dlarray.
- These functions are currently designed to perform only the forward pass. I request that they also be able to perform the backward pass on demand. There could be an additional user flag set to either "forward" (default) or "backward", with the function then taking all the inputs necessary for that operation: e.g. the "avgpool" forward pass needs only the input data and the pooling parameters, whereas the "avgpool" backward pass needs the derivative w.r.t. the avgpool output, the pooling parameters, and the original data dimensions (see the sketch after this list). I know that the maxunpool function achieves this for maxpool, but it has significant issues when used this way rather than through backpropagation in a dlgradient-type layer; see https://www.mathworks.com/matlabcentral/answers/2179587-making-a-custom-way-to-train-cnns-and-i-am-noticing-that-avgpool-is-significantly-faster-than-maxpo?s_tid=srchtitle.
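For concreteness, here is a minimal sketch of what such a "backward" mode of avgpool would compute, written directly against plain gpuArray data for the non-overlapping case (stride equal to pool size); the sizes are illustrative only:

```matlab
% Analytic backward pass of average pooling (non-overlapping windows)
% on plain gpuArray data.
p  = 2;                                          % pool size, stride == p
dZ = rand(14, 14, 8, 32, 'single', 'gpuArray');  % dL/d(output), H-by-W-by-C-by-N
% Each input element in a p-by-p window contributes 1/p^2 to its output,
% so the gradient spreads back uniformly over each window:
dX = repelem(dZ, p, p, 1, 1) / p^2;              % dL/d(input), 28-by-28-by-8-by-32
```

The forward direction is just the windowed mean, so both directions have simple closed forms; the flag proposed above would expose exactly this without any of the dlarray machinery.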
I don't know how many people would benefit from this feature, and anyone could always spend the time building these capabilities themselves via MATLAB scripts, cuDNN MEX files, etc., but it would still be nice for MATLAB to support this for more customizable neural network training.
Hello to all!
I would like to share with the MATLAB and Simulink community this video about neural networks in Simulink.
This is a series of videos that uses a multilayer perceptron implemented in Simulink as a case study. Why Simulink? Because it is a visual and intuitive modeling tool: you can watch the forward propagation of the network and better understand the flow. The objective of the series is to show the implementation in Simulink for both simulation and Arduino, as well as training with MATLAB, MATLAB with the Deep Learning Toolbox, and, in one video, Python.
The video is in Spanish, but the Simulink model is available in English for the entire community; subtitles are also available.
The files are located in the first comment of each video. We hope you find it interesting and enjoyable. Best regards!
Here is the link to the first video.