How to display weight distribution in hidden layers of neural network?

I have 8 inputs in the input layer. Now I want to display the weight distribution of these 8 inputs in the hidden layer to observe the importance of the features. To make it clearer, an example is shown in the figure. I used MATLAB's `plotwb` function, but it didn't display the weights of every input.
Specifically, I want to look at the weights connecting the inputs to the first hidden layer. The larger the weight, the more important the input.
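In MATLAB, the input-to-hidden weights of a trained network are stored in `net.IW{1,1}`, with one row per hidden neuron and one column per input. A minimal sketch of extracting them and plotting a per-input weight distribution (the network size and the random training data here are illustrative assumptions, not from the question):

```matlab
% Illustrative example: train a small network on made-up data
x = rand(8, 100);            % 8 inputs, 100 samples (assumed data)
t = rand(1, 100);            % targets (assumed)
net = fitnet(10);            % 10 hidden neurons (assumed size)
net = train(net, x, t);

% Input-to-hidden weight matrix: 10 hidden neurons x 8 inputs
W = net.IW{1,1};

% One box per input column shows the distribution of that input's
% outgoing weights across the hidden neurons
boxplot(abs(W), 'Labels', compose('in%d', 1:8));
xlabel('Input');
ylabel('|weight|');
```

Note that, as the answer below points out, raw weight magnitudes are a weak importance measure when inputs are correlated.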

Answers (1)

Greg Heath on 17 Sep 2017
That will not work: looking at individual weights does not account for the correlations between inputs.
The best way to rank correlated inputs is to compare networks trained on
a. A single input
b. All inputs except the one in a.
Run 10 or more trials of each (with different random initial weights) and compare performance.
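The ablation procedure above can be sketched as follows. This is a hedged outline, assuming a regression problem with an input matrix `x` (one row per input) and target vector `t`; the hidden-layer size and variable names are illustrative:

```matlab
% Rank inputs by ablation: for each input i, train networks on
% (a) input i alone and (b) all inputs except i, over several
% random initializations, and compare validation performance.
nTrials = 10;
nIn = size(x, 1);
perfSingle = zeros(nIn, nTrials);   % input i alone
perfRest   = zeros(nIn, nTrials);   % input i removed
for i = 1:nIn
    rest = setdiff(1:nIn, i);
    for k = 1:nTrials
        net = fitnet(10);                     % fresh random weights each trial
        net.trainParam.showWindow = false;
        [net, tr] = train(net, x(i,:), t);
        perfSingle(i,k) = tr.best_vperf;      % validation MSE, input i alone

        net = fitnet(10);
        net.trainParam.showWindow = false;
        [net, tr] = train(net, x(rest,:), t);
        perfRest(i,k) = tr.best_vperf;        % validation MSE, input i removed
    end
end
% An important input tends to perform well alone (low median perfSingle)
% and to hurt performance when removed (high median perfRest).
```

Averaging (or taking the median) over the trials smooths out the variation due to the random initial weights.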
Hope this helps.
Thank you for formally accepting my answer

