Cross entropy error computation
Hi there,
I trained a model (classifier) to recognize handwritten digits (0-9). The model is not related to any CNN type of network.
I tested the model's performance using accuracy, precision, and recall. I now want to do some error-measurement analysis using the cross-entropy error.
My question is: given the actual (target) labels and the model's predicted labels, how do I find the cross entropy?
Example:
the target = [9 5 6 7 8 5 0 1]
the model output (predicted) = [ 5 5 9 7 8 5 0 1]
I want to use only this information (the target and the model output) to find the loss of the model.
I tried the crossentropy example for dlarray here: https://www.mathworks.com/help/deeplearning/ref/dlarray.crossentropy.html , but I don't think I am doing it the right way. Any suggestions?
Answers (1)
Hiro Yoshino
on 23 Dec 2020
I believe this can be done more easily:
The function classify returns the predicted label together with the probabilities for all the categories.
You can use those probabilities to work out the cross entropy.
If that function does not meet your needs, use the activations function so you can see the output of the hidden layers; you can then use the probabilities from the output layer.
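A rough sketch of that idea, assuming your trained network is called net and your test images and categorical labels are XTest and YTest (those names are just placeholders for your own data):

% classify returns the predicted labels and the per-class scores (probabilities)
[~, scores] = classify(net, XTest);       % scores is numObs-by-numClasses

% One-hot encode the true labels so they line up with the score matrix
T = onehotencode(YTest, 2);               % numObs-by-numClasses

% crossentropy expects dlarray inputs; 'CB' marks the class and batch
% dimensions, so transpose to numClasses-by-numObs first
dlScores = dlarray(scores', 'CB');
dlT      = dlarray(T', 'CB');
loss = crossentropy(dlScores, dlT)        % average cross entropy over the test set

If you need the output of a particular layer instead (for example the softmax layer), activations(net, XTest, layerName) returns it, and you can feed those probabilities to crossentropy in the same way.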