Evaluation metrics for a deep learning model

What is the command to compute evaluation metrics such as precision, recall, specificity, and F1 score for a deep learning model?
Should these be computed explicitly from the confusion matrix using the standard formulas, or can they be computed directly in the code and displayed?
Also, are these metrics computed on the validation dataset?
Kindly provide inputs regarding the above.

Accepted Answer

Pranjal Kaura on 23 Nov 2021
Edited: Pranjal Kaura on 23 Nov 2021
Hey Sushma,
Thank you for bringing this up. The concerned parties are looking into this issue and will try to roll it into future releases.
For now, you can compute these metrics from the confusion matrix. You can refer to this link.
Hope this helps!
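For a binary classifier, a minimal sketch of these computations could look as follows (YTrue, YPred, and the class names are placeholders for your own validation-set labels and predictions; adjust them to your data):
% Confusion matrix of true vs. predicted labels on the validation set.
% 'Order' pins the class ordering so the positive class is row/column 1.
classOrder = {'positive', 'negative'};                 % placeholder class names
C = confusionmat(YTrue, YPred, 'Order', classOrder);   % C(i,j): true class i predicted as class j

TP = C(1,1);  FN = C(1,2);
FP = C(2,1);  TN = C(2,2);

precision   = TP / (TP + FP);
recall      = TP / (TP + FN);                          % also called sensitivity
specificity = TN / (TN + FP);
f1score     = 2 * precision * recall / (precision + recall);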
  2 Comments
Sushma TV on 25 Nov 2021
Thanks Pranjal. I went through the link that you sent, but I have a doubt about plotting the Precision-Recall curve. Computing the values from the confusion matrix was possible, but I could not figure out the plots. What are the arguments of the perfcurve function for plotting a Precision-Recall curve?
Pranjal Kaura on 26 Nov 2021
'perfcurve' is used for plotting performance curves from classifier outputs. To plot a Precision-Recall curve, you can set 'XCrit' (the criterion used to compute X) and 'YCrit' to 'reca' and 'prec' respectively, so that recall and precision are computed. You can refer to the following code snippet:
% X holds recall values, Y holds precision values, one per score threshold
[X, Y] = perfcurve(labels, scores, posclass, 'XCrit', 'reca', 'YCrit', 'prec');
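To visualise the result, a minimal follow-up sketch (assuming X and Y come from the perfcurve call above):
% Plot precision against recall
plot(X, Y)
xlabel('Recall')
ylabel('Precision')
title('Precision-Recall curve')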


More Answers (0)

Release: R2020b
