Precision Recall Plot given the ground truth, predicted label, and predicted score

How can I get the precision-recall plot for this? I know of the function at http://www.mathworks.com/help/stats/perfcurve.html and http://www.mathworks.com/matlabcentral/fileexchange/21528-precision-recall-and-roc-curves, but the issue is that the inputs are the true class labels and the predicted scores.
For example (I have edited my question; this is my actual example, where every detection is predicted as the positive class):
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 1 1 1 ]
predicted_scores = [ 10 9 8 7 6 5 ] (scores for corresponding label)
If I set threshold at 6, then I get 3 false positives and 2 true positives.
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 1 1 0 ]
If I set the threshold at 8, then I get 2 false positives and 1 true positive.
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 0 0 0 ]
  3 Comments
RuiQi on 8 Jul 2016
Edited: RuiQi on 8 Jul 2016
Yes, the scores are a measure of certainty, and they happen to be arranged in descending order.
And aren't the precision and recall plots based on the scores? A higher threshold leads to fewer false positives but also fewer true positives, so the precision-recall plot indirectly shows the performance of the detector at varied thresholds.
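That threshold sweep is exactly what `perfcurve` does internally, which is why it takes only the true labels and the scores, not the predicted labels. A minimal sketch using the `'XCrit'`/`'YCrit'` options to request recall and precision instead of the default ROC criteria:

```matlab
true_labels      = [0 1 0 0 1 1];
predicted_scores = [10 9 8 7 6 5];

% perfcurve varies the decision threshold over the scores itself,
% so no explicit predicted labels are required.
[recall, precision] = perfcurve(true_labels, predicted_scores, 1, ...
                                'XCrit', 'reca', 'YCrit', 'prec');
plot(recall, precision);
xlabel('Recall'); ylabel('Precision');
```

Here `1` names the positive class; `'reca'` and `'prec'` select recall for the x-axis and precision for the y-axis.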


Answers (0)
