matchzoo.metrics.average_precision

Average precision metric for ranking.

Module Contents
class matchzoo.metrics.average_precision.AveragePrecision(threshold: float = 0.0)

    Bases: matchzoo.engine.base_metric.RankingMetric

    Average precision metric.
    ALIAS = ['average_precision', 'ap']
    __repr__(self)

        Returns: Formatted string representation of the metric.
    __call__(self, y_true: np.array, y_pred: np.array)

        Calculate average precision (area under the PR curve).

        Example:
            >>> y_true = [0, 1]
            >>> y_pred = [0.1, 0.6]
            >>> round(AveragePrecision()(y_true, y_pred), 2)
            0.75
            >>> round(AveragePrecision()([], []), 2)
            0.0

        Parameters:
            - y_true – The ground-truth label of each document.
            - y_pred – The predicted score of each document.

        Returns: Average precision.
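The doctest values above can be reproduced with a short standalone sketch. Note that the expected output 0.75 for y_true = [0, 1], y_pred = [0.1, 0.6] implies averaging Precision@k over every cutoff k = 1..n, which differs from the classic area-under-PR-curve definition of average precision; the function below is a hypothetical illustration of that behavior, not MatchZoo's actual implementation, and it ignores the `threshold` parameter.

```python
def average_precision(y_true, y_pred):
    """Mean of Precision@k over all cutoffs k = 1..n.

    Illustrative sketch only; assumes binary relevance labels in
    y_true and real-valued scores in y_pred.
    """
    # Rank labels by predicted score, highest score first.
    ranked = [label for _, label in
              sorted(zip(y_pred, y_true), key=lambda t: -t[0])]
    n = len(ranked)
    if n == 0:
        return 0.0  # matches the empty-input doctest
    # Precision@k = (relevant docs in top k) / k, for each k.
    precisions = [sum(1 for lab in ranked[:k] if lab > 0) / k
                  for k in range(1, n + 1)]
    return sum(precisions) / n


print(round(average_precision([0, 1], [0.1, 0.6]), 2))  # 0.75
print(round(average_precision([], []), 2))              # 0.0
```

With y_pred = [0.1, 0.6], the relevant document (label 1) ranks first, so Precision@1 = 1.0 and Precision@2 = 0.5, giving a mean of 0.75.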