lenskit.metrics.Precision#

class lenskit.metrics.Precision(k=None)#

Bases: ListMetric, RankingMetricBase

Compute recommendation precision: the fraction of recommended items that appear in the user's test items. For a recommendation list \(L\) and the user's test items \(I_u^{\mathrm{test}}\), this is computed as:

\[\frac{|L \cap I_u^{\mathrm{test}}|}{|L|}\]

In the uncommon case that k is specified and len(recs) < k, this metric uses len(recs) as the denominator.
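The formula and the short-list edge case above can be sketched in plain Python. This is an illustrative re-implementation for clarity, not LensKit's own code (the actual metric operates on LensKit's list types):

```python
def precision(recs, test, k=None):
    """Fraction of recommended items that appear in the test items.

    Illustrative sketch of precision@k, not the LensKit implementation.
    """
    if k is not None:
        recs = recs[:k]  # consider only the top-k recommendations
    if not recs:
        return None  # no recommendations: precision is undefined
    tset = set(test)
    hits = sum(1 for item in recs if item in tset)
    # The denominator is len(recs), so when len(recs) < k the shorter
    # length is used, matching the edge case described above.
    return hits / len(recs)

print(precision(["a", "b", "c", "d"], ["b", "d", "e"]))  # 2 hits / 4 recs = 0.5
print(precision(["a"], ["a"], k=5))                      # len(recs) < k: 1 / 1 = 1.0
```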

Parameters:

k (int | None)

__init__(k=None)#
Parameters:

k (int | None)

Methods

__init__([k])

measure_list(recs, test)

Compute the metric value for a single result list.

truncate(items)

Truncate an item list if it is longer than k.

Attributes

default

The default value to infer when computing statistics over missing values.

k

The maximum length of rankings to consider.

label

The metric's default label in output.

property label#

The metric’s default label in output.

The base implementation returns the class name by default.

measure_list(recs, test)#

Compute the metric value for a single result list.

Individual metric classes need to implement this method.

Parameters:

recs – the recommendation list to score.

test – the user's test items.

Return type:

float