lenskit.basic.random#

Classes

RandomConfig(*[, n, rng])

RandomSelector([config])
    Randomly select items from a candidate list.

SoftmaxRanker([config])
    Stochastic top-N ranking with softmax sampling.

class lenskit.basic.random.RandomConfig(*, n=None, rng=None)#

Bases: BaseModel

Parameters:
  • n (int | None)

  • rng (int | Sequence[int] | None | Literal['user'] | tuple[int | Sequence[int] | None, Literal['user']])

n: int | None#

The number of items to select; use -1 or None to return all scored items.

rng: int | Sequence[int] | None | Literal['user'] | tuple[int | Sequence[int] | None, Literal['user']]#

Random number generator configuration.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}#

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.
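
Since RandomConfig is a Pydantic model, it can be constructed directly or validated from a plain dictionary. A minimal sketch (the field values are illustrative):

    from lenskit.basic.random import RandomConfig

    # construct directly with keyword arguments
    cfg = RandomConfig(n=10, rng=42)

    # or validate from a plain dictionary (e.g. loaded from a config file)
    cfg = RandomConfig.model_validate({"n": 10, "rng": 42})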

class lenskit.basic.random.RandomSelector(config=None, **kwargs)#

Bases: Component[ItemList, …]

Randomly select items from a candidate list.

Stability:
Caller (see Stability Levels).
Parameters:
  • n – The number of items to select, or -1 to randomly permute the items.

  • rng – The random number generator or specification (see Random Seeds). This class supports derivable RNGs.

  • config (RandomConfig)
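
A rough usage sketch follows; it assumes RandomSelector can be invoked directly on an ItemList of candidates (outside a pipeline), and the keyword call and ItemList construction shown here are assumptions rather than confirmed API details:

    from lenskit.basic.random import RandomConfig, RandomSelector
    from lenskit.data import ItemList

    # hypothetical candidate list; in practice candidates come from an
    # earlier pipeline stage
    candidates = ItemList(item_ids=[101, 102, 103, 104, 105])

    selector = RandomSelector(RandomConfig(n=2, rng=42))
    picked = selector(items=candidates)  # ItemList with 2 randomly selected items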

class lenskit.basic.random.SoftmaxRanker(config=None, **kwargs)#

Bases: Component[ItemList, …]

Stochastic top-N ranking with softmax sampling.

This uses the “softmax” sampling method, a more efficient approximation of Plackett-Luce sampling than even the Gumbel trick, as documented by Tim Vieira. It expects a scored list of input items and samples n items, with selection probabilities proportional to their scores.

Note

Negative scores are clamped to (approximately) zero.

Stability:
Caller (see Stability Levels).
Parameters:
  • n – The number of items to return (-1 to return unlimited).

  • rng – The random number generator or specification (see Random Seeds). This class supports derivable RNGs.

  • config (RandomConfig)
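
A rough usage sketch follows; it assumes SoftmaxRanker can be called directly with a scored ItemList, and the scores= keyword and call signature shown here are assumptions rather than confirmed API details:

    from lenskit.basic.random import RandomConfig, SoftmaxRanker
    from lenskit.data import ItemList

    # hypothetical scored items; in practice scores come from a scoring component
    scored = ItemList(item_ids=[1, 2, 3, 4], scores=[0.1, 2.0, 0.5, -1.0])

    ranker = SoftmaxRanker(RandomConfig(n=3, rng=42))
    ranked = ranker(items=scored)
    # higher-scored items are more likely to be sampled (and ranked earlier);
    # the negative score is clamped to roughly zero, so item 4 is rarely selected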