CrossValidationReport.metrics.precision_recall

CrossValidationReport.metrics.precision_recall(*, data_source='test', X=None, y=None, pos_label=<DEFAULT>)

Plot the precision-recall curve.

Parameters:
data_source : {"test", "train", "X_y"}, default="test"

The data source to use.

  • “test” : use the test set provided when creating the report.

  • “train” : use the train set provided when creating the report.

  • “X_y” : use the provided X and y to compute the metric.

X : array-like of shape (n_samples, n_features), default=None

New data on which to compute the metric. Only used when data_source="X_y". By default, we use the validation set provided when creating the report.

y : array-like of shape (n_samples,), default=None

New target on which to compute the metric. Only used when data_source="X_y". By default, we use the target provided when creating the report.

pos_label : int, float, bool, str or None, default=_DEFAULT

The label to consider as the positive class when computing the metric. Use this parameter to override the positive class. By default, the positive class is set to the one provided when creating the report. If None, the metric is computed considering each class as a positive class.

Returns:
PrecisionRecallCurveDisplay

The precision-recall curve display.

Examples

>>> from sklearn.datasets import load_breast_cancer
>>> from sklearn.linear_model import LogisticRegression
>>> from skore import CrossValidationReport
>>> X, y = load_breast_cancer(return_X_y=True)
>>> classifier = LogisticRegression(max_iter=10_000)
>>> report = CrossValidationReport(classifier, X=X, y=y, cv_splitter=2)
>>> display = report.metrics.precision_recall()
>>> display.plot()