Report for a comparison of EstimatorReport

The ComparisonReport class provides a report for comparing EstimatorReport instances interactively. The report's functionality is accessible through accessors.

ComparisonReport(reports, *[, n_jobs])

Report for comparing EstimatorReport instances.

Methods

ComparisonReport.help()

Display available methods using rich.

ComparisonReport.cache_predictions([...])

Cache the predictions of the sub-estimator reports.

ComparisonReport.clear_cache()

Clear the cache.

ComparisonReport.get_predictions(*, data_source)

Get predictions from the underlying reports.

Accessors

ComparisonReport.metrics

Accessor for metrics-related operations.

Metrics

The metrics accessor helps you evaluate the statistical performance of the compared estimators. In addition, a plot sub-accessor provides the common performance-metric visualizations.

ComparisonReport.metrics.help()

Display available methods using rich.

ComparisonReport.metrics.summarize(*[, ...])

Report a set of metrics for the estimators.

ComparisonReport.metrics.custom_metric(...)

Compute a custom metric.

ComparisonReport.metrics.accuracy(*[, ...])

Compute the accuracy score.

ComparisonReport.metrics.brier_score(*[, ...])

Compute the Brier score.

ComparisonReport.metrics.log_loss(*[, ...])

Compute the log loss.

ComparisonReport.metrics.precision(*[, ...])

Compute the precision score.

ComparisonReport.metrics.r2(*[, ...])

Compute the R² score.

ComparisonReport.metrics.recall(*[, ...])

Compute the recall score.

ComparisonReport.metrics.rmse(*[, ...])

Compute the root mean squared error.

ComparisonReport.metrics.roc_auc(*[, ...])

Compute the ROC AUC score.

ComparisonReport.metrics.timings([aggregate])

Get all measured processing times related to the different estimators.