ride.metrics¶
Module Contents¶
Classes¶
OptimisationDirection | Generic enumeration. |
MetricMixin | Abstract base class for Ride modules |
MeanAveragePrecisionMetric | Mean Average Precision (mAP) metric |
FlopsMetric | Computes Floating Point Operations (FLOPs) for the model and adds it as metric |
FlopsWeightedAccuracyMetric | Computes acc * (flops / target_gflops) ** (-0.07) |
Functions¶
MetricSelector | |
TopKAccuracyMetric | |
topks_correct | Given the predictions, labels, and a list of top-k values, compute the number of correct predictions for each top-k value. |
topk_errors | Computes the top-k error for each k. |
topk_accuracies | Computes the top-k accuracy for each k. |
flops | Compute the Floating Point Operations (FLOPs) for the model |
params_count | Compute the number of parameters. |
make_confusion_matrix | |
Attributes¶
MetricDict
ExtendedMetricDict
- class ride.metrics.OptimisationDirection[source]¶
Bases:
enum.Enum
Generic enumeration.
Derive from this class to define new enumerations.
- class ride.metrics.MetricMixin(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
ride.core.RideMixin
Abstract base class for Ride modules
- metrics_epoch(preds: torch.Tensor, targets: torch.Tensor, prefix: str = '', *args, **kwargs) MetricDict [source]¶
- collect_metrics(preds: torch.Tensor, targets: torch.Tensor) MetricDict [source]¶
- collect_epoch_metrics(preds: torch.Tensor, targets: torch.Tensor, prefix: str = None) ExtendedMetricDict [source]¶
- ride.metrics.MetricSelector(mapping: Dict[str, Union[MetricMixin, Iterable[MetricMixin]]] = None, default_config: str = '', **kwargs: Union[MetricMixin, Iterable[MetricMixin]]) MetricMixin [source]¶
- class ride.metrics.MeanAveragePrecisionMetric(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
MetricMixin
Mean Average Precision (mAP) metric
- metrics_step(preds: torch.Tensor, targets: torch.Tensor, *args, **kwargs) MetricDict [source]¶
- metrics_epoch(preds: torch.Tensor, targets: torch.Tensor, *args, **kwargs) MetricDict [source]¶
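The idea behind the mAP metric can be sketched in pure Python. This is an illustrative version, not the library's torch.Tensor implementation: per-class average precision is computed over score-ranked predictions, then averaged across classes.

```python
def average_precision(scores, labels):
    """AP for one class: mean of precision at each rank where a positive occurs."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    hits, precisions = 0, []
    for rank, i in enumerate(order, start=1):
        if labels[i]:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / max(hits, 1)


def mean_average_precision(preds, targets):
    """preds: N x C lists of scores; targets: N x C binary relevance lists."""
    num_classes = len(preds[0])
    aps = [
        average_precision([p[c] for p in preds], [t[c] for t in targets])
        for c in range(num_classes)
    ]
    return sum(aps) / num_classes
```

When every class's positives are ranked first, each per-class AP is 1.0 and the mAP is 1.0; misranked positives lower the mean.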
- ride.metrics.TopKAccuracyMetric(*Ks) MetricMixin [source]¶
- class ride.metrics.FlopsMetric(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
MetricMixin
Computes Floating Point Operations (FLOPs) for the model and adds it as metric
- metrics_step(preds: torch.Tensor, targets: torch.Tensor, **kwargs) MetricDict [source]¶
- class ride.metrics.FlopsWeightedAccuracyMetric(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
FlopsMetric
Computes acc * (flops / target_gflops) ** (-0.07)
- static configs() ride.core.Configs [source]¶
- metrics_step(preds: torch.Tensor, targets: torch.Tensor, **kwargs) MetricDict [source]¶
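The weighting this metric applies can be written out directly. A minimal sketch, assuming flops and target_gflops are expressed in the same unit:

```python
def flops_weighted_accuracy(acc, flops, target_gflops):
    # Accuracy is scaled by (flops / target budget) ** (-0.07):
    # models over budget are penalised, models under budget are rewarded,
    # and a model exactly on budget keeps its raw accuracy.
    return acc * (flops / target_gflops) ** (-0.07)
```

The small negative exponent makes the penalty gentle, so accuracy still dominates the score.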
- ride.metrics.topks_correct(preds: torch.Tensor, labels: torch.Tensor, ks: List[int]) List[torch.Tensor] [source]¶
Given the predictions, labels, and a list of top-k values, compute the number of correct predictions for each top-k value.
- Parameters:
preds (array) – array of predictions. Dimension is batchsize N x ClassNum.
labels (array) – array of labels. Dimension is batchsize N.
ks (list) – list of top-k values. For example, ks = [1, 5] corresponds to top-1 and top-5.
- Returns:
- list of numbers, where the i-th entry corresponds to the number of top-ks[i] correct predictions.
- Return type:
topks_correct (list)
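The counting logic can be sketched in pure Python; this illustrative version (the library function operates on torch.Tensors) counts, for each k, how many samples have their true label among the k highest-scoring classes:

```python
def topks_correct(preds, labels, ks):
    """For each k in ks, count samples whose label is among the top-k scores."""
    counts = []
    for k in ks:
        correct = 0
        for scores, label in zip(preds, labels):
            # Indices of the k highest scores, in descending score order
            topk = sorted(range(len(scores)), key=lambda i: -scores[i])[:k]
            correct += label in topk
        counts.append(correct)
    return counts
```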
- ride.metrics.topk_errors(preds: torch.Tensor, labels: torch.Tensor, ks: List[int])[source]¶
Computes the top-k error for each k.
- Parameters:
preds (array) – array of predictions. Dimension is N.
labels (array) – array of labels. Dimension is N.
ks (list) – list of ks to calculate the top errors.
- ride.metrics.topk_accuracies(preds: torch.Tensor, labels: torch.Tensor, ks: List[int])[source]¶
Computes the top-k accuracy for each k.
- Parameters:
preds (array) – array of predictions. Dimension is N.
labels (array) – array of labels. Dimension is N.
ks (list) – list of ks to calculate the top accuracies.
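Given the per-k correct counts, accuracy and error are complementary. A sketch of that final step (the library functions take preds and labels directly and derive the counts internally):

```python
def topk_accuracies_from_counts(correct_counts, num_samples):
    # Fraction of samples whose label was in the top-k, for each k
    return [c / num_samples for c in correct_counts]


def topk_errors_from_counts(correct_counts, num_samples):
    # Complement of the top-k accuracy, for each k
    return [1.0 - c / num_samples for c in correct_counts]
```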
- ride.metrics.flops(model: torch.nn.Module)[source]¶
Compute the Floating Point Operations (FLOPs) for the model
- ride.metrics.params_count(model: torch.nn.Module)[source]¶
Compute the number of parameters.
- Parameters:
model (model) – model to count the number of parameters.
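Conceptually this is a sum over parameter tensor shapes. An illustrative pure-Python sketch, assuming the typical implementation that totals element counts over a model's parameters (the library function takes a torch.nn.Module):

```python
from math import prod


def params_count(param_shapes):
    # Total learnable parameters: sum over tensors of the product of
    # their dimensions. E.g. a Linear layer mapping 5 -> 10 features
    # contributes a (10, 5) weight and a (10,) bias, i.e. 60 parameters.
    return sum(prod(shape) for shape in param_shapes)
```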
- ride.metrics.make_confusion_matrix(preds: torch.Tensor, targets: torch.Tensor, classes: List[str]) matplotlib.figure.Figure [source]¶