ride¶
Subpackages¶
Submodules¶
Package Contents¶
Classes¶
| Main | Complete main programme for the lifecycle of a machine learning project |
| Configs | Configs module for holding project configurations. |
| RideClassificationDataset | Base-class for Ride classification datasets. |
| RideDataset | Base-class for Ride datasets. |
| RideModule | Base-class for modules using the Ride ecosystem. |
| Finetunable | Adds finetune capabilities to model |
| Hparamsearch | |
| Lifecycle | Adds train, val, and test lifecycle methods with cross_entropy loss |
| FlopsMetric | Computes Floating Point Operations (FLOPs) for the model and adds it as metric |
| FlopsWeightedAccuracyMetric | Computes acc * (flops / target_gflops) ** (-0.07) |
| MeanAveragePrecisionMetric | Mean Average Precision (mAP) metric |
| AdamWOneCycleOptimizer | Abstract base-class for Optimizer mixins |
| AdamWOptimizer | Abstract base-class for Optimizer mixins |
| SgdOneCycleOptimizer | Abstract base-class for Optimizer mixins |
| SgdOptimizer | Abstract base-class for Optimizer mixins |
Functions¶
| MetricSelector | |
| TopKAccuracyMetric | |
- class ride.Main(Module: Type[ride.core.RideModule])[source]¶
Complete main programme for the lifecycle of a machine learning project
- Usage:
Main(YourRideModule).argparse()
- class ride.Configs[source]¶
Bases:
corider.Configs
Configs module for holding project configurations.
This is a wrapper of the Configs class, which is also available as a stand-alone package: https://github.com/LukasHedegaard/co-rider
- static collect(cls: RideModule) Configs [source]¶
Collect the configs from all class bases
- Returns:
Aggregated configurations
- Return type:
Configs
- class ride.RideClassificationDataset(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
RideDataset
Base-class for Ride classification datasets.
If no dataset is specified otherwise, this mixin is automatically added as a base of RideModule children.
User-specified datasets must inherit from this class and specify the following:
- self.input_shape: Union[int, Sequence[int], Sequence[Sequence[int]]]
- self.output_shape: Union[int, Sequence[int], Sequence[Sequence[int]]]
- self.classes: List[str]
and either the functions:
- train_dataloader: Callable[[Any], DataLoader]
- val_dataloader: Callable[[Any], DataLoader]
- test_dataloader: Callable[[Any], DataLoader]
or:
- self.datamodule, which has train_dataloader, val_dataloader, and test_dataloader attributes.
- metrics_epoch(preds: torch.Tensor, targets: torch.Tensor, prefix: str = None, *args, **kwargs)[source]¶
- class ride.RideDataset(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
RideMixin
Base-class for Ride datasets.
If no dataset is specified otherwise, this mixin is automatically added as a base of RideModule children.
User-specified datasets must inherit from this class and specify the following:
- self.input_shape: Union[int, Sequence[int], Sequence[Sequence[int]]]
- self.output_shape: Union[int, Sequence[int], Sequence[Sequence[int]]]
and either the functions:
- train_dataloader: Callable[[Any], DataLoader]
- val_dataloader: Callable[[Any], DataLoader]
- test_dataloader: Callable[[Any], DataLoader]
or:
- self.datamodule, which has train_dataloader, val_dataloader, and test_dataloader attributes.
- input_shape: DataShape¶
- output_shape: DataShape¶
- train_dataloader(*args: Any, **kwargs: Any) torch.utils.data.DataLoader [source]¶
The train dataloader
- val_dataloader(*args: Any, **kwargs: Any) Union[torch.utils.data.DataLoader, List[torch.utils.data.DataLoader]] [source]¶
The val dataloader
- test_dataloader(*args: Any, **kwargs: Any) Union[torch.utils.data.DataLoader, List[torch.utils.data.DataLoader]] [source]¶
The test dataloader
- class ride.RideModule[source]¶
Base-class for modules using the Ride ecosystem.
This module should be inherited as the highest-priority parent (first in sequence).
Example:
    class MyModule(ride.RideModule, ride.SgdOneCycleOptimizer):
        def __init__(self, hparams):
            ...
It handles proper initialisation of RideMixin parents and adds automatic attribute validation.
If pytorch_lightning.LightningModule is omitted as lowest-priority parent, RideModule will automatically add it.
If training_step, validation_step, and test_step methods are not found, the ride.Lifecycle will be automatically mixed in by this module.
- property hparams: pytorch_lightning.utilities.parsing.AttributeDict¶
- classmethod with_dataset(ds: RideDataset)[source]¶
- class ride.Finetunable(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
ride.unfreeze.Unfreezable
Adds finetune capabilities to the model.
This module is automatically added when RideModule is inherited.
- hparams: Ellipsis¶
- static configs() ride.core.Configs [source]¶
- class ride.Hparamsearch(Module: Type[ride.core.RideModule])[source]¶
- configs() ride.core.Configs [source]¶
- run(args: pytorch_lightning.utilities.parsing.AttributeDict)[source]¶
Run hyperparameter search using the tune.schedulers.ASHAScheduler
- Parameters:
args (AttributeDict) – Arguments
- Side-effects:
Saves logs to TUNE_LOGS_PATH / args.id
- static dump(hparams: dict, identifier: str, extention='yaml') str [source]¶
Dumps hparams to TUNE_LOGS_PATH / identifier / "best_hparams.json"
- static load(path: Union[pathlib.Path, str], old_args=AttributeDict(), Cls: Type[ride.core.RideModule] = None, auto_scale_lr=False) pytorch_lightning.utilities.parsing.AttributeDict [source]¶
Loads hparams from path
- Parameters:
path (Union[Path, str]) – Path to a JSON file containing hparams
old_args (Optional[AttributeDict]) – The AttributeDict to be updated with the new hparams
cls (Optional[RideModule]) – A class whose hyperparameters can be used to select the relevant hparams
- Returns:
AttributeDict with updated hyperparameters
- Return type:
AttributeDict
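The update semantics of load (read hparams from a JSON file and overlay them on an existing set of arguments) can be sketched with a plain dict. This is an illustrative stand-in, not ride's implementation: the real method works with pytorch_lightning's AttributeDict, can filter keys against a RideModule's configs, and can rescale the learning rate; the file name and keys below are made up.

```python
import json
import pathlib
import tempfile

def load_hparams(path, old_args=None):
    """Read hparams from a JSON file and overlay them on old_args.

    Plain-dict sketch of Hparamsearch.load's update semantics.
    """
    new_hparams = json.loads(pathlib.Path(path).read_text())
    merged = dict(old_args or {})
    merged.update(new_hparams)  # new values win over old ones
    return merged

with tempfile.TemporaryDirectory() as d:
    p = pathlib.Path(d) / "best_hparams.json"
    p.write_text(json.dumps({"learning_rate": 0.01}))
    # keys absent from the file keep their old values
    hparams = load_hparams(p, {"learning_rate": 0.1, "batch_size": 32})
```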
- class ride.Lifecycle(hparams=None, *args, **kwargs)[source]¶
Bases:
ride.metrics.MetricMixin
Adds train, val, and test lifecycle methods with cross_entropy loss
During its training_epoch_end(epoch) lifecycle method, it will call on_training_epoch_end for all superclasses of its child class
- hparams: Ellipsis¶
- forward: Callable[[torch.Tensor], torch.Tensor]¶
- static configs() ride.core.Configs [source]¶
- metrics_step(preds: torch.Tensor, targets: torch.Tensor, **kwargs) ride.metrics.MetricDict [source]¶
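The loss mixed in above is cross_entropy. As a plain-Python illustration of that quantity (ride delegates to torch.nn.functional.cross_entropy; this sketch is only for intuition), the loss for one sample is the negative log-softmax of the target logit:

```python
import math

def cross_entropy(logits, target):
    """Cross-entropy for a single sample: -log softmax(logits)[target].

    Illustrative sketch only; not ride's implementation.
    """
    # log-sum-exp trick for numerical stability
    m = max(logits)
    log_sum_exp = m + math.log(sum(math.exp(x - m) for x in logits))
    return log_sum_exp - logits[target]

# Uniform logits over n classes give a loss of log(n)
uniform_loss = cross_entropy([0.0, 0.0, 0.0], target=0)  # log(3)
confident_loss = cross_entropy([2.0, 0.5, -1.0], target=0)
```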
- class ride.FlopsMetric(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
MetricMixin
Computes Floating Point Operations (FLOPs) for the model and adds them as a metric
- metrics_step(preds: torch.Tensor, targets: torch.Tensor, **kwargs) MetricDict [source]¶
- class ride.FlopsWeightedAccuracyMetric(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
FlopsMetric
Computes acc * (flops / target_gflops) ** (-0.07)
- static configs() ride.core.Configs [source]¶
- metrics_step(preds: torch.Tensor, targets: torch.Tensor, **kwargs) MetricDict [source]¶
- class ride.MeanAveragePrecisionMetric(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
MetricMixin
Mean Average Precision (mAP) metric
- metrics_step(preds: torch.Tensor, targets: torch.Tensor, *args, **kwargs) MetricDict [source]¶
- metrics_epoch(preds: torch.Tensor, targets: torch.Tensor, *args, **kwargs) MetricDict [source]¶
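The quantity behind this metric can be sketched in plain Python for intuition. The sketch below computes average precision per class from ranked scores and averages across classes; it is not ride's implementation (which operates on preds/targets tensors) and edge-case handling may differ:

```python
def average_precision(scores, targets):
    """Average precision for one class from scores and binary targets.

    Precision is accumulated at each rank where a positive appears.
    Illustrative sketch only.
    """
    ranked = sorted(zip(scores, targets), key=lambda pair: -pair[0])
    hits = 0
    precisions = []
    for rank, (_, is_positive) in enumerate(ranked, start=1):
        if is_positive:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / max(hits, 1)

def mean_average_precision(scores_per_class, targets_per_class):
    """Mean of the per-class average precisions."""
    aps = [average_precision(s, t)
           for s, t in zip(scores_per_class, targets_per_class)]
    return sum(aps) / len(aps)

ap = average_precision([0.9, 0.8, 0.7], [1, 0, 1])  # (1 + 2/3) / 2
```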
- ride.MetricSelector(mapping: Dict[str, Union[MetricMixin, Iterable[MetricMixin]]] = None, default_config: str = '', **kwargs: Union[MetricMixin, Iterable[MetricMixin]]) MetricMixin [source]¶
- ride.TopKAccuracyMetric(*Ks) MetricMixin [source]¶
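Top-K accuracy, which this factory adds for each of the given Ks, counts a prediction as correct when the target class is among the K highest-scoring classes. A plain-Python sketch of that definition (not ride's tensor-based implementation):

```python
def topk_accuracy(preds, targets, k):
    """Fraction of samples whose target is among the k top-scoring classes.

    preds: list of per-class score lists, one per sample.
    targets: list of target class indices. Illustrative sketch only.
    """
    correct = 0
    for scores, target in zip(preds, targets):
        top_k = sorted(range(len(scores)), key=lambda i: -scores[i])[:k]
        correct += target in top_k
    return correct / len(targets)

preds = [[0.1, 0.5, 0.4], [0.8, 0.1, 0.1]]
targets = [2, 0]
top1 = topk_accuracy(preds, targets, k=1)  # only the second sample hits
top2 = topk_accuracy(preds, targets, k=2)  # both samples hit
```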
- class ride.AdamWOneCycleOptimizer(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
ride.core.OptimizerMixin
Abstract base-class for Optimizer mixins
- hparams: Ellipsis¶
- parameters: Callable¶
- train_dataloader: Callable¶
- static configs() ride.core.Configs [source]¶
- class ride.AdamWOptimizer(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
ride.core.OptimizerMixin
Abstract base-class for Optimizer mixins
- hparams: Ellipsis¶
- parameters: Callable¶
- static configs() ride.core.Configs [source]¶
- class ride.SgdOneCycleOptimizer(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
ride.core.OptimizerMixin
Abstract base-class for Optimizer mixins
- hparams: Ellipsis¶
- parameters: Callable¶
- train_dataloader: Callable¶
- static configs() ride.core.Configs [source]¶
- class ride.SgdOptimizer(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]¶
Bases:
ride.core.OptimizerMixin
Abstract base-class for Optimizer mixins
- hparams: Ellipsis¶
- parameters: Callable¶
- static configs() ride.core.Configs [source]¶