base_model

Base model implementing helper methods.

Module Contents

class base_model.BaseModel(hparams)

Bases: pytorch_lightning.core.LightningModule

Inheritance diagram of base_model.BaseModel

The primary class containing all the training functionality. It is equivalent to a PyTorch nn.Module in all respects.

Parameters

LightningModule (nn.Module) – The PyTorch Lightning module, derived from nn.Module, with useful hooks

Raises

NotImplementedError – Some methods must be overridden
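A minimal sketch of deriving from this class. The real BaseModel subclasses pytorch_lightning.core.LightningModule; a stand-in is stubbed here so the sketch runs without Lightning installed, and MyModel and its hparams are hypothetical.

```python
# Sketch only: a stand-in for base_model.BaseModel, so the example is
# self-contained. The real class derives from pytorch_lightning.LightningModule.
class BaseModel:
    def __init__(self, hparams):
        self.hparams = hparams

    def forward(self, *args):
        # Abstract in the real class: must be overridden in derived models.
        raise NotImplementedError


class MyModel(BaseModel):  # hypothetical derived model
    def forward(self, x):
        # Replace with the real forward pass (layers applied to x).
        return [v * self.hparams["scale"] for v in x]


model = MyModel({"scale": 2})
print(model.forward([1, 2, 3]))  # -> [2, 4, 6]
```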

abstract forward(self)

Dummy method to do forward pass on the model.

Raises

NotImplementedError – The method must be overridden in the derived models

training_step(self, batch, batch_idx)

Called inside the training loop with the data from the training dataloader passed in as batch. The implementation is delegated to the dataloader instead.

For performance-critical use cases, prefer monkey-patching instead.

Parameters
  • batch – Batch of input and ground truth variables

  • batch_idx (int) – Index of the current batch

Returns

Loss and logs

Return type

dict
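A sketch of the dict of "loss and logs" a training_step is documented to return. The squared-error loss and the log keys here are illustrative, not the project's actual loss or logging scheme.

```python
# Sketch: the {"loss": ..., "log": {...}} dict a training step returns.
# A mean squared error stands in for the real loss computation.
def training_step(batch, batch_idx):
    inputs, targets = batch
    preds = inputs  # stand-in for model(inputs)
    loss = sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)
    return {"loss": loss, "log": {"train_loss": loss, "batch_idx": batch_idx}}


out = training_step(([1.0, 2.0], [1.0, 4.0]), 0)
print(out["loss"])  # -> 2.0
```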

validation_step(self, batch, batch_idx)

Called inside the validation loop with the data from the validation dataloader passed in as batch. The implementation is delegated to the dataloader instead.

For performance-critical use cases, prefer monkey-patching instead.

Parameters
  • batch – Batch of input and ground truth variables

  • batch_idx (int) – Index of the current batch

Returns

Loss and logs

Return type

dict

test_step(self, batch, batch_idx)

Called inside the testing loop with the data from the testing dataloader passed in as batch. The implementation is delegated to the dataloader instead.

For performance-critical use cases, prefer monkey-patching instead.

Parameters
  • batch – Batch of input and ground truth variables

  • batch_idx (int) – Index of the current batch

Returns

Loss and logs

Return type

dict

training_epoch_end(self, outputs)

Called at the end of the training epoch to aggregate outputs.

Parameters

outputs (list) – List of individual outputs of each training step.

Returns

Loss and logs.

Return type

dict
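A sketch of what the *_epoch_end hooks are documented to do: fold the list of per-step outputs into a single dict of loss and logs. The dict keys are hypothetical.

```python
# Sketch: aggregate per-step outputs at epoch end into one summary dict.
from statistics import mean


def training_epoch_end(outputs):
    avg_loss = mean(step["loss"] for step in outputs)
    return {"loss": avg_loss, "log": {"avg_train_loss": avg_loss}}


steps = [{"loss": 1.0}, {"loss": 3.0}]
print(training_epoch_end(steps))  # -> {'loss': 2.0, 'log': {'avg_train_loss': 2.0}}
```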

validation_epoch_end(self, outputs)

Called at the end of the validation epoch to aggregate outputs.

Parameters

outputs (list) – List of individual outputs of each validation step.

Returns

Loss and logs.

Return type

dict

test_epoch_end(self, outputs)

Called at the end of the testing epoch to aggregate outputs.

Parameters

outputs (list) – List of individual outputs of each testing step.

Returns

Loss and logs.

Return type

dict

configure_optimizers(self)

Decide optimizers and learning rate schedulers.

At least one optimizer is required.

Returns

The optimizer and the scheduler

Return type

tuple
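A sketch of the (optimizer, scheduler) tuple configure_optimizers returns. In real code these would be torch.optim objects, e.g. torch.optim.Adam(self.parameters(), lr=...) paired with torch.optim.lr_scheduler.StepLR(optimizer, step_size=...); plain dicts stand in here so the sketch runs without torch installed, and the hyperparameter names are hypothetical.

```python
# Sketch: build and return the optimizer/scheduler pair.
# Dicts are placeholders for torch.optim.Adam and lr_scheduler.StepLR.
def configure_optimizers(hparams):
    optimizer = {"name": "Adam", "lr": hparams["lr"]}    # placeholder object
    scheduler = {"name": "StepLR", "step_size": 10}      # placeholder object
    return optimizer, scheduler


opt, sched = configure_optimizers({"lr": 1e-3})
print(opt["lr"])  # -> 0.001
```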

add_bias(self, bias)

Initialize bias parameter of the last layer with the output variable’s mean.

Parameters

bias (float) – Mean of the output variable.
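A sketch of the documented intent of add_bias: seed the last layer's bias with the output variable's mean so early predictions start near it. The layer structure below is hypothetical; the real method would write into a torch parameter.

```python
# Sketch: pass the targets' mean into add_bias to initialize the final bias.
from statistics import mean


class TinyModel:  # hypothetical model with a single scalar "last-layer" bias
    def __init__(self):
        self.last_layer_bias = 0.0

    def add_bias(self, bias):
        self.last_layer_bias = bias


targets = [2.0, 4.0, 6.0]
model = TinyModel()
model.add_bias(mean(targets))
print(model.last_layer_bias)  # -> 4.0
```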

prepare_data(self, ModelDataset=None, force=False)

Load and split the data for training and testing on the first call. Behavior on subsequent calls is determined by the force parameter.

Parameters
  • ModelDataset (class, optional) – The dataset class to be used with the model, defaults to None

  • force (bool, optional) – Force the data preparation even if already prepared, defaults to False
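A sketch of the documented prepare_data contract: the first call loads and splits the data, later calls are no-ops, and force=True re-prepares anyway. The call counter stands in for the hypothetical load-and-split work.

```python
# Sketch: idempotent data preparation with an explicit force override.
class DataMixin:
    def __init__(self):
        self.prepared = False
        self.calls = 0

    def prepare_data(self, ModelDataset=None, force=False):
        if self.prepared and not force:
            return              # already prepared: skip
        self.calls += 1         # stands in for load + train/test split
        self.prepared = True


m = DataMixin()
m.prepare_data()              # first call: prepares
m.prepare_data()              # second call: no-op
m.prepare_data(force=True)    # forced: prepares again
print(m.calls)  # -> 2
```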

train_dataloader(self)

Create the training dataloader from the training dataset.

Returns

The training dataloader

Return type

DataLoader

val_dataloader(self)

Create the validation dataloader from the validation dataset.

Returns

The validation dataloader

Return type

DataLoader

test_dataloader(self)

Create the testing dataloader from the testing dataset.

Returns

The testing dataloader

Return type

DataLoader
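The three *_dataloader hooks each wrap their split in a loader. A minimal batching generator stands in below for torch.utils.data.DataLoader, which the real methods would return; the dataset and batch size are hypothetical.

```python
# Sketch: a batching generator in place of torch.utils.data.DataLoader,
# illustrating what a *_dataloader hook hands back to the training loop.
def make_loader(dataset, batch_size):
    for i in range(0, len(dataset), batch_size):
        yield dataset[i:i + batch_size]


train_set = list(range(5))
batches = list(make_loader(train_set, batch_size=2))
print(batches)  # -> [[0, 1], [2, 3], [4]]
```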