Utils

Cache
labml.utils.cache.cache(name: str, loader: Optional[Callable[[], Any]] = None, *, file_type: str = 'json') → Any

This caches the result of a function. It can be used as a decorator, or you can pass it a lambda that takes no arguments. It does not cache by arguments.

- Parameters
  name (str) – name of the cache
  loader (Callable[[], Any], optional) – the function that generates the data to be cached
- Keyword Arguments
  file_type (str, optional) – the file type used to store the data; defaults to json
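To make the name-keyed caching behavior concrete, here is a minimal, self-contained sketch of the pattern this function describes: the result is keyed by `name` (not by arguments) and persisted to a JSON file, so later calls skip the loader. The file layout, the `cache_dir` parameter, and the demo `expensive` loader are illustrative assumptions, not labml's implementation, and the decorator form is not covered.

```python
import json
import tempfile
from pathlib import Path
from typing import Any, Callable, Optional

# Hypothetical stand-in for labml.utils.cache.cache: results are keyed
# by `name` only, and persisted so subsequent calls skip the loader.
def cache(name: str, loader: Optional[Callable[[], Any]] = None, *,
          file_type: str = 'json',
          cache_dir: Path = Path(tempfile.gettempdir())) -> Any:
    path = cache_dir / f'{name}.{file_type}'
    if path.exists():
        return json.loads(path.read_text())  # cache hit: loader not called
    data = loader()
    path.write_text(json.dumps(data))        # cache miss: compute and persist
    return data

# Start from a clean slate so the demo is deterministic.
(Path(tempfile.gettempdir()) / 'labml_cache_demo.json').unlink(missing_ok=True)

calls = []

def expensive() -> list:
    calls.append(1)  # count how often the loader actually runs
    return [i * i for i in range(5)]

first = cache('labml_cache_demo', expensive)
second = cache('labml_cache_demo', expensive)  # served from the file
```

Because caching is by name rather than by arguments, two calls with the same name but different loaders would return the same stored data.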
Keyboard Interrupt

Downloading

PyTorch
labml.utils.pytorch.store_model_indicators(model: torch.nn.Module, *, model_name: str = 'model')

Track model parameters and gradients.

- Parameters
  model (torch.nn.Module) – PyTorch model
- Keyword Arguments
  model_name (str, optional) – name of the model
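A hedged sketch of what tracking parameters and gradients amounts to: for each named parameter, record a summary of the values and of the gradients under names derived from `model_name`. The dict-based "tracker", the `param.`/`grad.` name scheme, and the toy model below are illustrative stand-ins, not labml's actual API, which records indicators through its tracker.

```python
from statistics import mean

# Illustrative stand-in: `model` is anything whose named_parameters()
# yields (name, values, grads); a plain dict plays the role of the tracker.
def store_model_indicators(model, *, model_name: str = 'model') -> dict:
    indicators = {}
    for name, values, grads in model.named_parameters():
        # Record one summary statistic per parameter and per gradient.
        indicators[f'param.{model_name}.{name}'] = mean(values)
        indicators[f'grad.{model_name}.{name}'] = mean(grads)
    return indicators

class ToyModel:
    """Toy object mimicking the (name, values, grads) iteration."""
    def named_parameters(self):
        yield 'linear.weight', [0.5, -0.5, 1.0], [0.1, 0.1, 0.4]
        yield 'linear.bias', [0.0], [0.2]

stats = store_model_indicators(ToyModel(), model_name='toy')
```

In a real training loop this would be called after `loss.backward()`, once gradients are populated.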
labml.utils.pytorch.store_optimizer_indicators(optimizer: Optimizer, *, models: Optional[Dict[str, torch.nn.Module]] = None, optimizer_name: str = 'optimizer')

Track optimizer statistics such as moments.

- Parameters
  optimizer (Optimizer) – PyTorch optimizer
- Keyword Arguments
  models (Dict[str, torch.nn.Module], optional) – a dictionary of the modules being optimized; this is used to recover proper parameter names
  optimizer_name (str, optional) – name of the optimizer
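To show why the `models` dictionary matters, here is a sketch of the idea: an optimizer's per-parameter state (for Adam, the moments `exp_avg` and `exp_avg_sq`) is keyed by parameter objects, so readable names have to be recovered by matching those objects against the models' named parameters. The dict-based tracker, the name scheme, and the toy model/optimizer are all illustrative assumptions, not labml's implementation.

```python
# Illustrative stand-in: `optimizer.state` maps parameter objects to
# dicts of statistics, as in torch.optim; `models` maps model names to
# objects whose named_parameters() yields (name, parameter).
def store_optimizer_indicators(optimizer, *, models=None,
                               optimizer_name: str = 'optimizer') -> dict:
    # Build parameter-object -> 'model_name.param_name' lookup.
    names = {}
    for model_name, model in (models or {}).items():
        for p_name, param in model.named_parameters():
            names[id(param)] = f'{model_name}.{p_name}'
    indicators = {}
    for param, state in optimizer.state.items():
        p_name = names.get(id(param), 'unnamed')
        for stat, value in state.items():
            indicators[f'optimizer.{optimizer_name}.{stat}.{p_name}'] = value
    return indicators

class Param:
    """Toy parameter object; only its identity matters here."""

w = Param()

class ToyModel:
    def named_parameters(self):
        yield 'weight', w

class ToyAdam:
    # Per-parameter state, shaped like Adam's first and second moments.
    state = {w: {'exp_avg': 0.1, 'exp_avg_sq': 0.01}}

stats = store_optimizer_indicators(ToyAdam(), models={'net': ToyModel()},
                                   optimizer_name='adam')
```

Without the `models` argument, the parameters in the state dict have no names to report under, which is why passing it yields more useful indicator names.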
labml.utils.pytorch.get_modules(configs: labml.configs.BaseConfigs)

Get all the PyTorch modules in the configs object.

- Parameters
  configs (labml.configs.BaseConfigs) – configurations object
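A minimal sketch of collecting modules from a configurations object: scan its attributes and keep those that are modules, keyed by attribute name. The `Module` and `BaseConfigs` classes below are stand-ins for `torch.nn.Module` and `labml.configs.BaseConfigs`, and the attribute scan is an assumption about how the helper behaves, not its actual implementation.

```python
class Module:
    """Stand-in for torch.nn.Module."""

class BaseConfigs:
    """Stand-in for labml.configs.BaseConfigs."""

def get_modules(configs: BaseConfigs) -> dict:
    # Keep only the attributes that are modules, keyed by attribute name.
    return {name: value for name, value in vars(configs).items()
            if isinstance(value, Module)}

class ExperimentConfigs(BaseConfigs):
    def __init__(self):
        self.encoder = Module()
        self.learning_rate = 3e-4  # not a module; filtered out

modules = get_modules(ExperimentConfigs())
```

A typical use would be feeding the returned dictionary to `store_optimizer_indicators` as its `models` argument.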