---
title: utils
keywords: fastai
sidebar: home_sidebar
summary: "Various utility classes and functions used by the `BLURR` library."
description: "Various utility classes and functions used by the `BLURR` library."
nb_path: "nbs/00_utils.ipynb"
---
{% raw %}
{% endraw %} {% raw %}
{% endraw %} {% raw %}
{% endraw %}

Utility classes

{% raw %}

class Singleton[source]

Singleton()

{% endraw %} {% raw %}
{% endraw %}

`Singleton` functions as a Python decorator. Apply it above any class to turn that class into a singleton (see here for more info on the singleton pattern).

{% raw %}
@Singleton
class TestSingleton:
    pass


a = TestSingleton()
b = TestSingleton()
test_eq(a, b)
{% endraw %}
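The decorator above can be sketched as follows. This is a minimal illustration of the singleton pattern as a class decorator, not necessarily BLURR's exact implementation:

```python
class Singleton:
    """Class decorator: every instantiation returns the same instance."""

    def __init__(self, cls):
        self._cls = cls
        self._instance = None

    def __call__(self, *args, **kwargs):
        # Create the wrapped class once; return the cached instance thereafter
        if self._instance is None:
            self._instance = self._cls(*args, **kwargs)
        return self._instance


@Singleton
class TestSingleton:
    pass


a = TestSingleton()
b = TestSingleton()
assert a is b  # both names refer to the one and only instance
```

Note that after decoration, `TestSingleton` is bound to a `Singleton` instance rather than the original class, which is why `str_to_type("TestSingleton")` below resolves to a `Singleton` object.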

Utility methods

{% raw %}
{% endraw %} {% raw %}

str_to_type[source]

str_to_type(typename:str)

Converts a type represented as a string to the actual class

| | Type | Default | Details |
|---|---|---|---|
| **typename** | `str` | | *No Content* |
{% endraw %}

How to use:

{% raw %}
print(str_to_type("test_eq"))
print(str_to_type("TestSingleton"))
<function test_eq at 0x7fc972f39dc0>
<__main__.Singleton object at 0x7fc92fe3cfa0>
{% endraw %} {% raw %}
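One way such a helper could work is to look the name up in the caller's namespace, falling back to builtins. This is a hypothetical sketch, not BLURR's actual implementation:

```python
import builtins
import inspect


def str_to_type(typename: str) -> type:
    """Resolve a type name to the object it refers to in the caller's scope."""
    # Frame 0 is this function; frame 1 is the caller
    caller_globals = inspect.stack()[1].frame.f_globals
    if typename in caller_globals:
        return caller_globals[typename]
    # Fall back to built-in names like "list" or "dict"
    return getattr(builtins, typename)


print(str_to_type("list"))  # <class 'list'>
```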
{% endraw %} {% raw %}

print_versions[source]

print_versions(packages:Union[str, typing.List[str]])

Prints the name and version of one or more packages in your environment

| | Type | Default | Details |
|---|---|---|---|
| **packages** | `Union[str, List[str]]` | | A string of space-delimited package names or a list of package names |
{% endraw %}

How to use:

{% raw %}
print_versions("torch transformers fastai")
print("---")
print_versions(["torch", "transformers", "fastai"])
torch: 1.10.1+cu111
transformers: 4.16.2
fastai: 2.5.6
---
torch: 1.10.1+cu111
transformers: 4.16.2
fastai: 2.5.6
{% endraw %} {% raw %}
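A rough sketch of such a helper, using the standard library's `importlib.metadata` (Python 3.8+); the actual BLURR function may look versions up differently:

```python
from importlib.metadata import PackageNotFoundError, version
from typing import List, Union


def print_versions(packages: Union[str, List[str]]) -> None:
    """Print `name: version` for each installed package."""
    # Accept either a space-delimited string or a list of names
    names = packages.split() if isinstance(packages, str) else packages
    for name in names:
        try:
            print(f"{name}: {version(name)}")
        except PackageNotFoundError:
            print(f"{name}: not installed")
```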
{% endraw %} {% raw %}

set_seed[source]

set_seed(s, reproducible=False)

Set random seed for random, torch, and numpy (where available)

{% endraw %}
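In spirit, the helper seeds every random number generator it can find. A hedged sketch, assuming the fastai-style signature above (`torch` and `numpy` handling only runs if those packages are importable):

```python
import random


def set_seed(s: int, reproducible: bool = False) -> None:
    """Seed Python, NumPy, and PyTorch RNGs where available."""
    random.seed(s)
    try:
        import numpy as np
        np.random.seed(s % (2**32))  # NumPy requires a seed in [0, 2**32)
    except ImportError:
        pass
    try:
        import torch
        torch.manual_seed(s)
        if reproducible:
            # Trades speed for determinism on CUDA backends
            torch.backends.cudnn.deterministic = True
            torch.backends.cudnn.benchmark = False
    except ImportError:
        pass
```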

Loss functions

{% raw %}

class PreCalculatedLoss[source]

PreCalculatedLoss(loss_cls, *args, axis=-1, flatten=True, floatify=False, is_2d=True, **kwargs) :: BaseLoss

If you want to let your Hugging Face model calculate the loss for you, make sure you include the `labels` argument in your inputs and use `PreCalculatedLoss` as your loss function. Even though we don't really need a loss function per se, we have to provide a custom loss class/function for fastai to function properly (e.g., one with `decodes` and `activation` methods). Why? Because these methods will get called in methods like `show_results` to get the actual predictions.

Note: The Hugging Face models will always calculate the loss for you if you pass `labels` along with your other inputs (so only include it if that is what you intend to happen)
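The idea can be illustrated with a conceptual, plain-Python stand-in (not BLURR's actual class, which subclasses fastai's `BaseLoss`): the "loss function" simply passes through the loss the model already computed, while still exposing the `activation` and `decodes` hooks fastai expects:

```python
import math


class PreCalculatedLossSketch:
    """Pass-through loss for models that compute their own loss."""

    def __call__(self, model_loss, *targets, **kwargs):
        # The Hugging Face model already computed the loss; return it as-is
        return model_loss

    def activation(self, logits):
        # Called by fastai (e.g., show_results) to map logits to probabilities;
        # shown here as a softmax over plain floats for a single example
        exps = [math.exp(x) for x in logits]
        total = sum(exps)
        return [e / total for e in exps]

    def decodes(self, logits):
        # Predicted class = index of the highest logit
        return max(range(len(logits)), key=lambda i: logits[i])
```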

{% endraw %} {% raw %}

class PreCalculatedCrossEntropyLoss[source]

PreCalculatedCrossEntropyLoss(*args, axis=-1, weight=None, ignore_index=-100, reduction='mean', flatten=True, floatify=False, is_2d=True) :: PreCalculatedLoss

If you want to let your Hugging Face model calculate the loss for you, make sure you include the `labels` argument in your inputs and use `PreCalculatedLoss` as your loss function. Even though we don't really need a loss function per se, we have to provide a custom loss class/function for fastai to function properly (e.g., one with `decodes` and `activation` methods). Why? Because these methods will get called in methods like `show_results` to get the actual predictions.

Note: The Hugging Face models will always calculate the loss for you if you pass `labels` along with your other inputs (so only include it if that is what you intend to happen)

{% endraw %} {% raw %}

class PreCalculatedBCELoss[source]

PreCalculatedBCELoss(*args, axis=-1, floatify=True, thresh=0.5, weight=None, reduction='mean', pos_weight=None, flatten=True, is_2d=True) :: PreCalculatedLoss

If you want to let your Hugging Face model calculate the loss for you, make sure you include the `labels` argument in your inputs and use `PreCalculatedLoss` as your loss function. Even though we don't really need a loss function per se, we have to provide a custom loss class/function for fastai to function properly (e.g., one with `decodes` and `activation` methods). Why? Because these methods will get called in methods like `show_results` to get the actual predictions.

Note: The Hugging Face models will always calculate the loss for you if you pass `labels` along with your other inputs (so only include it if that is what you intend to happen)

{% endraw %} {% raw %}

class PreCalculatedMSELoss[source]

PreCalculatedMSELoss(*args, axis=-1, floatify=True, **kwargs) :: PreCalculatedLoss

If you want to let your Hugging Face model calculate the loss for you, make sure you include the `labels` argument in your inputs and use `PreCalculatedLoss` as your loss function. Even though we don't really need a loss function per se, we have to provide a custom loss class/function for fastai to function properly (e.g., one with `decodes` and `activation` methods). Why? Because these methods will get called in methods like `show_results` to get the actual predictions.

Note: The Hugging Face models will always calculate the loss for you if you pass `labels` along with your other inputs (so only include it if that is what you intend to happen)

{% endraw %} {% raw %}
{% endraw %} {% raw %}

class MultiTargetLoss[source]

MultiTargetLoss(loss_classes:List[typing.Callable]=[<class 'fastai.losses.CrossEntropyLossFlat'>, <class 'fastai.losses.CrossEntropyLossFlat'>], loss_classes_kwargs:List[dict]=[{}, {}], weights:Union[typing.List[float], typing.List[int]]=[1, 1], reduction:str='mean') :: Module

Provides the ability to apply different loss functions to multi-modal targets/predictions.

This loss function can be used in many other multi-modal architectures, with any mix of loss functions. For example, it can be amended to include the `is_impossible` task, as well as the start/end token tasks, for the SQuAD v2 dataset (or any extractive question answering task).

| | Type | Default | Details |
|---|---|---|---|
| **loss_classes** | `List[Callable]` | `[CrossEntropyLossFlat, CrossEntropyLossFlat]` | The loss function for each target |
| **loss_classes_kwargs** | `List[dict]` | `[{}, {}]` | Any kwargs you want to pass to the loss functions above |
| **weights** | `Union[List[float], List[int]]` | `[1, 1]` | The weights you want to apply to each loss |
| **reduction** | `str` | `'mean'` | The `reduction` parameter of the loss function |
{% endraw %} {% raw %}
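The core idea, apply one loss per target and sum the weighted results, can be sketched in plain Python (the real class wraps fastai loss classes and operates on torch tensors; the loss functions below are illustrative stand-ins):

```python
from typing import Callable, List


class MultiTargetLossSketch:
    """Combine one loss function per target into a single weighted loss."""

    def __init__(self, loss_funcs: List[Callable], weights: List[float]):
        self.loss_funcs = loss_funcs
        self.weights = weights

    def __call__(self, preds, targets):
        # One loss per (prediction, target) pair, scaled by its weight
        return sum(
            w * fn(p, t)
            for fn, w, p, t in zip(self.loss_funcs, self.weights, preds, targets)
        )


sq_err = lambda p, t: (p - t) ** 2
abs_err = lambda p, t: abs(p - t)

loss = MultiTargetLossSketch([sq_err, abs_err], weights=[1.0, 2.0])
print(loss(preds=[3.0, 1.0], targets=[1.0, 0.0]))  # 1*(2.0**2) + 2*1.0 = 6.0
```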
{% endraw %}