---
title: Losses
keywords: fastai
sidebar: home_sidebar
summary: "Implements custom loss functions."
description: "Implements custom loss functions."
nb_path: "nbs/05_losses.ipynb"
---
{% raw %}
{% endraw %} {% raw %}
{% endraw %}

## Weighted Softmax Cross Entropy Loss

As described in Falk, Thorsten, et al. "U-Net: deep learning for cell counting, detection, and morphometry." Nature Methods 16.1 (2019): 67-70. A minimal sketch of the computation follows the list below.

  • `axis`: the dimension over which the softmax is computed. For segmentation this is the channel dimension (`axis=1`).
  • `reduction`: how the per-pixel losses are aggregated; also used when we call `Learner.get_preds`.
  • `activation`: the function applied to the raw output logits of the model when calling `Learner.get_preds` or `Learner.predict`.
  • `decodes`: converts the output of the model to a format similar to the target (here, binary masks); used in `Learner.predict`.
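
The idea from Falk et al. is plain per-pixel cross entropy scaled by a precomputed weight map (for example, upweighting pixels near cell borders). Below is a minimal sketch of that computation, not the class's actual implementation; the function name `weighted_softmax_ce` is hypothetical, and it assumes the class axis is 1:

{% raw %}
```python
import torch
import torch.nn.functional as F

def weighted_softmax_ce(logits, targets, weights, reduction='mean'):
    # Hypothetical sketch, assuming the class axis is 1.
    # logits:  (batch, classes, h, w) raw model outputs
    # targets: (batch, h, w) integer class indices
    # weights: (batch, h, w) per-pixel weight map
    ce = F.cross_entropy(logits, targets, reduction='none')  # per-pixel loss, (batch, h, w)
    loss = weights * ce
    if reduction == 'mean': return loss.mean()
    if reduction == 'sum':  return loss.sum()
    return loss
```
{% endraw %}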
{% raw %}
{% endraw %} {% raw %}

### class WeightedSoftmaxCrossEntropy[source]

> `WeightedSoftmaxCrossEntropy(*args, axis=-1, reduction='mean')` :: `Module`

Weighted softmax cross entropy loss function.

{% endraw %}

In a segmentation task, we want to take the softmax over the channel dimension (`axis=1`).
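
Concretely, the softmax over `dim=1` turns `(batch, classes, height, width)` logits into a per-pixel distribution over the classes; a quick standalone check:

{% raw %}
```python
import torch
import torch.nn.functional as F

logits = torch.randn(2, 3, 4, 4)   # (batch, classes, h, w)
probs = F.softmax(logits, dim=1)   # normalize across the class/channel axis
# at every pixel, the class probabilities sum to 1
assert torch.allclose(probs.sum(dim=1), torch.ones(2, 4, 4))
```
{% endraw %}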

{% raw %}
```python
import torch
import torch.nn.functional as F
from fastai.torch_core import TensorImage, TensorMask
from fastcore.test import test_eq

tst = WeightedSoftmaxCrossEntropy(axis=1)
output = TensorImage(torch.randn(4, 5, 356, 356, requires_grad=True))  # (batch, classes, h, w) logits
targets = TensorMask(torch.ones(4, 356, 356).long())                   # per-pixel class indices
weights = torch.randn(4, 356, 356)                                     # per-pixel weight map
loss = tst(output, targets, weights)

test_eq(tst.activation(output), F.softmax(output, dim=1))
test_eq(tst.decodes(output), output.argmax(dim=1))
```
{% endraw %}
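
Since `activation` is a softmax and `decodes` an argmax over the channel axis, the inference steps used by `Learner.predict` can be reproduced by hand on the tensors above:

{% raw %}
```python
probs = tst.activation(output)           # per-pixel class probabilities (softmax over dim 1)
pred_mask = tst.decodes(output)          # predicted class index per pixel (argmax over dim 1)
assert pred_mask.shape == targets.shape  # (4, 356, 356)
```
{% endraw %}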