R Loss Function

This tutorial provides guidelines for using a customized loss function in network construction.

Loss functions can be specified either using the name of a built-in loss function (e.g. loss = 'binary_crossentropy'), a reference to a built-in loss function (e.g. loss = loss_binary_crossentropy), or by passing an arbitrary function that returns a scalar for each data point and takes two arguments: the true labels and the predicted values. A common reason to write your own loss is multi-task learning; for example, a U-Net based model that performs both segmentation and classification may need to combine a Dice loss (for the segmentation output) with a cross-entropy loss (for the classification output).
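
As a plain-R illustration of that two-argument signature (the names y_true, y_pred, and my_loss are hypothetical, not part of any framework's API), a custom loss is simply a function of the labels and the predictions. In the MXNet workflow used in the rest of this tutorial, the same idea is expressed with symbols and mx.symbol.MakeLoss instead.

    # Hypothetical custom loss: squared error, one value per data point
    my_loss <- function(y_true, y_pred) {
      (y_true - y_pred)^2
    }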

Model Training Example

Let’s begin with a small regression example. We can build and train a regression model with the following code:
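
The original code listing is not reproduced here, so the following is a minimal sketch rather than the tutorial's exact code. It assumes a small synthetic dataset (train.x, train.y) and the names fc1, lro, and model are introduced for illustration. The network is a single fully connected layer followed by mx.symbol.LinearRegressionOutput, which attaches a squared-error loss, trained with mx.model.FeedForward.create.

    library(mxnet)

    # Synthetic regression data: 1000 rows, 5 features
    # (placeholder for the tutorial's original dataset)
    set.seed(0)
    n <- 1000
    train.x <- matrix(rnorm(n * 5), nrow = n, ncol = 5)
    train.y <- as.numeric(train.x %*% c(1, -2, 3, 0.5, -1) + rnorm(n, sd = 0.1))

    # Network: one fully connected layer with a built-in squared-error output
    data  <- mx.symbol.Variable("data")
    label <- mx.symbol.Variable("label")
    fc1   <- mx.symbol.FullyConnected(data, num_hidden = 1, name = "fc1")
    lro   <- mx.symbol.LinearRegressionOutput(fc1, name = "lro")

    # Train with plain SGD; rows are samples, hence array.layout = "rowmajor"
    mx.set.seed(0)
    model <- mx.model.FeedForward.create(
      lro, X = train.x, y = train.y,
      ctx = mx.cpu(), num.round = 20, array.batch.size = 50,
      learning.rate = 1e-3, momentum = 0.9,
      eval.metric = mx.metric.rmse,
      array.layout = "rowmajor")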

Besides the LinearRegressionOutput, we also provide LogisticRegressionOutput and MAERegressionOutput. However, this might not be enough for real-world models. You can provide your own loss function by using mx.symbol.MakeLoss when constructing the network.
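
As a quick illustration (a sketch reusing fc1 from the example above), the alternative built-in outputs are swapped in the same way, each attaching its own loss:

    # Alternative built-in output losses
    lro_logistic <- mx.symbol.LogisticRegressionOutput(fc1, name = "logistic")
    lro_mae      <- mx.symbol.MAERegressionOutput(fc1, name = "mae")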

How to Use Your Own Loss Function

We still use our previous example, but this time we use mx.symbol.MakeLoss to minimize (pred - label)^2 directly.
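
A sketch of what that looks like, reusing the data, label, and fc1 symbols defined earlier. The Reshape call is an assumption made here to flatten the (batch, 1) output of fc1 so that it lines up with the label vector; it is not taken from the original listing.

    # Custom loss: the squared difference between prediction and label
    lro2 <- mx.symbol.MakeLoss(
      mx.symbol.square(mx.symbol.Reshape(fc1, shape = 0) - label),
      name = "lro2")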

Then we can train the network just as usual.
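
For example, with the same hyperparameters as before (the rmse metric is dropped here because the network's output is now the loss value rather than a prediction):

    mx.set.seed(0)
    model2 <- mx.model.FeedForward.create(
      lro2, X = train.x, y = train.y,
      ctx = mx.cpu(), num.round = 20, array.batch.size = 50,
      learning.rate = 1e-3, momentum = 0.9,
      array.layout = "rowmajor")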

We should get very similar training results, because we are actually minimizing the same loss function. However, the output returned by predict is quite different for the two models.
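
To see this, compare a few outputs from the two trained models (a sketch; predict in the MXNet R package returns one column per sample, hence the transpose):

    pred  <- predict(model,  train.x, array.layout = "rowmajor")
    pred2 <- predict(model2, train.x, array.layout = "rowmajor")
    head(t(pred))   # sensible regression predictions
    head(t(pred2))  # not predictions at all -- see below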

This is because the output of mx.symbol.MakeLoss is the gradient of the loss with respect to the input data. We can get the real prediction as shown below.
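
One way to do that, sketched below, is to pull the output of the fully connected layer out of the trained symbol (the internals/outputs helpers and the "fc1_output" name follow the layer name used in the sketches above) and wrap it in a new feed-forward model that shares the trained weights:

    # Extract the fc1 output from the trained symbol
    all_layers <- internals(model2$symbol)
    fc_output  <- all_layers[[match("fc1_output", outputs(all_layers))]]

    # Re-wrap the trained parameters around the prediction symbol
    model3 <- list(symbol = fc_output,
                   arg.params = model2$arg.params,
                   aux.params = model2$aux.params)
    class(model3) <- "MXFeedForwardModel"

    # predict() now returns the actual regression predictions
    pred3 <- predict(model3, train.x, array.layout = "rowmajor")
    head(t(pred3))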

We have provided many operations on the symbols. An example of |pred - label| can be found below.
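
A sketch of that loss, again reusing fc1 and label from above and assuming mx.symbol.abs for the absolute value:

    # Absolute-error custom loss: |pred - label|
    lro_abs <- mx.symbol.MakeLoss(
      mx.symbol.abs(mx.symbol.Reshape(fc1, shape = 0) - label),
      name = "lro_abs")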

Next Steps