Custom loss function from a pre-trained network


I’m trying to build an optimisation objective based on the output of an existing trained network. So far I haven’t found any hints on how to implement a loss function that doesn’t rely on basic operations.

How could I use a network as a loss function? Any hints on where to dig?

Hi, if you are using Gluon this is trivial. gluon.loss.Loss is derived from HybridBlock, so it is exactly as if you were defining a custom layer (see also this):

from mxnet.gluon.model_zoo import vision
from mxnet import gluon
from mxnet.gluon.loss import Loss

class CustomLoss(Loss):
    def __init__(self, weight=None, batch_axis=0, **kwargs):
        super(CustomLoss, self).__init__(weight, batch_axis, **kwargs)

        with self.name_scope():
            # Define/load your pretrained network here.
            # This is just an example from the model zoo: the network is
            # initialized with fixed (pretrained, non-trainable) layers.
            self.pretrained_net = vision.resnet18_v1(pretrained=True)  # assuming you want the whole network

    def hybrid_forward(self, F, predictions, labels):
        # Assuming "predictions" has the dimensionality the network expects.
        # How you combine predictions and labels from a pretrained network
        # depends on your use case.
        return self.pretrained_net(predictions)

Then you use it with:

myLoss = CustomLoss()  # you may want to add specific params in your constructor
preds, labels = ...
loss = myLoss(preds, labels)  # it can be a scalar/vector/matrix, so you post-process it the way you want

I am using this formalism to derive my custom Jaccard index loss, etc. Hope this helps.


Wow, that’s a quick response. Thanks, I’ll give it a try.
