Hi, if you are using Gluon this is straightforward. `gluon.loss.Loss` derives from `HybridBlock`, so writing a custom loss is exactly like writing a custom layer (see also this):
```python
from mxnet.gluon.model_zoo import vision
from mxnet import gluon
from mxnet.gluon.loss import Loss

class CustomLoss(Loss):
    def __init__(self, weight=None, batch_axis=0, **kwargs):
        super(CustomLoss, self).__init__(weight, batch_axis, **kwargs)
        with self.name_scope():
            # Define/load your pretrained network here.
            # This is just an example from the model zoo:
            self.pretrained_net = vision.resnet18_v1(pretrained=True)  # assuming you want the whole network
            # Freeze the pretrained layers so they are non-trainable:
            self.pretrained_net.collect_params().setattr('grad_req', 'null')

    def hybrid_forward(self, F, _predictions, _labels):
        # assuming the network can be fed tensors with the dimensionality of "predictions"
        return self.pretrained_net(_predictions)
        # How to combine predictions and labels from a pretrained network depends on your use case
```
Then you use it with:

```python
myLoss = CustomLoss()  # you may want to add specific params in your constructor
preds, labels = ...
loss = myLoss(preds, labels)  # it can be a scalar/vector/matrix, so post-process it however you want
```
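On the open question of how to combine predictions and labels with a pretrained network: one common pattern is a perceptual-style loss, where you run both tensors through the frozen feature extractor and penalize the distance between the two feature maps. Here is a minimal NumPy sketch of that idea only; `features` is a hypothetical stand-in for the pretrained network (a fixed linear map plus ReLU), not a real API, and inside `hybrid_forward` you would use the pretrained block and the `F` namespace instead.

```python
import numpy as np

def features(x, w):
    # Hypothetical stand-in for a frozen pretrained feature extractor:
    # a fixed linear projection followed by a ReLU.
    return np.maximum(x @ w, 0.0)

def perceptual_loss(predictions, labels, w):
    # Compare the two inputs in feature space rather than in input space.
    f_pred = features(predictions, w)
    f_label = features(labels, w)
    return np.mean((f_pred - f_label) ** 2)

rng = np.random.default_rng(0)
w = rng.standard_normal((16, 8))      # fixed ("frozen") weights
preds = rng.standard_normal((4, 16))
labels = preds.copy()

print(perceptual_loss(preds, labels, w))  # identical inputs -> 0.0
```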
I use this pattern to define my own custom losses (a Jaccard index loss, among others). Hope this helps.
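For reference, the soft Jaccard (IoU) loss mentioned above reduces to a single formula. A NumPy sketch of just the math (in a Gluon `hybrid_forward` you would write the same expression with the `F` namespace):

```python
import numpy as np

def soft_jaccard_loss(pred, label, eps=1e-6):
    # pred: predicted probabilities in [0, 1]; label: binary ground truth.
    intersection = np.sum(pred * label)
    union = np.sum(pred) + np.sum(label) - intersection
    return 1.0 - (intersection + eps) / (union + eps)

mask = np.array([[1.0, 0.0], [0.0, 1.0]])
print(soft_jaccard_loss(mask, mask))        # perfect overlap -> 0.0
print(soft_jaccard_loss(1.0 - mask, mask))  # no overlap -> ~1.0
```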