Cosine loss error: Cannot differentiate node because it is not in a computational graph

This is my model function:
def mx_model_output_grad(mx_model, x, labels_batch):
    # convert the PyTorch tensors to MXNet NDArrays
    x = nd.array(x.cpu().detach().numpy(), ctx=mx.gpu())
    labels_batch = nd.array(labels_batch.detach().numpy())
    x_ = mx.io.DataBatch(data=(x,), label=(labels_batch,))
    # forward pass through the MXNet model
    mx_model.forward(x_, is_train=False)
    x_feature = mx_model.get_outputs()
    return x, x_feature

This is my loss:
consin_loss = gluon.loss.CosineEmbeddingLoss()

adv_x_nd, adv_feature = mx_model_output_grad(mx_model, adv_x, labels_batch)
adv_x_nd.attach_grad()
#adv_feature.attach_grad()
temp = nd.ones((1)).as_in_context(mx.gpu())
temp_label = -nd.ones((1)).as_in_context(mx.gpu())
adv_feature = adv_feature[0].as_in_context(mx.gpu())

with autograd.record():
    Loss = -consin_loss(target_feature[0], adv_feature[0], temp_label, temp)
#np.dot(target_feature, adv_feature) / (np.linalg.norm(target_feature) * np.linalg.norm(adv_feature))
#adv_x_nd = nd.array(adv_x.cpu().detach().numpy())
#adv_x_nd.attach_grad()
#Loss = -adv_out[0][target]
Loss.backward(retain_graph=True)

But I get this error:

mxnet.base.MXNetError: [07:33:57] src/imperative/imperative.cc:285: Check failed: !AGInfo::IsNone(*i) Cannot differentiate node because it is not in a computational graph. You need to set is_recording to true or use autograd.record() to save computational graphs for backward. If you want to differentiate the same graph twice, you need to pass retain_graph=True to backward.


Please provide the full code if possible.

As @mouryarishik mentioned, it would be really helpful to have a minimal reproducible example to identify the source of the problem. But from your exception message I can see that your network's forward pass most likely happens outside of autograd. Make sure that when you train your model, both the forward pass and the loss are computed inside the autograd.record() scope. It will look similar to the code below:

with autograd.record():
    o = net(x)                # forward pass recorded by autograd
    loss = loss_fn(o, label)  # loss computed inside the same scope

loss.backward()
trainer.step(x.shape[0])
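
Applied to your case, the point is that the feature extraction for adv_x_nd has to be recorded as well, not just the cosine loss. Below is a minimal sketch, assuming your model is available as a Gluon block called net and that target_feature is a precomputed (1, D) NDArray on the GPU (net and target_feature are placeholder names, not part of your code):

from mxnet import nd, autograd, gluon
import mxnet as mx

ctx = mx.gpu()
consin_loss = gluon.loss.CosineEmbeddingLoss()

# stand-in for your adversarial input converted from PyTorch
adv_x_nd = nd.random.uniform(shape=(1, 3, 112, 112), ctx=ctx)
adv_x_nd.attach_grad()

temp = nd.ones((1,), ctx=ctx)         # sample_weight
temp_label = -nd.ones((1,), ctx=ctx)  # label -1: push the two features apart

with autograd.record():
    adv_feature = net(adv_x_nd)       # forward pass recorded in the graph
    Loss = -consin_loss(target_feature, adv_feature, temp_label, temp)

Loss.backward()
input_grad = adv_x_nd.grad            # gradient of the loss w.r.t. the input

Also note that if mx_model is an mx.mod.Module, its symbolic forward pass is (as far as I know) not tracked by autograd at all, so wrapping only the loss in autograd.record() will still raise this error. In that case you could load the symbol and parameters into a gluon.nn.SymbolBlock and use it like net above, or rely on the Module's own gradient machinery (bind with inputs_need_grad=True and call mx_model.backward()).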