How can I get the output of the net's last layer (symbol.LinearRegressionOutput)?

Hi,
I tried to get the net loss by adding the loss to mx.sym.Group() like this:

density = self.get_Net(data)
Euclidean_loss = mx.sym.LinearRegressionOutput(data=density, label=label, grad_scale=1/(2.*BATCH_SIZE), name='Euclidean_loss')
group = mx.sym.Group([mx.sym.BlockGrad(data = density), Euclidean_loss])

When training the net, I printed texec.outputs and found that the loss is the same as density. I don't know why. Can anybody help me? I would really appreciate it.

I also tried to make my own loss using mx.sym.MakeLoss:

Euclidean_loss = mx.sym.MakeLoss(mx.sym.square(density - label).sum()/(2.*BATCH_SIZE))

When I trained the net again, the prediction came out as zero. I would really appreciate it if you can help me.

The API doc for LinearRegressionOutput states that it "just outputs data during forward propagation". This means the loss is only used to compute gradients in the backward() operation, so the forward output you see in texec.outputs is simply its input (density). If you want to be able to print the actual loss value, you'd have to write your own loss.
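
A minimal sketch of that approach with the symbolic API (the FullyConnected layer and BATCH_SIZE below are placeholders for your get_Net output and batch size): MakeLoss makes the loss itself a forward output, and BlockGrad keeps the prediction available alongside it.

import mxnet as mx

BATCH_SIZE = 32  # placeholder for your actual batch size

data = mx.sym.Variable('data')
label = mx.sym.Variable('label')
density = mx.sym.FullyConnected(data=data, num_hidden=1, name='density')  # stand-in for self.get_Net(data)

# MakeLoss turns the expression into the training objective, and its forward
# output is the loss value itself, so it shows up in texec.outputs.
euclidean_loss = mx.sym.MakeLoss(
    mx.sym.sum(mx.sym.square(density - label)) / (2. * BATCH_SIZE),
    name='euclidean_loss')

# BlockGrad exposes the prediction as a second output without backpropagating
# through that branch.
group = mx.sym.Group([mx.sym.BlockGrad(density, name='pred'), euclidean_loss])

# After binding: texec.outputs[0] is the prediction, texec.outputs[1] is the loss.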

I don't quite understand what you mean by "prediction is zero" when you use your custom loss. Also, I highly recommend switching to Gluon, as it makes problems like these a thing of the past :slight_smile:
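
For what it's worth, here is a rough Gluon sketch (the one-layer network and random data are just stand-ins for your model and inputs); the loss is an ordinary NDArray, so printing it is trivial:

import mxnet as mx
from mxnet import autograd, gluon, nd

net = gluon.nn.Dense(1)                        # stand-in for your density network
net.initialize()
l2_loss = gluon.loss.L2Loss()                  # 1/2 * (pred - label)^2 per sample
trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.01})

data = nd.random.uniform(shape=(32, 10))       # dummy batch
label = nd.random.uniform(shape=(32, 1))

with autograd.record():
    pred = net(data)
    loss = l2_loss(pred, label)
loss.backward()
trainer.step(batch_size=32)

print(loss.mean().asscalar())                  # the actual loss value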