MXNet symbol: how to set Loss(total) = 2*Loss_x + 1*Loss_y

With the MXNet symbol API, how can I set Loss(total) = 2*Loss_x + 1*Loss_y?
Any advice will be appreciated, thanks

You can use mx.sym.MakeLoss (https://mxnet.incubator.apache.org/versions/master/api/python/symbol/symbol.html#mxnet.symbol.MakeLoss):

loss_total = 2 * loss_x + loss_y    # weighted sum of the two loss symbols
loss = mx.sym.MakeLoss(loss_total)  # mark the sum as the training objective
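
If it helps, here is a minimal end-to-end sketch of how such a combined loss can be trained with the Module API. Everything in it (the toy network, the names data/label_x/label_y, the squared-error losses) is made up for illustration, so substitute your own symbols:

import mxnet as mx
import numpy as np

data    = mx.sym.Variable('data')
label_x = mx.sym.Variable('label_x')
label_y = mx.sym.Variable('label_y')

# a small shared trunk with two heads (placeholder architecture)
shared = mx.sym.FullyConnected(data, num_hidden=16, name='shared')
head_x = mx.sym.FullyConnected(shared, num_hidden=1, name='head_x')
head_y = mx.sym.FullyConnected(shared, num_hidden=1, name='head_y')

# two toy losses, e.g. mean squared errors
loss_x = mx.sym.mean(mx.sym.square(head_x - label_x))
loss_y = mx.sym.mean(mx.sym.square(head_y - label_y))

# weighted total loss; MakeLoss marks it as the training objective
total = mx.sym.MakeLoss(2 * loss_x + 1 * loss_y)

# toy data; the labels are consumed inside the loss, so list them explicitly
x  = np.random.rand(100, 8).astype('float32')
yx = np.random.rand(100, 1).astype('float32')
yy = np.random.rand(100, 1).astype('float32')
train_iter = mx.io.NDArrayIter({'data': x},
                               {'label_x': yx, 'label_y': yy},
                               batch_size=10)

mod = mx.mod.Module(total, data_names=['data'],
                    label_names=['label_x', 'label_y'])
mod.fit(train_iter, num_epoch=2, eval_metric=mx.metric.Loss(),
        optimizer='sgd', optimizer_params={'learning_rate': 0.01})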

Thanks for your reply.
What is the difference between "mx.sym.Group([2*loss_x, loss_y])" and "loss_total = 2*loss_x + loss_y; loss = mx.sym.MakeLoss(loss_total)"?
Any advice will be appreciated, thanks

Yes, you can in fact also use mx.sym.Group to group multiple loss layers together.

mx.sym.Group([2*loss_x, loss_y]) is not the same as 2*loss_x + loss_y.
In the second case you are adding two symbols, while with .Group you are building a symbolic list of the elements passed in.
So mx.sym.Group([2*loss_x, loss_y]) behaves like the list [2*loss_x, loss_y], whereas the other one is simply the single symbol 2*loss_x + loss_y.
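
If it helps, you can see this structural difference directly (again assuming loss_x and loss_y are already-defined loss symbols):

grouped = mx.sym.Group([2 * loss_x, loss_y])  # one symbol with two outputs
summed  = 2 * loss_x + loss_y                 # one symbol with one output

print(grouped.list_outputs())  # two entries, one per grouped symbol
print(summed.list_outputs())   # a single entry, the combined sum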

Thanks for your reply. For the backward pass, are the gradients the same for [2*loss_x, loss_y] and 2*loss_x + loss_y? I think the gradients are the same for these two kinds of losses, but I am not sure about that. Thanks

According to the Custom Loss + L2 Regularization thread and https://github.com/apache/incubator-mxnet/issues/2677, using mx.sym.Group will add the objectives during backpropagation. So in your case (adding two losses) you can use either mx.sym.MakeLoss or mx.sym.Group.
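
To convince yourself about the gradients, here is a small sketch that compares the gradient of a toy parameter w under both formulations; the toy losses and names are made up for illustration:

import mxnet as mx

data = mx.sym.Variable('data')
w    = mx.sym.Variable('w')
pred = data * w
loss_x = mx.sym.sum(mx.sym.square(pred - 1.0))  # toy loss
loss_y = mx.sym.sum(mx.sym.abs(pred))           # toy loss

sum_loss   = mx.sym.MakeLoss(2 * loss_x + loss_y)
group_loss = mx.sym.Group([mx.sym.MakeLoss(2 * loss_x), mx.sym.MakeLoss(loss_y)])

args = {'data': mx.nd.ones((4,)), 'w': mx.nd.array([0.5, -1.0, 2.0, 3.0])}
for sym in (sum_loss, group_loss):
    grads = {name: mx.nd.zeros((4,)) for name in args}
    exe = sym.bind(ctx=mx.cpu(), args=args, args_grad=grads)
    exe.forward(is_train=True)
    exe.backward()
    print(grads['w'].asnumpy())  # both variants print the same gradient for w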

Thanks @NRauschmayr for clarifying my misunderstanding, and sorry to @hdjsjyl for providing a wrong explanation.

Thanks for your answer. It is very clear to me.

No problem, thanks for your reply.