mx.sym.SoftmaxOutput: how to assign instance-wise weights to the loss function?

I think this is very common functionality: assigning an element-wise loss weight to
each sample in the batch.

After reading the documentation of mx.sym.SoftmaxOutput
(http://beta.mxnet.io/r/api/mx.symbol.SoftmaxOutput.html), I cannot find a solution.

SoftmaxOutput seems to act as a loss function, but it does not actually output the loss values,
so I cannot apply weights to the output of mx.sym.SoftmaxOutput.

So how can I assign sample-wise weights to the softmax loss using the symbolic API of mx.sym.SoftmaxOutput?
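
To make it concrete, here is a minimal symbol; the commented-out line shows the hypothetical sample_weight argument I am looking for (it does not exist in the real API):

import mxnet as mx

net = mx.sym.Variable('data')
net = mx.sym.FullyConnected(net, num_hidden=10)
label = mx.sym.Variable('softmax_label')
out = mx.sym.SoftmaxOutput(net, label=label)  # no per-sample weight argument
# what I want (hypothetical, NOT real API):
# out = mx.sym.SoftmaxOutput(net, label=label, sample_weight=weights)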

Hello @kaizhao,
in order to output the loss in MXNet's symbol API, you need to define a metric for it (e.g. Cross-Entropy).
I don't know of a way to adjust the loss weight on each batch without changing the symbol architecture on every update.
As an alternative, you can adjust the learning rate for each batch individually, which should have the same effect.

import mxnet as mx

# `model` is assumed to be an already-bound, initialized mx.mod.Module;
# `train_iter`, `batch_size`, `wd` and `my_custom_lr` come from the surrounding script.
metrics = [mx.metric.CrossEntropy()]
optimizer = mx.optimizer.NAG(momentum=0.9, wd=wd, rescale_grad=1.0 / batch_size)
model.init_optimizer(optimizer=optimizer)
for batch in train_iter:
    model.forward(batch, is_train=True)  # compute predictions
    for metric in metrics:  # update the metrics
        model.update_metric(metric, batch.label)
    optimizer.lr = my_custom_lr  # set this batch's custom learning rate
    model.backward()  # compute gradients
    model.update()  # update parameters

This, however, only enables batch-wise weight adjustment.
A different way of weighting individual samples is to adjust their sampling frequency.
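
For example, a rough NumPy sketch of weight-proportional sampling (the weights array here is just a made-up example):

import numpy as np

weights = np.array([1.0, 1.0, 2.0, 0.5, 3.0, 0.5])  # hypothetical per-sample weights
probs = weights / weights.sum()  # normalize into a sampling distribution
batch_idx = np.random.choice(len(weights), size=4, p=probs)  # draw one batch of indices
# On average a sample with twice the weight is drawn twice as often,
# which approximates doubling its contribution to the loss.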

Thanks for your reply.
However, your suggestion may not meet my requirements, because in my case the sample-wise weights
are computed online from the features of each sample.
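
For example (a made-up sketch of what I mean by "computed online"):

import numpy as np

def sample_weight(feature):
    # hypothetical rule: weight grows with the feature's magnitude
    return 1.0 + float(np.abs(feature).mean())

Since the weights depend on the features, they cannot be fixed ahead of time to set sampling frequencies.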