How to implement cross-entropy loss for binary segmentation using the Symbol API only?

Given a softmax output f of shape Bx2xHxW and a target label of shape Bx1xHxW, I want to implement the cross-entropy loss using the Symbol API only. This is my implementation:

import mxnet as mx

# f: softmax output symbol, shape (B, 2, H, W); target: label symbol, shape (B, 1, H, W)
target_squeeze = mx.sym.squeeze(target, axis=1)  # shape (B, H, W)
# One-hot encode the labels; on_value = -1.0 folds the minus sign of the
# cross-entropy loss into the encoding
target_squeeze = mx.sym.one_hot(target_squeeze, depth=2, on_value=-1.0, off_value=0.0)  # shape (B, H, W, 2)
# Transpose from (B, H, W, 2) to (B, 2, H, W) to match f
target_squeeze = mx.sym.transpose(target_squeeze, axes=(0, 3, 1, 2))
# Element-wise log of the softmax probabilities
f_log = mx.sym.log(f)
batch_size = 32
# Sum -t * log(p) over all elements and average over the batch
f_sum = mx.sym.sum(target_squeeze * f_log) / batch_size
f_sum = mx.sym.MakeLoss(f_sum, name='loss_ce')
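
For reference, this is the plain NumPy computation I expect the loss to reproduce (just a sanity-check sketch; ce_reference and the random test data are made up for the check and are not part of my network):

import numpy as np

def ce_reference(prob, label):
    # prob: (B, 2, H, W) softmax probabilities; label: (B, 1, H, W) integers in {0, 1}
    onehot = np.eye(2)[label.squeeze(1).astype(int)]  # (B, H, W, 2)
    onehot = onehot.transpose(0, 3, 1, 2)             # (B, 2, H, W)
    # -sum of onehot * log(prob) over all pixels, averaged over the batch size
    return -(onehot * np.log(prob)).sum() / prob.shape[0]

# random inputs for the check
B, H, W = 32, 8, 8
logits = np.random.randn(B, 2, H, W)
prob = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
label = np.random.randint(0, 2, size=(B, 1, H, W))
print(ce_reference(prob, label))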

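To evaluate the symbolic loss on those random inputs I declared f and target as plain mx.sym.Variable placeholders before building the graph above, and then bound it like this (again only a sketch; in my real network f is the output of a softmax layer, not a free Variable):

# Forward-only check of the symbolic loss against the NumPy reference
exe = f_sum.simple_bind(ctx=mx.cpu(), grad_req='null',
                        f=(B, 2, H, W), target=(B, 1, H, W))
exe.forward(is_train=False,
            f=mx.nd.array(prob),        # random softmax probabilities from above
            target=mx.nd.array(label))  # random labels from above
print(exe.outputs[0].asscalar())        # I expect this to match ce_reference(prob, label)
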
Is my implementation correct? If not, please correct it for me. Thanks in advance.