Failed to convert symbol for mixed precision inference

I am trying to run float16 inference with a symbolic model that was trained in float32. I had success with Gluon models, but not with symbols.

According to the docs, I add a float16 cast on the data symbol:

import mxnet as mx

# load the float32 checkpoint
sym, arg_params, aux_params = mx.model.load_checkpoint(param_path, 0)
# cast the data input to float16 and recompose the loaded symbol on top of it
data = mx.sym.Variable(name="data")
data = mx.sym.Cast(data=data, dtype='float16')
sym = sym(data=data)
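
I also assume the loaded weights eventually have to be float16 to match the casted graph, so I cast the parameter dictionaries as well (not sure whether bind itself needs this, it is just a guess on my part):

# cast the float32 weights to float16 (my assumption, not from the docs)
arg_params = {k: v.astype('float16') for k, v in arg_params.items()}
aux_params = {k: v.astype('float16') for k, v in aux_params.items()}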

Then I try to bind the symbol, which results in an error:

# inference-only module; the saved symbol has no label inputs
self.mod = mx.mod.Module(symbol=sym, context=ctx, label_names=None)
self.mod.bind(for_training=False, data_shapes=[('data', (1, 3, 512, 512))], label_shapes=self.mod._label_shapes)

Error in operator relu4_3_anchors: [23:58:11] include/mxnet/operator.h:228: Check failed: in_type->at(i) == mshadow::default_type_flag || in_type->at(i) == -1 Unsupported data type 2

Any idea how to convert the symbol properly?

Do you have to use the symbol API, or can you use SymbolBlock in Gluon with the saved symbolic model?
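
Something along these lines, where the symbol/params file names are just placeholders for whatever your checkpoint prefix produced:

# load the saved symbolic model into Gluon and cast the whole block to float16
net = mx.gluon.SymbolBlock.imports('model-symbol.json', ['data'], 'model-0000.params', ctx=ctx)
net.cast('float16')
out = net(mx.nd.zeros((1, 3, 512, 512), dtype='float16', ctx=ctx))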

I guess I could use Gluon SymbolBlock. My current hypothesis is that one of the contrib operators in this model does not support float16.
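
One way I could probably verify that is to run type inference on the composed symbol directly; I'd expect it to trip the same check outside of bind (just a sketch):

import numpy as np

# type inference alone should hit the same unsupported-dtype check in the failing operator
arg_types, out_types, aux_types = sym.infer_type(data=np.float16)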

Possible. Did you try SymbolBlock? Do you get the same thing?

Yeah, unfortunately. I will stick to float32 for the time being then.