Load an Amazon SageMaker NTM model locally for inference

I have trained an NTM (Neural Topic Model) directly on the AWS SageMaker platform. Once training is complete, you can download the MXNet model artifact. Unpacked, it contains:

  • params
  • symbol.json
  • meta.json

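For context, this is roughly how I fetch and unpack the artifact locally; the model.tar.gz name and the extraction path are just what they look like in my setup, and the exact file names inside may differ slightly:

import tarfile

# Unpack the model artifact downloaded from the training job's output location.
# 'model.tar.gz' and 'ntm_model' are placeholders for my local paths.
with tarfile.open('model.tar.gz', 'r:gz') as archive:
    archive.extractall(path='ntm_model')
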
I have followed the MXNet docs for loading the model and have the following code:

import mxnet as mx

# Load the checkpoint unpacked from the SageMaker artifact (epoch 0).
sym, arg_params, aux_params = mx.model.load_checkpoint('model_algo-1', 0)
module_model = mx.mod.Module(symbol=sym, label_names=None, context=mx.cpu())

# VOCAB_SIZE is the vocabulary size the model was trained with.
module_model.bind(
    for_training=False,
    data_shapes=[('data', (1, VOCAB_SIZE))]
)

# allow_missing=True is required here, otherwise I get an error about a missing n_epoch var.
module_model.set_params(arg_params=arg_params, aux_params=aux_params, allow_missing=True)
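
In case it is relevant, the loaded symbol can be inspected like this to see which outputs and internal layers it exposes (list_outputs and get_internals are standard mx.sym.Symbol methods):

# List the outputs of the loaded symbol and all of its internal layer outputs.
print(sym.list_outputs())
print(sym.get_internals().list_outputs())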

I now try to use the model for inference with:

module_model.predict(x) # x is a numpy array of shape (1, VOCAB_SIZE)

The code runs, but the result is just a single value, whereas I expect a distribution over the topics:

[11.060672]
<NDArray 1 @cpu(0)>
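
For completeness, this is roughly how the input is built and passed on my side; the token counts below are placeholder values, and the NDArrayIter variant is shown only because predict also accepts a DataIter:

import numpy as np

# Placeholder bag-of-words vector; in reality x comes from my vectorised document.
x = np.zeros((1, VOCAB_SIZE), dtype='float32')
x[0, 0] = 3.0  # e.g. token 0 appears three times (illustrative only)

# Same call as above, passing the numpy array directly:
result = module_model.predict(x)

# Alternative: wrap the array in a DataIter using the bound data name 'data'.
data_iter = mx.io.NDArrayIter(data={'data': x}, batch_size=1)
result_iter = module_model.predict(data_iter)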

Any help would be great!