Saving and loading a BERT model

Hi, I have followed this to build a similar BERT model. I want to know if there is a way to save this classifier and then use it in a separate script to make predictions. Thanks.

Is your separate script going to be in Python as well?

If so, you can save the parameters of the trained BERT model using .save_parameters().

Recreate the model architecture in your inference script and then reload the saved parameters using .load_parameters().

Essentially you want something like:

bert_arch, vocabulary = nlp.model.get_model('bert_12_768_12',
                                             pretrained=False, ctx=ctx, use_pooler=True,
                                             use_decoder=False, use_classifier=False)
bert_classifier = model.classification.BERTClassifier(bert_arch, num_classes=num_classes, dropout=dropout)
# then reload the weights saved from your training script:
bert_classifier.load_parameters('bert_classifier.params', ctx=ctx)  # filename is whatever you passed to .save_parameters()

in your inference code.


Thanks, I tried this and it worked :wink:

@chidauri, please, explain how to predict after loading parameters. Thanks!

I hope it is not too late :stuck_out_tongue:
You keep everything else the same, but load the trained params instead of training again, then run the forward pass on your new inputs.
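To make the prediction step concrete, here is a hedged sketch: `preprocess` and `predict` are hypothetical helper names, `max_len=128` is an assumed sequence cap, and the `[CLS]`/`[SEP]` framing follows the usual single-sentence BERT classification setup. It assumes `classifier` is the reloaded model and `vocab` maps wordpiece tokens to ids.

```python
def preprocess(tokens, vocab, max_len=128):
    # Frame the wordpieces as [CLS] tokens [SEP], map them to ids,
    # and build the valid length and segment ids BERT expects.
    pieces = ['[CLS]'] + tokens[:max_len - 2] + ['[SEP]']
    ids = [vocab[t] for t in pieces]
    valid_len = len(ids)
    segments = [0] * valid_len      # single sentence -> all segment 0
    return ids, segments, valid_len

def predict(classifier, tokens, vocab):
    import mxnet as mx              # imported here so preprocess stays framework-free
    ids, segments, valid_len = preprocess(tokens, vocab)
    words = mx.nd.array([ids])      # batch of one
    segs = mx.nd.array([segments])
    lens = mx.nd.array([valid_len])
    scores = classifier(words, segs, lens)          # shape (1, num_classes)
    return int(scores.argmax(axis=1).asscalar())    # predicted class index
```

No gradients or trainer are needed at inference time; a plain forward pass on the reloaded parameters is enough.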


Thank you very much!