How to initialize a BatchNorm with existing weights?

Hey there,

I am attempting to create a BatchNorm layer whose weights (gamma, beta, moving_mean, moving_var) come from an earlier training run.

I’m essentially doing this:

Read in gamma, beta, movingMean, and movingVar, then create a Symbol and an NDArray for each of them.
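
Concretely, the loading step looks roughly like this (a simplified sketch: the file name is a placeholder, `name` is the std::string I later pass to BatchNorm, and the "_gamma"/"_beta"/"_moving_mean"/"_moving_var" suffixes are just my naming convention):

namespace mx = mxnet::cpp;  // from "mxnet-cpp/MxNetCpp.h"

// Load the saved parameters into a name -> NDArray map.
std::map<std::string, mx::NDArray> params;
mx::NDArray::Load("batchnorm.params", nullptr, &params);

// Create a placeholder Symbol for each parameter; the names only have
// to match the keys used later when binding the executor.
mx::Symbol gammaSymbol          = mx::Symbol::Variable(name + "_gamma");
mx::Symbol betaSymbol           = mx::Symbol::Variable(name + "_beta");
mx::Symbol movingMeanSymbol     = mx::Symbol::Variable(name + "_moving_mean");
mx::Symbol movingVarianceSymbol = mx::Symbol::Variable(name + "_moving_var");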

Then I call BatchNorm like this:
double eps = 0.001;              // epsilon added to the variance to avoid division by zero
mx_float momentum = 0.9;         // momentum for the moving averages; should be used
bool fix_gamma = false;          // use the loaded gamma instead of fixing it at 1
bool use_global_stats = false;   // batch statistics, not the moving averages
bool output_mean_var = false;    // only the normalized output
int axis = 1;                    // channel axis
bool cudnn_off = false;

mx::Symbol layer = mx::BatchNorm(
	name,
	previous,
	gammaSymbol,
	betaSymbol,
	movingMeanSymbol,
	movingVarianceSymbol,
	eps,
	momentum,
	fix_gamma,
	use_global_stats,
	output_mean_var,
	axis,
	cudnn_off
);

But the layer seems to output zeros, which makes me think I need to do something else. Keras doesn’t use the mxnet BatchNorm operator for prediction (it uses a function it generates itself). Should I do the same? Is it possible to use the BatchNorm layer for prediction only?
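
In case it matters, this is roughly how I bind and run the layer afterwards (again a sketch: inputArray stands in for my real input batch, previous is assumed to be a Variable named "data", and params is the map loaded above):

// gamma and beta are ordinary arguments, but the moving statistics are
// auxiliary states in MXNet, so they go into aux_map rather than args_map.
std::map<std::string, mx::NDArray> args_map;
args_map["data"]          = inputArray;  // placeholder for the real input
args_map[name + "_gamma"] = params[name + "_gamma"];
args_map[name + "_beta"]  = params[name + "_beta"];

std::map<std::string, mx::NDArray> aux_map;
aux_map[name + "_moving_mean"] = params[name + "_moving_mean"];
aux_map[name + "_moving_var"]  = params[name + "_moving_var"];

mx::Executor* exec = layer.SimpleBind(
	mx::Context::cpu(), args_map,
	std::map<std::string, mx::NDArray>(),    // no gradient buffers
	std::map<std::string, mx::OpReqType>(),  // no gradients requested
	aux_map);

exec->Forward(false);  // is_train = false, i.e. prediction
mx::NDArray result = exec->outputs[0];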

Thanks for any hints,

Tim