Parameter '_init_scale' was not initialized on context cpu(0). It was only initialized on [gpu(1)]?

When I pass the ctx to get_model, this error occurs:

RuntimeError: Parameter 'ssd0_vggatrousextractor0_init_scale' was not initialized on context cpu(0). It was only initialized on [gpu(1)].

The original SSD code is:

ctx = [mx.gpu(1)]
net = get_model(net_name, pretrained_base=True)
net.initialize()
async_net.initialize()

And the code I changed it to is:

ctx = [mx.gpu(1)]
net = get_model(net_name, pretrained_base=True, ctx=ctx)
net.initialize(force_reinit=True, ctx=ctx)
async_net.initialize(force_reinit=True, ctx=ctx)

From the error message about 'ssd0_vggatrousextractor0_init_scale', I found the part of the code that creates init_scale:

with self.name_scope():
    # we use pre-trained weights from caffe, initial scale must change
    init_scale = mx.nd.array([0.229, 0.224, 0.225]).reshape((1, 3, 1, 1)) * 255  ###!!!
    self.init_scale = self.params.get_constant('init_scale', init_scale)  ###!!!
    self.stages = nn.HybridSequential()
    for l, f in zip(layers, filters):
        stage = nn.HybridSequential(prefix='')
        with stage.name_scope():
            for _ in range(l):
                stage.add(nn.Conv2D(f, kernel_size=3, padding=1, **self.init))
                if batch_norm:
                    stage.add(nn.BatchNorm())
                stage.add(nn.Activation('relu'))
        self.stages.add(stage)

Does the error mean I should initialize init_scale on the CPU?
I tried adding the ctx to init_scale, but it did not work.
So, what is the problem, and what should I do?

Could you please provide a minimal reproducible example that I can run? Thanks.

Hi,

From my experiments with GluonCV, I have learned that the code was written under the assumption that you first create and load your model's parameters on the CPU (so specify ctx=None), and only when all that is done do you move the whole model to the GPU with:

net.collect_params().reset_ctx(ctx=[mx.gpu(1)])
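
Applied to your snippet, a minimal sketch would look like this (I am assuming async_net is built the same way as net, as in the GluonCV training script):

ctx = [mx.gpu(1)]
net = get_model(net_name, pretrained_base=True)  # build and load everything on the CPU first
net.initialize()  # constants such as init_scale are created on cpu(0)
async_net.initialize()
net.collect_params().reset_ctx(ctx=ctx)  # move all parameters, constants included, to gpu(1)
# async_net can stay on the CPU if it is only used by the CPU-side data pipeline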

hth,
Lieven
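
PS: if you want to verify the pattern in isolation, here is a small self-contained sketch. ScaledBlock is a made-up block that registers a constant the same way VGGAtrousBase does; it is not part of GluonCV:

import mxnet as mx
from mxnet.gluon import nn

class ScaledBlock(nn.HybridBlock):
    # toy block with a constant, mimicking init_scale in VGGAtrousBase
    def __init__(self, **kwargs):
        super(ScaledBlock, self).__init__(**kwargs)
        with self.name_scope():
            scale = mx.nd.array([0.229, 0.224, 0.225]).reshape((1, 3, 1, 1)) * 255
            self.init_scale = self.params.get_constant('init_scale', scale)

    def hybrid_forward(self, F, x, init_scale):
        # registered constants are passed to hybrid_forward by name
        return F.broadcast_mul(x, init_scale)

net = ScaledBlock()
net.initialize()  # the constant is created on cpu(0)
net.collect_params().reset_ctx(mx.gpu(1))  # moves the constant along with everything else
out = net(mx.nd.ones((1, 3, 2, 2), ctx=mx.gpu(1)))  # no 'not initialized on context' error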