Dear all,
I’m trying to use a pretrained VGG in my model. Here is a minimal example:
import mxnet as mx
from mxnet import gluon
from mxnet.gluon.model_zoo import vision

num_devices = mx.context.num_gpus()
print(f"GPUs found: {num_devices}")
ctx = [mx.gpu(i) for i in range(num_devices)]
batch_size = config.BATCH_SIZE * num_devices

network = vision.get_model(name="vgg16", pretrained=True)
with network.name_scope():
    network.output = gluon.nn.Dense(units=config.NUM_CLASSES)
The problem appears when I try to configure the gluon.Trainer:
opt = "sgd"
opt_params = {"learning_rate": 0.01, "wd": 0.0001, "momentum": 0.9}

trainer = gluon.Trainer(params=network.collect_params().initialize(ctx=ctx),
                        optimizer=opt,
                        optimizer_params=opt_params)
Then I get the following error:

Traceback (most recent call last):
  File "fine_tune.py", line 72, in <module>
    trainer = gluon.Trainer(params=network.collect_params().initialize(ctx=ctx),
  File "/opt/anaconda/anaconda3/envs/dl/lib/python3.8/site-packages/mxnet/gluon/trainer.py", line 81, in __init__
    raise ValueError(
ValueError: First argument must be a list or dict of Parameters, got <class 'NoneType'>.
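Reading the traceback, my guess is that the chained call is the culprit: `initialize()` seems to set up the parameters in place and, like many mutating setup methods in Python, return None, so the Trainer receives None instead of the parameter collection. A stripped-down, MXNet-free sketch of the pattern I think I’m hitting (the `ParamDict` class here is hypothetical, just for illustration):

```python
class ParamDict:
    """Hypothetical stand-in for a Gluon-style parameter collection."""

    def __init__(self):
        self.initialized = False

    def initialize(self, ctx=None):
        # Mutates state in place and implicitly returns None,
        # like many in-place setup methods in Python.
        self.initialized = True


params = ParamDict()
result = params.initialize(ctx="gpu(0)")

# Chaining collect_params().initialize(...) would hand this None
# to the Trainer, which then rejects it.
print(result)  # None
```

If that reading is right, initializing first and passing the collection separately should avoid the None, but I may be missing something.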
Any idea how to solve this issue?
Thanks