I tried to create a Gluon building block using mxnet.ndarray functions, in the manner described in:
https://mxnet.incubator.apache.org/tutorials/gluon/gluon.html
Every act_type works for mxnet.ndarray.LeakyReLU except 'prelu'.
https://mxnet.incubator.apache.org/versions/0.12.0/api/python/ndarray/ndarray.html#mxnet.ndarray.LeakyReLU
It supports the following act_type values:
• elu: Exponential Linear Unit. y = x > 0 ? x : slope * (exp(x)-1)
• leaky: Leaky ReLU. y = x > 0 ? x : slope * x
• prelu: Parametric ReLU. Same as leaky, except that the slope is learned during training.
• rrelu: Randomized ReLU. Same as leaky, but the slope is drawn uniformly at random from [lower_bound, upper_bound) during training, and fixed at (lower_bound + upper_bound)/2 for inference.
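For concreteness, the four variants can be sketched in plain NumPy (using the operator's documented default slope of 0.25 and rrelu bounds; the function names are just illustrative):

```python
import numpy as np

def elu(x, slope=0.25):
    # y = x if x > 0 else slope * (exp(x) - 1)
    return np.where(x > 0, x, slope * (np.exp(x) - 1))

def leaky(x, slope=0.25):
    # y = x if x > 0 else slope * x
    return np.where(x > 0, x, slope * x)

def prelu(x, gamma):
    # same as leaky, but the slope `gamma` is a learned parameter,
    # i.e. an extra input rather than a fixed hyperparameter
    return np.where(x > 0, x, gamma * x)

def rrelu(x, lower_bound=0.125, upper_bound=0.334, training=True):
    # training: slope drawn uniformly from [lower_bound, upper_bound)
    # inference: slope fixed at the midpoint
    if training:
        slope = np.random.uniform(lower_bound, upper_bound, size=x.shape)
    else:
        slope = (lower_bound + upper_bound) / 2
    return np.where(x > 0, x, slope * x)
```

Note that prelu is the only variant whose slope is data (a trainable tensor) rather than an attribute, which is presumably why the operator treats it differently.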
I can get it to work fine for every variant except 'prelu', where I get the following error:
Operator LeakyReLU expects 2 inputs, but got 1 instead.
The documentation does not make it clear what the second input should be. Ideas?
# Trying to use the ndarray LeakyReLU class of functions in Gluon
# https://github.com/apache/incubator-mxnet/blob/master/src/operator/leaky_relu-inl.h
class Net(gluon.Block):
    def __init__(self, **kwargs):
        super(Net, self).__init__(**kwargs)
        with self.name_scope():
            # layers created in name_scope will inherit name space
            # from parent layer.
            self.conv1 = nn.Conv2D(channels=K1, kernel_size=3)
            self.conv2 = nn.Conv2D(channels=K2, kernel_size=3)
            self.conv3 = nn.Conv2D(channels=K3, kernel_size=3)
            self.conv4 = nn.Conv2D(channels=K4, kernel_size=3)
            self.fc1 = nn.Dense(num_fc)
            self.fc2 = nn.Dense(num_outputs)

    def forward(self, x):
        x = F.LeakyReLU(self.conv1(x), act_type='elu')
        x = F.LeakyReLU(self.conv2(x), act_type='leaky')
        x = F.LeakyReLU(self.conv3(x), act_type='rrelu')
        # if I change the 'prelu' on the next line to 'elu', 'leaky',
        # or 'rrelu' it works
        x = F.LeakyReLU(self.conv4(x), act_type='prelu')
        x = x.reshape((0, -1))
        x = F.LeakyReLU(self.fc1(x))
        x = self.fc2(x)
        return x

net = Net()
Here is the full stack trace that I get back:
[08:07:54] /Users/travis/build/dmlc/mxnet-distro/mxnet-build/dmlc-core/include/dmlc/logging.h:308: [08:07:54] src/c_api/c_api_ndarray.cc:76: Check failed: num_inputs == infered_num_inputs (1 vs. 2) Operator LeakyReLU expects 2 inputs, but got 1 instead.

Stack trace returned 5 entries:
[bt] (0) 0 libmxnet.so 0x0000000109c958d8 _ZN4dmlc15LogMessageFatalD2Ev + 40
[bt] (1) 1 libmxnet.so 0x000000010aae7d2a _Z13SetNumOutputsPKN4nnvm2OpERKNS_9NodeAttrsERKiPiS8_ + 730
[bt] (2) 2 libmxnet.so 0x000000010aae8658 _Z22MXImperativeInvokeImplPviPS_PiPS0_iPPKcS5_ + 232
[bt] (3) 3 libmxnet.so 0x000000010aae8ba4 MXImperativeInvokeEx + 164
[bt] (4) 4 _ctypes.cpython-36m-darwin.so 0x00000001092e549f ffi_call_unix64 + 79

Traceback (most recent call last):
  File "gluon-mnist-v6.py", line 195, in <module>
    output = net(data)
  File "/Users/bc/mxnet/lib/python3.6/site-packages/mxnet/gluon/block.py", line 290, in __call__
    return self.forward(*args)
  File "gluon-mnist-v6.py", line 106, in forward
    x = F.LeakyReLU(self.conv4(x), act_type='prelu')
  File "<string>", line 61, in LeakyReLU
  File "/Users/bc/mxnet/lib/python3.6/site-packages/mxnet/_ctypes/ndarray.py", line 92, in _imperative_invoke
    ctypes.byref(out_stypes)))
  File "/Users/bc/mxnet/lib/python3.6/site-packages/mxnet/base.py", line 146, in check_call
    raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: [08:07:54] src/c_api/c_api_ndarray.cc:76: Check failed: num_inputs == infered_num_inputs (1 vs. 2) Operator LeakyReLU expects 2 inputs, but got 1 instead.

Stack trace returned 5 entries:
[bt] (0) 0 libmxnet.so 0x0000000109c958d8 _ZN4dmlc15LogMessageFatalD2Ev + 40
[bt] (1) 1 libmxnet.so 0x000000010aae7d2a _Z13SetNumOutputsPKN4nnvm2OpERKNS_9NodeAttrsERKiPiS8_ + 730
[bt] (2) 2 libmxnet.so 0x000000010aae8658 _Z22MXImperativeInvokeImplPviPS_PiPS0_iPPKcS5_ + 232
[bt] (3) 3 libmxnet.so 0x000000010aae8ba4 MXImperativeInvokeEx + 164
[bt] (4) 4 _ctypes.cpython-36m-darwin.so 0x00000001092e549f ffi_call_unix64 + 79
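For anyone hitting the same error, a hedged guess based on the operator source linked in the code comment above (leaky_relu-inl.h): the second input that 'prelu' expects appears to be gamma, the learnable slope tensor, which would have to be created as a trainable array and passed alongside the data. A plain-NumPy sketch of what such a two-input prelu would compute, assuming gamma holds one learned slope per channel of an NCHW blob:

```python
import numpy as np

# Hypothetical stand-ins for the two inputs the operator seems to want
# with act_type='prelu': the data blob and a learnable per-channel slope.
x = np.random.randn(2, 4, 8, 8)   # NCHW activations
gamma = np.full(4, 0.25)          # one trainable slope per channel

# Broadcast gamma over the channel axis; positive entries pass through,
# negative entries are scaled by that channel's learned slope.
y = np.where(x > 0, x, gamma[None, :, None, None] * x)

assert y.shape == x.shape
```

If that reading is right, the Gluon-side fix would be to register such a gamma as a trainable parameter in `__init__` (e.g. via `self.params.get(...)`) and pass it as the second argument to `F.LeakyReLU(..., act_type='prelu')` — but I have not verified this against the 0.12.0 API.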