Greetings,
I’m quite new to mxnet. I have scoured most NN libraries (TF has no custom backward pass, and CNTK doesn’t seem to support weights and biases for a custom layer) trying to find one that would let me create a custom layer. I know that a custom Gluon block allows a custom forward pass along with weights and biases in the layer (as opposed to a simple custom op).
I know that it is possible to write a custom backward pass (though I am unsure about the inputs and outputs of that function; some docs on that would be nice). I also know that it’s possible (and suggested) to let autograd take care of computing the gradients, but I wanted to know whether autograd is sufficient to reproduce the backward pass of a convolution layer (I know the backward pass itself is fairly simple, but I hope such an automatic pass is enough for my layer, since it’s a variant of the convolution).
So, can and should I use autograd, or should I go for a custom backward pass (and if the latter, what would the inputs and outputs of the backward pass be)?
Thanks