Custom block backward pass

Greetings,

I’m quite new to MXNet. I have scoured most NN libraries (TF has no custom backward pass, and CNTK doesn’t seem to be able to use a weight and bias for a custom layer) in order to find one that would let me create a custom layer. I know that using a custom gluon Block allows for a custom forward pass and for a weight and bias in the layer (as opposed to a simple custom op).

I know that it is possible to write a custom backward pass (though I am unsure about the inputs and outputs of that pass; some docs on that would be nice). I also know that it’s possible (and suggested) to use autograd to take care of computing the gradients, but I wanted to know whether autograd is sufficient to reproduce the backward pass of a convolution layer (I know the backward pass itself is fairly simple, but I hope that is enough for my layer, since it’s a variant of the convolution).

So, can I, and should I, use autograd, or should I go for a custom backward pass (and what would be the inputs and outputs of that backward pass)?
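For context, this is roughly the kind of custom block I have in mind (a minimal sketch; the class name, the shapes, and the plain mx.nd.Convolution call are only placeholders for my actual variant), where I would hope autograd derives the backward pass from the NDArray ops in forward:

import mxnet as mx
from mxnet import gluon

class MyConvVariant(gluon.Block):
    def __init__(self, channels, in_channels, kernel_size, **kwargs):
        super(MyConvVariant, self).__init__(**kwargs)
        with self.name_scope():
            # the layer's own weight and bias parameters
            self.weight = self.params.get(
                'weight', shape=(channels, in_channels, kernel_size, kernel_size))
            self.bias = self.params.get('bias', shape=(channels,))

    def forward(self, x):
        # placeholder for my variant: any NDArray ops used here are
        # recorded by autograd, which should then derive the backward pass
        return mx.nd.Convolution(data=x, weight=self.weight.data(), bias=self.bias.data(),
                                 kernel=self.weight.shape[2:], num_filter=self.weight.shape[0])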

Thanks

Okay, let me have a stab at this: when I did this I simply extended autograd by inheriting from autograd.Function and then using that:

import mxnet as mx
from mxnet import autograd

class square(autograd.Function):
    def forward(self, x):
        y = mx.nd.square(x)
        # save the input for use in the backward pass
        self.save_for_backward(x)
        return y

    def backward(self, dy):
        x, = self.saved_tensors
        # d(x^2)/dx = 2x, chained with the incoming gradient dy
        return dy * 2 * x
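Using it then looks roughly like this (a quick sketch; the input is just a dummy example):

# assuming the imports and the square class from above
x = mx.nd.array([1.0, 2.0, 3.0])
x.attach_grad()
with autograd.record():
    y = square()(x)   # a fresh Function instance per forward call
y.backward()
print(x.grad)         # gradient is 2 * x -> [2. 4. 6.]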

Maybe @piiswrong can correct me if I messed it up - but that worked for me

Interesting,

If I recall correctly, that means I would have to register/associate that function with the custom block?

Or is that a full-fledged layer on its own (considering it has both the forward and backward pass)?
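Something like this is what I am picturing (a rough sketch, where the square Function from above is simply called inside the block's forward):

from mxnet import gluon

class SquareLayer(gluon.Block):
    def __init__(self, **kwargs):
        super(SquareLayer, self).__init__(**kwargs)

    def forward(self, x):
        # the custom Function is just called like an op inside forward,
        # with a new instance per call, as in the snippet above
        return square()(x)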