Typically you would implement a custom operator only when the operation cannot be performed using existing `ndarray` operators; in that case both the forward and backward equations must be hand-implemented.

If your forward equation can be implemented using `ndarray` operators, there is no need to implement a custom operator. Your equations can simply run under the `autograd` scope, and gradients are computed for you:

```
with autograd.record():
    out = x * y + k
```

Alternatively, if you are composing a neural network with several layers and you simply want one of those layers to be custom, what you want is a custom Gluon block, not a custom operator:

```
class MyBlock(gluon.HybridBlock):
    def __init__(self, **kwargs):
        super(MyBlock, self).__init__(**kwargs)
        # k is a learnable bias in this case
        self.k = self.params.get('k', shape=(100,))

    def hybrid_forward(self, F, x, y, k):
        return x * y + k
```