Hi @phschmid,
indeed, I can verify that if you encapsulate your network in a HybridSequential it gives the error you mention. I recommend avoiding HybridSequential here; instead, write another HybridBlock and put everything you want inside it. I hardly ever use HybridSequential, unless I want to add more standard layers. For the sake of example, say you want to stack BatchNorm and Conv2D layers; I would write my code like this (assuming you've already defined the ConcatLayer previously):
```python
from mxnet import gluon
from mxnet.gluon import HybridBlock

class CustomNet(HybridBlock):
    def __init__(self, depth, **kwargs):
        super().__init__(**kwargs)
        with self.name_scope():
            # Declare here your initial ConcatLayer
            self.concat = ConcatLayer()
            # Stack here any more layers you want to have after (or before?) your ConcatLayer
            temp = gluon.nn.HybridSequential()  # a temporary variable
            # to which we add a bunch of layers, repeated depth times in total
            for _ in range(depth):
                temp.add(gluon.nn.Conv2D(....., use_bias=False))  # add your Conv2D arguments here
                temp.add(gluon.nn.BatchNorm())
            # Pass the temporary variable to a class member.
            self.main = temp

    def hybrid_forward(self, F, input1, input2):
        # do your thing with multiple layers
        out = self.concat(input1, input2)
        # then pass the output (a single tensor) to the HybridSequential stack of layers
        out = self.main(out)
        return out
```
I haven't tested the code, but you get the idea; it should be easy to make it work if there is a minor bug. Also look at these topics for more custom blocks:
a, b, c, d.
Please also note that HybridSequential and Sequential also behave as containers (you can access the layers you put inside them with list indexing, see this). This is very helpful if you want to write loops inside your hybrid_forward call.
If you want, post the network you'd like to have, and people here can help.
Cheers,
Foivos