In the __init__ method of a Block/HybridBlock, how can we make the number of layers an arbitrary, user-specified value?
If I do:
self.fcs = [nn.Dense(layer, activation='relu') for layer in self.layer_sizes]
And then in the forward/hybrid_forward:
def forward(self, x):
    for layer in self.fcs:
        x = layer(x)
    return x