Arbitrary size layer specification

In the __init__ function of a Block/HybridBlock, how can we make the number of layers an arbitrary, user-specified parameter?

If I do:

self.fcs = [nn.Dense(size, activation='relu') for size in self.layer_sizes]

And then in the forward/hybrid_forward:

def forward(self, x):
    # apply each Dense layer in turn
    for layer in self.fcs:
        x = layer(x)
    return x

A plain Python list of Blocks won’t be registered automatically, so Gluon won’t see their parameters (collect_params() and initialize() will miss them).
You need to call self.register_child() on each of them.
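
For example, here is a minimal sketch of the register_child approach. The MLP class name and the layer_sizes constructor argument are just illustrative, not part of the original question:

import mxnet as mx
from mxnet.gluon import nn

class MLP(nn.Block):
    def __init__(self, layer_sizes, **kwargs):
        super(MLP, self).__init__(**kwargs)
        # layer_sizes is an illustrative argument, e.g. [128, 64, 10]
        self.fcs = [nn.Dense(size, activation='relu') for size in layer_sizes]
        # a plain Python list is invisible to Gluon, so register each layer
        for fc in self.fcs:
            self.register_child(fc)

    def forward(self, x):
        for fc in self.fcs:
            x = fc(x)
        return x

net = MLP([128, 64, 10])
net.initialize()
out = net(mx.nd.random.uniform(shape=(4, 32)))  # parameters are now found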

Or you can use an nn.Sequential (or nn.HybridSequential) container, which registers every layer added to it automatically.
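
A sketch of the Sequential approach, again with an illustrative layer_sizes list; using HybridSequential also lets you hybridize() the whole network:

import mxnet as mx
from mxnet.gluon import nn

def build_mlp(layer_sizes):
    # each call to add() registers the child automatically
    net = nn.HybridSequential()
    with net.name_scope():
        for size in layer_sizes:
            net.add(nn.Dense(size, activation='relu'))
    return net

net = build_mlp([128, 64, 10])
net.initialize()
net.hybridize()  # optional: compile to a static graph
out = net(mx.nd.random.uniform(shape=(4, 32)))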