How to freeze Gluon weights?

I'm trying to fine-tune a network with some of its weights frozen. I read about something like `net.layer.setattr('grad_req', 'null')`, but how do I apply that to the net's parameters?
In my case, I'd like to freeze all parameters whose names contain "resnet", roughly like the sketch below.
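
This is what I have in mind so far (just a sketch, assuming `net` is a `gluon.Block`/`HybridBlock` and that selecting parameters via `collect_params()` and its name filter is the right approach):

```python
import mxnet as mx
from mxnet import gluon

# net = ...  # my model, a gluon.Block / HybridBlock

# Option 1: collect_params() accepts a regex, so select every parameter
# whose name contains "resnet" and set grad_req on all of them at once.
net.collect_params('.*resnet.*').setattr('grad_req', 'null')

# Option 2: iterate over all parameters and freeze by substring match on the name.
for name, param in net.collect_params().items():
    if 'resnet' in name:
        param.grad_req = 'null'  # no gradient computed or updated for this parameter
```

Is this the intended way, or is there a cleaner API for it?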
thanks