Uniform initialize


I apply initialize with init=mx.initializer.Uniform() and different scale parameters, which samples the weights like this:

random.uniform(-self.scale, self.scale, out=arr)

But I always get the same conv weights after initialize.
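For context, the line above appears to come from the initializer's internals: it fills the parameter array with i.i.d. draws from U(-scale, scale). A minimal stdlib-only sketch of that behaviour (the `uniform_init` helper is hypothetical, and 0.07 is assumed here as the default scale):

```python
import random

def uniform_init(shape, scale=0.07):
    """Sketch of what Uniform(scale) does: fill a flat array of
    size prod(shape) with i.i.d. draws from U(-scale, scale)."""
    n = 1
    for dim in shape:
        n *= dim
    return [random.uniform(-scale, scale) for _ in range(n)]

# Two successive calls draw fresh values; getting identical weights
# on every call usually points to a fixed RNG seed or to the
# parameters not actually being re-initialized.
a = uniform_init((3, 3))
b = uniform_init((3, 3))
```

If the same seed is set before each call, the two draws will match exactly, which would reproduce the symptom described above.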


I’m not sure I understand your question. Do you mean that you’re trying to compare mx.initializer.Uniform() with Python’s random.uniform()? Or that the initial values for your conv weights produced by mx.initializer.Uniform() are not actually random?

Also, is there any particular reason why you’re using uniform initialization for a convolutional network?