Weight initialization. When you instantiate a [[neural network]], you must choose the initial weight values; a poor choice can cause activations or gradients to vanish or explode as they propagate through the layers.
Example. Xavier (Glorot) initialization uses a normal distribution with mean 0 and variance $2 / (n_{\text{in}} + n_{\text{out}})$, where $n_{\text{in}}$ and $n_{\text{out}}$ are the number of input and output units of the layer.
Example. He/Kaiming initialization is used for [[ReLU]] layers and uses a normal distribution with mean 0 and variance $2 / n_{\text{in}}$; the extra factor of 2 compensates for ReLU zeroing out roughly half of the activations.
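A minimal NumPy sketch of both schemes (the function names here are my own, not from any particular library):

```python
import numpy as np

def xavier_init(n_in, n_out, rng=None):
    """Xavier/Glorot: weights ~ N(0, 2 / (n_in + n_out))."""
    rng = rng if rng is not None else np.random.default_rng()
    std = np.sqrt(2.0 / (n_in + n_out))
    return rng.normal(0.0, std, size=(n_in, n_out))

def he_init(n_in, n_out, rng=None):
    """He/Kaiming: weights ~ N(0, 2 / n_in), suited to ReLU layers."""
    rng = rng if rng is not None else np.random.default_rng()
    std = np.sqrt(2.0 / n_in)
    return rng.normal(0.0, std, size=(n_in, n_out))
```

For a large layer the empirical variance of the sampled weights should closely match the target value, e.g. `xavier_init(1000, 1000)` gives weights with variance near 2/2000 = 0.001.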