Parameter initialization scheme that keeps activation variance roughly constant across layers to stabilize training of deep networks, also called Glorot initialization. ← Xavier Initialization
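A minimal sketch of the uniform variant in NumPy (the helper name `xavier_uniform` is illustrative, not a standard API): weights are drawn from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)), which gives Var(W) = 2 / (fan_in + fan_out).

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Xavier/Glorot uniform initialization.

    Samples W ~ U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)),
    so that Var(W) = 2 / (fan_in + fan_out), balancing the variance of
    forward activations and backward gradients across layers.
    """
    rng = rng or np.random.default_rng(0)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Example: a 512 -> 256 dense layer's weight matrix.
W = xavier_uniform(512, 256)
```

Deep-learning frameworks expose the same scheme directly (e.g. `torch.nn.init.xavier_uniform_` in PyTorch, `tf.keras.initializers.GlorotUniform` in TensorFlow).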