torch.nn.utils.weight_norm — PyTorch 1.10.1 documentation
Weight normalization is a reparameterization that decouples the magnitude of a weight tensor from its direction. This replaces the parameter specified by name (e.g. 'weight') with two parameters: one specifying the magnitude (e.g. 'weight_g') and one specifying the direction (e.g. 'weight_v'). Weight normalization is implemented via a hook that recomputes the weight tensor from the magnitude and direction before every forward() call.
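
As a minimal sketch of this reparameterization (the 20 → 40 Linear layer below is an arbitrary choice for illustration), applying weight_norm exposes the two replaced parameters:

    import torch
    import torch.nn as nn

    # Replace the 'weight' parameter of a Linear layer with 'weight_g' and 'weight_v'.
    m = torch.nn.utils.weight_norm(nn.Linear(20, 40), name='weight')

    print(m.weight_g.size())   # torch.Size([40, 1])  -- one magnitude per output row (default dim=0)
    print(m.weight_v.size())   # torch.Size([40, 20]) -- the direction tensor

    # 'weight' is recomputed from weight_g and weight_v by a forward pre-hook,
    # so every forward() call sees the reparameterized weight.
    x = torch.randn(8, 20)
    y = m(x)
    print(y.size())            # torch.Size([8, 40])
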
torch.nn.utils.spectral_norm — PyTorch 1.10.1 documentation
Spectral normalization stabilizes the training of discriminators (critics) in Generative Adversarial Networks (GANs) by rescaling the weight tensor with the spectral norm σ of the weight matrix, calculated using the power iteration method. If the dimension of the weight tensor is greater than 2, it is reshaped to 2D in the power iteration method to compute the spectral norm.
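
A similar sketch for spectral_norm, again with arbitrary layer sizes; the final check on the largest singular value is only approximate, since the power-iteration estimate of σ is refined across forward passes:

    import torch
    import torch.nn as nn

    # Rescale the 'weight' of a Linear layer by an estimate of its spectral norm,
    # as is commonly done for GAN discriminator (critic) layers.
    m = torch.nn.utils.spectral_norm(nn.Linear(20, 40))

    # Power iteration keeps running estimates of the leading singular vectors
    # in the buffers 'weight_u' and 'weight_v'; they are updated on every
    # forward() call while the module is in training mode.
    print(m.weight_u.size())   # torch.Size([40])

    x = torch.randn(8, 20)
    y = m(x)

    # After rescaling, the largest singular value of the effective weight
    # should be approximately 1 (exact only once power iteration has converged).
    print(torch.linalg.norm(m.weight, ord=2))
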