torch.nn.functional.log_softmax — PyTorch 1.10.1 documentation
torch.nn.functional.log_softmax(input, dim=None, _stacklevel=3, dtype=None) [source]

Applies a softmax followed by a logarithm. While mathematically equivalent to log(softmax(x)), doing these two operations separately is slower and numerically unstable. This function uses an alternative formulation to compute the output and gradient correctly. See LogSoftmax for more details.
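A small sketch of the instability the docstring refers to: when one logit dominates by a wide margin, the other softmax probabilities underflow to zero in float32, and taking the log afterwards produces -inf. `log_softmax` avoids this by computing the result in one fused, log-sum-exp-based step. The specific tensor values here are illustrative, not from the documentation.

```python
import torch
import torch.nn.functional as F

x = torch.tensor([0.0, 200.0])

# Naive two-step version: softmax(x)[0] underflows to 0.0 in float32,
# so log() of it is -inf.
naive = torch.log(torch.softmax(x, dim=0))

# Fused version: stays finite, giving approximately [-200, 0].
stable = F.log_softmax(x, dim=0)

print(naive)   # contains -inf
print(stable)  # finite values
```

The gradient story is analogous: backpropagating through `log(softmax(x))` multiplies by the reciprocal of an underflowed probability, whereas the fused formulation keeps gradients well-behaved.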