LogSoftmax — PyTorch 1.10.1 documentation
class torch.nn.LogSoftmax(dim=None) [source]

Applies the $\log(\text{Softmax}(x))$ function to an n-dimensional input Tensor. The LogSoftmax formulation can be simplified as:

$$\text{LogSoftmax}(x_i) = \log\left(\frac{\exp(x_i)}{\sum_j \exp(x_j)}\right)$$
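A minimal usage sketch, assuming a batched 2-D input where dim=1 indexes the classes; the shapes and values are illustrative only:

    import torch
    import torch.nn as nn

    # LogSoftmax normalizes along the given dimension, so each row of
    # the output is a vector of log-probabilities.
    m = nn.LogSoftmax(dim=1)
    x = torch.randn(2, 3)
    out = m(x)                   # log-probabilities, shape (2, 3)
    print(out.exp().sum(dim=1))  # ~tensor([1., 1.]) after exponentiating
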
torch.nn.functional.log_softmax — PyTorch 1.10.1 documentation
torch.nn.functional.log_softmax(input, dim=None, _stacklevel=3, dtype=None) [source]

Applies a softmax followed by a logarithm. While mathematically equivalent to log(softmax(x)), doing these two operations separately is slower and numerically unstable. This function uses an alternative formulation to compute the output and gradient correctly. See LogSoftmax for more details.
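A small sketch of the instability the docs mention, with a hand-picked input chosen as an assumption for illustration: when one logit is far below the others, softmax underflows to exactly 0 and the separate log turns that into -inf, while the fused log_softmax (computed via a log-sum-exp style formulation) stays finite:

    import torch
    import torch.nn.functional as F

    x = torch.tensor([[0.0, -1000.0]])

    # Two separate ops: exp(-1000) underflows to 0, log(0) = -inf.
    separate = torch.log(F.softmax(x, dim=1))  # tensor([[0., -inf]])

    # Fused op: stays finite and keeps a usable gradient.
    fused = F.log_softmax(x, dim=1)            # tensor([[0., -1000.]])

    print(separate)
    print(fused)
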