CrossEntropyLoss — PyTorch 1.11.0 documentation
class torch.nn.CrossEntropyLoss(weight=None, size_average=None, ignore_index=-100, reduce=None, reduction='mean', label_smoothing=0.0)
This criterion computes the cross entropy loss between input and target. It is useful when training a classification problem with C classes. If provided, the optional argument weight should be a 1D Tensor assigning a weight to each of the classes; this is particularly useful with an unbalanced training set.
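A minimal usage sketch of the signature quoted above (tensor shapes and values are illustrative, not taken from the documentation): the criterion expects raw, unnormalized logits and class-index targets, and the optional weight and label_smoothing arguments are passed at construction.

>>> import torch
>>> import torch.nn as nn
>>> logits = torch.randn(4, 3, requires_grad=True)   # batch of 4 samples, C = 3 classes
>>> targets = torch.tensor([0, 2, 1, 2])             # class indices in [0, C)
>>> weight = torch.tensor([1.0, 2.0, 0.5])           # optional per-class weights
>>> criterion = nn.CrossEntropyLoss(weight=weight, label_smoothing=0.1)
>>> loss = criterion(logits, targets)
>>> loss.backward()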
MSELoss — PyTorch 1.11.0 documentation
By default, the losses are averaged over each loss element in the batch. Note that for some losses, there are multiple elements per sample. If the field size_average is set to False, the losses are instead summed for each minibatch. Ignored when reduce is False. Default: True.
reduce (bool, optional) – Deprecated (see reduction).
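To make the relationship between the deprecated flags and reduction concrete, a small sketch (shapes are illustrative): reduction='mean' averages over all loss elements, matching the size_average=True default described above; reduction='sum' matches the old size_average=False behaviour; reduction='none' returns one loss value per element.

>>> import torch
>>> import torch.nn as nn
>>> pred = torch.randn(8, 5)
>>> target = torch.randn(8, 5)
>>> mse_mean = nn.MSELoss(reduction='mean')(pred, target)   # averaged over all 40 elements
>>> mse_sum = nn.MSELoss(reduction='sum')(pred, target)     # summed over all 40 elements
>>> torch.allclose(mse_mean * pred.numel(), mse_sum)
True
>>> nn.MSELoss(reduction='none')(pred, target).shape        # per-element losses
torch.Size([8, 5])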
BCELoss — PyTorch 1.11.0 documentation
If the input x_n is either 0 or 1, one of the log terms would be mathematically undefined in the above loss equation. PyTorch chooses to set log(0) = −∞, since lim_{x→0} log(x) = −∞. However, an infinite term in the loss equation is not desirable for several reasons. For one, if either y_n = 0 or (1 − y_n) = 0, then we would be multiplying 0 with infinity.
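A short sketch of the failure mode described above (input values are chosen to hit the endpoints; the −100 clamp comes from the full BCELoss page, not from this excerpt): a hand-rolled binary cross entropy gives inf where a log term is infinite and nan where 0 is multiplied with infinity, while nn.BCELoss stays finite because it clamps each log term.

>>> import torch
>>> import torch.nn as nn
>>> probs = torch.tensor([0.0, 0.5, 0.0])    # BCELoss expects probabilities already in [0, 1]
>>> targets = torch.tensor([1.0, 0.0, 0.0])
>>> # naive BCE: log(0) = -inf in the first element, 0 * -inf = nan in the last
>>> -(targets * torch.log(probs) + (1 - targets) * torch.log(1 - probs))
tensor([   inf, 0.6931,    nan])
>>> # nn.BCELoss clamps the log terms, so every loss value stays finite
>>> nn.BCELoss(reduction='none')(probs, targets)
tensor([100.0000,   0.6931,   0.0000])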