How to use Soft-label for Cross-Entropy loss? - PyTorch Forums
https://discuss.pytorch.org/t/how-to-use-soft-label-for-cross-entropy-loss/72844 · 11.03.2020

TF's softmax_cross_entropy_with_logits supports soft labels (i.e. the targets do not have to be hard one-hot labels) for the cross-entropy loss:

```python
logits = [[4.0, 2.0, 1.0], [0.0, 5.0, 1.0]]
labels = [[1.0, 0.0, 0.0], [0.0, 0.8, 0.2]]
tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
```

Can we do the same thing in PyTorch? What kind of softmax should I use, nn.Softmax() or nn.LogSoftmax()?
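One way to reproduce this in PyTorch is to write the soft-label cross entropy out by hand, using log_softmax rather than nn.Softmax(), since the loss needs log-probabilities. This is a minimal sketch, not taken from the thread itself; the tensor values are the ones from the question:

```python
import torch
import torch.nn.functional as F

# Soft-label cross entropy written out by hand: apply log_softmax
# (not softmax) to the logits, then take the negative sum of
# target-weighted log-probabilities per example and average over the batch.
logits = torch.tensor([[4.0, 2.0, 1.0], [0.0, 5.0, 1.0]])
labels = torch.tensor([[1.0, 0.0, 0.0], [0.0, 0.8, 0.2]])

log_probs = F.log_softmax(logits, dim=1)
loss = -(labels * log_probs).sum(dim=1).mean()
print(loss)  # scalar loss averaged over the batch
```

Note that TF's softmax_cross_entropy_with_logits returns a per-example loss vector, whereas the .mean() here applies the usual PyTorch 'mean' reduction; drop the .mean() to match TF's output shape.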
CrossEntropyLoss — PyTorch 1.10.1 documentation
class torch.nn.CrossEntropyLoss(weight=None, size_average=None, ignore_index=-100, reduce=None, reduction='mean', label_smoothing=0.0) [source]

This criterion computes the cross entropy loss between input and target. It is useful when training a classification problem with C classes.
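Given the 1.10 documentation above, a hedged sketch: from PyTorch 1.10 onward, nn.CrossEntropyLoss itself accepts class-probability targets with the same shape as the logits, so the TF-style soft labels can be passed directly without the manual log_softmax version:

```python
import torch
import torch.nn as nn

# Sketch assuming PyTorch >= 1.10, where CrossEntropyLoss accepts
# class-probability targets (same shape as the logits) in addition
# to integer class indices.
criterion = nn.CrossEntropyLoss()

logits = torch.tensor([[4.0, 2.0, 1.0], [0.0, 5.0, 1.0]])
soft_targets = torch.tensor([[1.0, 0.0, 0.0], [0.0, 0.8, 0.2]])

loss = criterion(logits, soft_targets)
print(loss)
```

The label_smoothing argument in the signature covers the special case where the soft targets are a mixture of the hard label and a uniform distribution over the C classes; for arbitrary soft labels, pass the probability tensor as the target as shown above.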