Cross-entropy with one-hot targets - PyTorch Forums
https://discuss.pytorch.org/t/cross-entropy-with-one-hot-targets/13580

Dawid_S (Dawid S) · February 12, 2018, 10:29pm · #1

I'd like to use a cross-entropy loss function that can take one-hot encoded values as the target.

```python
import torch

# Fake NN output (logits)
out = torch.FloatTensor([[0.05, 0.9, 0.05],
                         [0.05, 0.05, 0.9],
                         [0.9, 0.05, 0.05]])
out = torch.autograd.Variable(out)  # Variable wrapping; no longer needed in modern PyTorch
```
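One way to do this (a sketch, not from the thread itself; the one-hot `target_onehot` tensor below is a made-up example) is either to recover the class indices from the one-hot rows and use the built-in loss, or to compute the loss manually from `log_softmax`:

```python
import torch
import torch.nn.functional as F

# Fake NN output (logits), matching the values in the post
out = torch.tensor([[0.05, 0.9, 0.05],
                    [0.05, 0.05, 0.9],
                    [0.9, 0.05, 0.05]])

# Hypothetical one-hot targets (assumed for illustration)
target_onehot = torch.tensor([[0., 1., 0.],
                              [0., 0., 1.],
                              [1., 0., 0.]])

# Option 1: convert one-hot rows back to class indices,
# then use the standard built-in cross-entropy loss
loss_builtin = F.cross_entropy(out, target_onehot.argmax(dim=1))

# Option 2: compute cross-entropy manually from log-softmax,
# which works directly with one-hot (or soft) targets
loss_manual = -(target_onehot * F.log_softmax(out, dim=1)).sum(dim=1).mean()

print(torch.allclose(loss_builtin, loss_manual))  # → True
```

As a side note, in recent PyTorch releases (1.10 and later), `F.cross_entropy` and `nn.CrossEntropyLoss` also accept class-probability targets directly, so the one-hot tensor can be passed as the target without converting to indices.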