Applying cross entropy loss on one-hot targets - PyTorch Forums
discuss.pytorch.org › t › applying-cross-entropy · Jun 30, 2020 · Hi, I have labels in one-hot format with size [bsz, bsz2]. My input also is a matrix of shape [bsz, bsz2]. I want to use cross-entropy loss. I searched the PyTorch docs and I found that we can’t apply cross-entropy loss on one-hot targets except in the following way: out = torch.FloatTensor([[0.05, 0.9, 0.05], [0.05, 0.05, 0.9], [0.9, 0.05, 0.05]]) y1 = torch.FloatTensor([[0, 1, 0], [0, 0, 1], [1, 0 ...
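A minimal sketch (not from the thread) of two common ways to make this work, assuming logits and one-hot targets of shape [bsz, num_classes]; the tensor names are illustrative. Either recover class indices from the one-hot rows with argmax, or, on PyTorch 1.10 and later, pass the float one-hot rows directly to nn.CrossEntropyLoss as class probabilities.

import torch
import torch.nn as nn

# Illustrative logits and one-hot targets of shape [bsz, num_classes].
logits = torch.FloatTensor([[0.05, 0.9, 0.05],
                            [0.05, 0.05, 0.9],
                            [0.9, 0.05, 0.05]])
one_hot = torch.FloatTensor([[0., 1., 0.],
                             [0., 0., 1.],
                             [1., 0., 0.]])

criterion = nn.CrossEntropyLoss()

# Option 1: recover class indices from the one-hot rows.
target_idx = one_hot.argmax(dim=1)
loss_idx = criterion(logits, target_idx)

# Option 2 (PyTorch >= 1.10): use the one-hot rows as class probabilities.
loss_prob = criterion(logits, one_hot)

print(loss_idx.item(), loss_prob.item())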
CrossEntropyLoss and OneHot classes - PyTorch Forums
discuss.pytorch.org › t › crossentropyloss-and · Oct 20, 2021 · I’m having some trouble understanding CrossEntropyLoss as it relates to one_hot encoded classes. The docs use random numbers for the values, so to better understand I created a set of values and targets which I expect to show zero loss… I have 5 classes, and 5 one_hot encoded vectors (1 for each class), I then provide a target index corresponding to each class. I’m using reduction ...
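A likely source of the confusion, sketched below under the poster's setup of 5 classes: CrossEntropyLoss applies log_softmax to raw logits internally, so an identity matrix of "perfect" one-hot values does not give zero loss; the loss only approaches zero as the correct logit grows much larger than the others.

import torch
import torch.nn as nn

num_classes = 5
targets = torch.arange(num_classes)      # one target index per class
eye_logits = torch.eye(num_classes)      # one "perfect"-looking row per class

criterion = nn.CrossEntropyLoss(reduction='mean')
print(criterion(eye_logits, targets))    # about 0.90, not zero: softmax of [1, 0, 0, 0, 0]

# The loss goes to zero only when the correct logit dominates the others.
print(criterion(100.0 * torch.eye(num_classes), targets))   # ~ 0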
torch.nn.functional.one_hot — PyTorch 1.10.1 documentation
pytorch.org › torch · torch.nn.functional.one_hot(tensor, num_classes=-1) → LongTensor. Takes a LongTensor with index values of shape (*) and returns a tensor of shape (*, num_classes) that has zeros everywhere except where the index of the last dimension matches the corresponding value of the input tensor, in which case it will be 1.
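A short usage sketch of the documented signature: an index tensor goes in, and a LongTensor with one extra trailing dimension of size num_classes comes out; with the default num_classes=-1 the size is inferred from the largest index.

import torch
import torch.nn.functional as F

idx = torch.tensor([0, 2, 1])

# Explicit num_classes: shape (3,) -> (3, 3).
print(F.one_hot(idx, num_classes=3))
# tensor([[1, 0, 0],
#         [0, 0, 1],
#         [0, 1, 0]])

# Default num_classes=-1: inferred as max(idx) + 1.
print(F.one_hot(idx))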