Python Examples of torch.nn.CrossEntropyLoss - ProgramCreek.com
www.programcreek.com › torch

```python
def cross_entropy_loss_weighted(output, labels):
    temp = labels.data.cpu().numpy()
    freqCount = scipy.stats.itemfreq(temp)
    total = freqCount[0][1] + freqCount[1][1]
    perc_1 = freqCount[1][1] / total
    perc_0 = freqCount[0][1] / total
    weight_array = [perc_1, perc_0]
    if torch.cuda.is_available():
        weight_tensor = torch.FloatTensor(weight_array).cuda()
    else:
        weight_tensor = torch.FloatTensor(weight_array)
    ce_loss = nn.CrossEntropyLoss(weight=weight_tensor)
    images, channels, height, width = output.data.shape
    ...
```
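Note that `scipy.stats.itemfreq` has been removed from modern SciPy; the same frequency-based weighting can be done with `numpy.bincount`. A minimal sketch (the helper name and toy data are illustrative, not from the snippet above):

```python
import numpy as np
import torch
import torch.nn as nn

def class_weights_from_labels(labels):
    # Count occurrences of each class (assumes binary labels 0/1).
    counts = np.bincount(labels.reshape(-1), minlength=2)
    total = counts.sum()
    # As in the snippet above, each class is weighted by the *other*
    # class's frequency, so the rarer class gets the larger weight.
    return torch.tensor([counts[1] / total, counts[0] / total],
                        dtype=torch.float32)

labels = torch.tensor([0, 0, 0, 1])          # toy labels: class 0 dominates
weights = class_weights_from_labels(labels.numpy())
criterion = nn.CrossEntropyLoss(weight=weights)
logits = torch.randn(4, 2)                   # (batch, num_classes) raw scores
loss = criterion(logits, labels)
```

With three `0`s and one `1`, the weights come out as `[0.25, 0.75]`, so errors on the minority class contribute more to the loss.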
Example CrossEntropyLoss for 3D semantic segmentation in pytorch
https://stackoverflow.com/questions/47715696 · 08.12.2017

Here is the extract of my code:

```python
import torch
import torch.nn as nn
from torch.autograd import Variable

criterion = torch.nn.CrossEntropyLoss()
images = Variable(torch.randn(1, 12, 60, 36, 60)).cuda()
labels = Variable(torch.zeros(1, 12, 60, 36, 60).random_(2)).long().cuda()
loss = criterion(images.view(1, -1), labels.view(1, -1))
```
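The asker's code flattens both tensors, which does not match what `CrossEntropyLoss` expects. For volumetric (K-dimensional) data the input carries a class dimension `C` while the target holds class indices and has no class dimension. A sketch of the shape convention, assuming the question's 12 channels are meant as 12 classes:

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
# K-dimensional case: input is (N, C, d1, d2, d3) raw logits,
# target is (N, d1, d2, d3) with class indices in [0, C).
logits = torch.randn(1, 12, 60, 36, 60)          # N=1, C=12 classes
target = torch.randint(0, 12, (1, 60, 36, 60))   # one class index per voxel
loss = criterion(logits, target)                 # scalar 'mean' loss
```

No `.view()` reshaping is needed; the loss averages over every voxel directly.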
CrossEntropyLoss — PyTorch 1.11.0 documentation
pytorch.org › torch

class torch.nn.CrossEntropyLoss(weight=None, size_average=None, ignore_index=-100, reduce=None, reduction='mean', label_smoothing=0.0) [source]

This criterion computes the cross entropy loss between input and target. It is useful when training a classification problem with C classes. If provided, the optional argument weight should be a 1D ...
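A minimal sketch exercising the documented parameters together (the class count and tensor values are illustrative):

```python
import torch
import torch.nn as nn

# 1D weight tensor of length C assigns a rescaling weight to each class.
weights = torch.tensor([0.3, 0.7, 1.0])
criterion = nn.CrossEntropyLoss(weight=weights,
                                ignore_index=-100,   # targets equal to -100 are skipped
                                label_smoothing=0.1)
logits = torch.randn(5, 3)                 # (batch, C=3) raw, unnormalized scores
target = torch.tensor([0, 2, 1, -100, 2])  # the -100 entry contributes nothing
loss = criterion(logits, target)           # scalar, since reduction='mean'
```

Since the input is expected to contain unnormalized logits, no softmax layer should precede this criterion.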