BCEWithLogitsLoss — PyTorch 1.10.1 documentation
This loss combines a Sigmoid layer and the BCELoss in one single class. This version is more numerically stable than using a plain Sigmoid followed by a BCELoss as, by combining the operations into one layer, we take advantage of the log-sum-exp trick for numerical stability. The unreduced (i.e. with reduction set to 'none') loss can be described as:

$$\ell(x, y) = L = \{l_1, \dots, l_N\}^\top, \qquad l_n = -w_n \left[ y_n \cdot \log \sigma(x_n) + (1 - y_n) \cdot \log\left(1 - \sigma(x_n)\right) \right]$$

where $N$ is the batch size, $x_n$ the logit, $y_n$ the target, and $w_n$ an optional rescaling weight.
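As a quick check of this claim (a minimal sketch, not part of the linked docs page), the fused functional form and the explicit sigmoid-plus-BCE form agree for moderate logits:

import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4)
targets = torch.empty(4).random_(2)  # random 0/1 targets

# Fused, numerically stable version: takes raw logits.
stable = F.binary_cross_entropy_with_logits(logits, targets)

# Plain sigmoid followed by BCE: same value here, but
# log(sigmoid(x)) can underflow for large |x|.
naive = F.binary_cross_entropy(torch.sigmoid(logits), targets)

print(torch.allclose(stable, naive))  # True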
Sigmoid and BCELoss - PyTorch Forums
Mar 26, 2020 ·

import torch
import torch.nn as nn

m = nn.Sigmoid()
loss = nn.BCELoss()

# input is a batch of N = 3 raw logits
input = torch.randn(3, requires_grad=True)
print(input)

# each element in target must be 0 or 1
target = torch.empty(3).random_(2)

print(m(input))
print(target)

output = loss(m(input), target)
print(output)

OUTPUT:
tensor([-0.8840, 0.7303, -0.5842], requires_grad=True)
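For comparison, a sketch (not from the forum thread) of the same computation with nn.BCEWithLogitsLoss, which takes the raw logits directly and makes the explicit Sigmoid unnecessary:

import torch
import torch.nn as nn

input = torch.randn(3, requires_grad=True)
target = torch.empty(3).random_(2)

loss = nn.BCEWithLogitsLoss()
output = loss(input, target)  # sigmoid is applied internally
print(output)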
torchvision.ops.focal_loss — Torchvision 0.11.0 documentation
import torch
from torch.nn import functional as F

def sigmoid_focal_loss(
    inputs: torch.Tensor,
    targets: torch.Tensor,
    alpha: float = 0.25,
    gamma: float = 2,
    reduction: str = "none",
) -> torch.Tensor:
    """
    Returns:
        Loss tensor with the reduction option applied.
    """
    p = torch.sigmoid(inputs)
    ce_loss = F.binary_cross_entropy_with_logits(inputs, targets, reduction="none")
    # p_t is the model's probability for the ground-truth class
    p_t = p * targets + (1 - p) * (1 - targets)
    # down-weight well-classified examples by the factor (1 - p_t)^gamma
    loss = ce_loss * ((1 - p_t) ** gamma)
    if alpha >= 0:
        # alpha_t balances positive vs. negative examples
        alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
        loss = alpha_t * loss
    if reduction == "mean":
        loss = loss.mean()
    elif reduction == "sum":
        loss = loss.sum()
    return loss
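A minimal usage sketch of the function above, called through its public entry point torchvision.ops.sigmoid_focal_loss (tensor shapes here are illustrative):

import torch
from torchvision.ops import sigmoid_focal_loss

inputs = torch.randn(8)               # raw logits for 8 binary predictions
targets = torch.empty(8).random_(2)   # 0/1 ground-truth labels

# gamma down-weights easy examples; alpha balances positives vs. negatives
loss = sigmoid_focal_loss(inputs, targets, alpha=0.25, gamma=2, reduction="mean")
print(loss)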