BCEWithLogitsLoss — PyTorch 1.10.1 documentation
pytorch.org › docs › stable

\ell(x, y) = L = \{l_1, \dots, l_N\}^\top, \quad l_n = - w_n \left[ y_n \cdot \log \sigma(x_n) + (1 - y_n) \cdot \log(1 - \sigma(x_n)) \right]
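The formula above can be checked numerically: a minimal sketch comparing `BCEWithLogitsLoss` (with `reduction="none"` and no per-element weights, i.e. w_n = 1) against the per-element expression. The specific logits and targets are illustrative values, not from the documentation.

```python
import torch

# Illustrative inputs: raw logits x_n and binary targets y_n.
logits = torch.tensor([0.5, -1.0, 2.0])
targets = torch.tensor([1.0, 0.0, 1.0])

# Library loss, kept element-wise so each l_n is visible.
loss = torch.nn.BCEWithLogitsLoss(reduction="none")(logits, targets)

# Manual formula: l_n = -[y_n * log(sigmoid(x_n)) + (1 - y_n) * log(1 - sigmoid(x_n))]
sig = torch.sigmoid(logits)
manual = -(targets * torch.log(sig) + (1 - targets) * torch.log(1 - sig))

assert torch.allclose(loss, manual)
```

Note that `BCEWithLogitsLoss` applies the sigmoid internally in a numerically stable way, which is why it takes raw logits rather than probabilities.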
Python Examples of torch.nn.functional.binary_cross_entropy ...
www.programcreek.com › python › example

import torch.nn.functional as F

def binary_cross_entropy(pred, label, weight=None, reduction='mean', avg_factor=None):
    if pred.dim() != label.dim():
        # _expand_binary_labels is a project-specific helper defined elsewhere
        label, weight = _expand_binary_labels(label, weight, pred.size(-1))
    # weighted element-wise losses
    if weight is not None:
        weight = weight.float()
    loss = F.binary_cross_entropy_with_logits(
        pred, label.float(), weight, reduction='none')
    # do the reduction for the weighted loss
    # (weight_reduce_loss is also a project-specific helper defined elsewhere)
    loss = weight_reduce_loss(loss, reduction=reduction, avg_factor=avg_factor)
    return loss
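The key step in the snippet above is computing element-wise weighted losses with `reduction='none'` before any reduction. A minimal sketch of just that step, without the project-specific helpers (the input tensors here are illustrative values): passing a `weight` tensor to `F.binary_cross_entropy_with_logits` is equivalent to multiplying the unweighted per-element losses by `weight`.

```python
import torch
import torch.nn.functional as F

# Illustrative predictions (logits), binary labels, and per-element weights.
pred = torch.tensor([0.2, -0.8, 1.5])
label = torch.tensor([1.0, 0.0, 1.0])
weight = torch.tensor([1.0, 0.5, 2.0])

# Weighted element-wise losses, as in the snippet, before reduction.
weighted = F.binary_cross_entropy_with_logits(pred, label, weight, reduction='none')

# Same result as scaling the unweighted per-element losses.
unweighted = F.binary_cross_entropy_with_logits(pred, label, reduction='none')
assert torch.allclose(weighted, weight * unweighted)
```

Deferring the reduction like this lets the caller apply a custom normalization (e.g. the `avg_factor` logic above) instead of the built-in `'mean'` or `'sum'`.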