Understanding Categorical Cross-Entropy Loss, Binary Cross ...
gombru.github.io/2018/05/23/cross_entropy_loss — May 23, 2018

A Caffe Python layer `forward` pass that applies the sigmoid to the raw scores and then accumulates the binary cross-entropy loss per class for each element of the batch. The snippet was cut off after the negative-class branch; the closing lines below follow the standard formula, adding log(1 - p) for negative classes and log(p) for positive ones, then negating and averaging over the batch:

    def forward(self, bottom, top):
        labels = bottom[1].data
        scores = bottom[0].data
        scores = 1 / (1 + np.exp(-scores))  # Compute sigmoid activations
        logprobs = np.zeros([bottom[0].num, 1])
        # Compute cross-entropy loss: sum the per-class loss
        # for each element of the batch
        for r in range(bottom[0].num):          # For each element in the batch
            for c in range(len(labels[r, :])):  # For each class
                if labels[r, c] == 0:  # Loss form for negative classes
                    logprobs[r] += np.log(1 - scores[r, c])
                else:                  # Loss form for positive classes
                    logprobs[r] += np.log(scores[r, c])
        data_loss = -np.sum(logprobs) / bottom[0].num
        top[0].data[...] = data_loss
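Outside the Caffe layer API, the same computation can be sketched as a self-contained NumPy function. This is a vectorized equivalent of the loop above, not code from the post; the function and argument names are illustrative:

```python
import numpy as np

def binary_cross_entropy(scores, labels):
    """Mean binary cross-entropy over a batch.

    scores: raw (pre-sigmoid) class scores, shape (batch, classes)
    labels: 0/1 targets, same shape as scores
    """
    probs = 1.0 / (1.0 + np.exp(-scores))  # sigmoid activations
    # -log(p) for positive classes, -log(1 - p) for negative ones,
    # summed over classes for each batch element
    per_example = -(labels * np.log(probs)
                    + (1 - labels) * np.log(1 - probs)).sum(axis=1)
    return per_example.sum() / scores.shape[0]  # average over the batch
```

Because the sigmoid of a zero score is 0.5, a single positive class with score 0 yields a loss of -log(0.5) = log 2 ≈ 0.693, which is a quick sanity check for any implementation of this loss.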