python - How to choose cross-entropy loss in TensorFlow ...
stackoverflow.com › questions › 47034888
In TensorFlow there are at least a dozen different cross-entropy loss functions: tf.losses.softmax_cross_entropy; tf.losses.sparse_softmax_cross_entropy; tf.losses.sigmoid_cross_entropy; tf.contrib.losses.softmax_cross_entropy; tf.contrib.losses.sigmoid_cross_entropy; tf.nn.softmax_cross_entropy_with_logits; tf.nn.sigmoid_cross_entropy_with_logits... Which ones work only for binary classification, and which are suitable for multi-class problems?
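A minimal sketch of the distinction, assuming TensorFlow 2.x (the tf.losses.* and tf.contrib.losses.* names above are TF1-era wrappers around the same tf.nn primitives): the sigmoid variants treat each class as an independent yes/no decision, so they fit binary and multi-label problems, while the softmax variants assume exactly one true class per example.

    import tensorflow as tf

    logits = tf.constant([[2.0, -1.0, 0.5]])  # one example, three classes

    # Multi-class, single label (classes mutually exclusive): softmax.
    onehot = tf.constant([[1.0, 0.0, 0.0]])
    loss_softmax = tf.nn.softmax_cross_entropy_with_logits(labels=onehot, logits=logits)

    # Same loss with integer class ids instead of one-hot rows: sparse softmax.
    class_ids = tf.constant([0])
    loss_sparse = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=class_ids, logits=logits)

    # Binary or multi-label (each class an independent yes/no): sigmoid,
    # computed element-wise, giving one loss term per class.
    multilabel = tf.constant([[1.0, 0.0, 1.0]])
    loss_sigmoid = tf.nn.sigmoid_cross_entropy_with_logits(labels=multilabel, logits=logits)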
GitHub - AliAbbasi/Numerically-Stable-Cross-Entropy-Loss ...
github.com › AliAbbasi › Numerically-Stable-Cross
May 08, 2017 · Based on the TensorFlow documentation here, computing the loss without the softmax_cross_entropy_with_logits() function gives numerically unstable results. The problem arises when the logits from the network output are large numbers, in which case Python returns inf. Suppose our network has 3 outputs and they are large numbers such as [1000, 2000, 2500]; we now need to squash these logits with the softmax function to have ...
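A minimal sketch of that overflow, assuming TensorFlow 2.x and the [1000, 2000, 2500] logits quoted above: hand-rolling softmax with tf.exp overflows float32 to inf and the loss comes out as nan, while the fused op shifts by max(logits) internally (the log-sum-exp trick) and stays finite.

    import tensorflow as tf

    logits = tf.constant([[1000.0, 2000.0, 2500.0]])
    labels = tf.constant([[0.0, 0.0, 1.0]])

    # Naive route: exp(2500.0) overflows float32 to inf, so the softmax
    # probabilities become nan and the hand-rolled cross-entropy is nan too.
    probs = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis=1, keepdims=True)
    naive_loss = -tf.reduce_sum(labels * tf.math.log(probs), axis=1)
    print(naive_loss.numpy())   # [nan]

    # Fused op: subtracts max(logits) before exponentiating, so the same
    # inputs give a finite loss (~0.0 here, since class 2 dominates).
    stable_loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
    print(stable_loss.numpy())  # [0.]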