sparse_softmax_cross_entropy_with_logits

dragon.vm.tensorflow.nn.sparse_softmax_cross_entropy_with_logits(
  labels,
  logits,
  name=None
)[source]

Compute the softmax cross entropy between logits and sparse labels, i.e. labels given as integer class indices rather than one-hot vectors.

Examples:

import dragon.vm.tensorflow as tf

labels = tf.constant([1, 0], dtype=tf.int64)
logits = tf.constant([[0.5, 0.5], [0.3, 0.7]], dtype=tf.float32)
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels, logits)
print(loss)  # [0.6931472, 0.9130153]

Parameters:
  • labels (dragon.Tensor) – The label tensor of integer class indices.
  • logits (dragon.Tensor) – The tensor of unnormalized log-probabilities.
  • name (str, optional) – The operation name.
Returns:

dragon.Tensor – The output tensor.
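As a cross-check, the loss values in the example can be reproduced directly from the definition: for each row i, loss_i = -log(softmax(logits_i)[labels_i]). The NumPy sketch below is an illustrative reimplementation of that formula, not the library's actual kernel:

```python
import numpy as np

def sparse_softmax_xent(labels, logits):
    # Softmax each row (shifted by the row max for numerical
    # stability), then take the negative log-probability of the
    # class index given by the sparse label.
    logits = np.asarray(logits, dtype=np.float64)
    shifted = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    return -np.log(probs[np.arange(len(labels)), labels])

print(sparse_softmax_xent([1, 0], [[0.5, 0.5], [0.3, 0.7]]))
# ≈ [0.6931472, 0.9130153]
```

The first row has equal logits, so both softmax probabilities are 0.5 and the loss is -log(0.5) ≈ 0.6931472; the second row assigns probability ≈ 0.4013 to class 0, giving -log(0.4013) ≈ 0.9130153.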