softmax_cross_entropy_with_logits
dragon.vm.tensorflow.nn.softmax_cross_entropy_with_logits(labels, logits, name=None)

Compute the softmax cross-entropy loss.
Examples:
labels = tf.constant([[0., 1.], [1., 0.]], dtype=tf.float32)
logits = tf.constant([[0.5, 0.5], [0.3, 0.7]], dtype=tf.float32)
print(tf.nn.softmax_cross_entropy_with_logits(labels, logits))  # [0.6931472, 0.9130153]
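For reference, each output element is the negative log-likelihood of the softmax probabilities under the given labels. A minimal NumPy sketch (not part of this API) that reproduces the values above:

import numpy as np

# Per-row loss: loss[i] = -sum_j labels[i, j] * log(softmax(logits)[i, j])
labels = np.array([[0., 1.], [1., 0.]], dtype=np.float32)
logits = np.array([[0.5, 0.5], [0.3, 0.7]], dtype=np.float32)

# Softmax along the class axis, shifted by the row max for numerical stability.
shifted = logits - logits.max(axis=-1, keepdims=True)
probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)

loss = -(labels * np.log(probs)).sum(axis=-1)
print(loss)  # [0.6931472 0.9130153]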
Parameters:
- labels (dragon.Tensor) – The label tensor.
- logits (dragon.Tensor) – The logit tensor.
- name (str, optional) – The operation name.

Returns:
- dragon.Tensor – The output tensor.