softmax
dragon.vm.tensorflow.nn.softmax(
    logits,
    axis=-1,
    name=None,
    **kwargs
)
Apply the softmax function.

The Softmax function is defined as:

\[\text{Softmax}(x_{i}) = \frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})}\]

The argument axis can be negative:

x = tf.ones((1, 4), dtype='float32')
print(tf.nn.softmax(x, 1))   # [[0.25 0.25 0.25 0.25]]
print(tf.nn.softmax(x, -1))  # Equivalent: axis -1 selects the last axis

Parameters:
- logits (dragon.Tensor) – The input tensor.
- axis (int, optional, default=-1) – The axis to compute the softmax along.
- name (str, optional) – The operation name.
 
Returns:
- dragon.Tensor – The output tensor.
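
For reference, the definition above can be reproduced outside the framework. The NumPy sketch below is an illustration of the formula, not part of the documented API; the max-subtraction step is a standard numerical-stability measure and does not change the result:

import numpy as np

def ref_softmax(x, axis=-1):
    # Subtract the per-axis max before exponentiating; this leaves
    # the final ratio unchanged but avoids overflow for large logits.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    exp = np.exp(shifted)
    # Normalize by the sum over the chosen axis, per the definition.
    return exp / np.sum(exp, axis=axis, keepdims=True)

x = np.ones((1, 4), dtype='float32')
print(ref_softmax(x, 1))    # [[0.25 0.25 0.25 0.25]], matching the example above
print(ref_softmax(x, -1))   # Equivalent: axis -1 is the last axis

The entries along the chosen axis sum to 1, so the output can be read as a probability distribution over that axis.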
 
