SoftmaxWithLoss

class dragon.vm.caffe.core.layers.SoftmaxWithLoss(layer_param)

Compute the softmax cross entropy with sparse labels.

The CrossEntropy function is defined as:

\[\text{CrossEntropy}(p_{t}) = -\log(p_{t})\]

where \(p_{t}\) is the softmax probability assigned to the target class \(t\).
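
A minimal NumPy sketch of this loss for sparse (integer) labels, independent of the Dragon API, could look like the following (the function name is illustrative):

import numpy as np

def softmax_cross_entropy(logits, labels):
    """Return -log(p_t) per row, given integer class labels."""
    # Shift by the row-wise max for numerical stability.
    shifted = logits - logits.max(axis=1, keepdims=True)
    # Log-softmax: log(p) = z - log(sum(exp(z))).
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Gather the log-probability of each row's target class.
    return -log_probs[np.arange(len(labels)), labels]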

Examples:

layer {
  type: "SoftmaxWithLoss"
  bottom: "cls_score"
  bottom: "labels"
  top: "cls_loss"
  softmax_param {
    axis: 1
  }
  loss_param {
    ignore_label: -1
    normalization: VALID
  }
}
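
In the loss_param above, entries whose label equals ignore_label contribute no loss, and VALID normalization divides the summed loss by the count of non-ignored labels. A rough sketch of that reduction, assuming per-example losses as produced by the function above (reduce_valid is a hypothetical helper, not part of the Dragon API):

import numpy as np

def reduce_valid(losses, labels, ignore_label=-1):
    # Drop entries marked with the ignore label.
    valid = labels != ignore_label
    # VALID normalization: average over the valid entries only.
    return losses[valid].sum() / max(valid.sum(), 1)

# Example: the third sample is ignored, so the mean runs over 3 entries.
losses = np.array([0.3, 1.2, 0.7, 2.0])
labels = np.array([1, 0, -1, 2])
print(reduce_valid(losses, labels))  # (0.3 + 1.2 + 2.0) / 3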