kl_div
- dragon.vm.torch.nn.functional.kl_div(
    input,
    target,
    size_average=None,
    reduce=None,
    reduction='mean',
    log_target=False
  )
- Compute the Kullback-Leibler divergence.

- Parameters:
- input (dragon.vm.torch.Tensor) – The input tensor.
- target (dragon.vm.torch.Tensor) – The target tensor.
- size_average (bool, optional) – Whether to average the loss.
- reduce (bool, optional) – Whether to reduce the loss.
- reduction ({'none', 'batchmean', 'mean', 'sum'}, optional) – The reduction method to apply.
- log_target (bool, optional, default=False) – Whether target is passed in log space.
 
 - Returns:
- dragon.vm.torch.Tensor – The output tensor. 
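
The sketch below shows a minimal call following the signature above. It assumes a PyTorch-style tensor constructor (dragon.vm.torch.tensor) and uses illustrative values; only the kl_div call itself is taken from this page.

```python
from dragon.vm import torch
from dragon.vm.torch.nn import functional as F

# Log-space predictions and probability-space targets over 5 classes
# (values chosen for illustration only).
input = torch.tensor([[-1.2, -0.9, -2.3, -1.6, -1.8]])  # log-probabilities
target = torch.tensor([[0.1, 0.4, 0.1, 0.2, 0.2]])       # probabilities (sum to 1)

# 'batchmean' sums the pointwise divergence and divides by the batch size.
loss = F.kl_div(input, target, reduction='batchmean', log_target=False)
print(loss)
```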
