relu6

dragon.vm.tensorflow.nn.relu6(
  features,
  name=None,
  **kwargs
)

Apply the clipped-6 rectified linear unit [Krizhevsky, 2010].

The ReLU-6 function is defined as:

\[\text{ReLU-6}(x) = \begin{cases} \min(x, 6), & \text{if } x \geq 0 \\ 0, & \text{otherwise} \end{cases}\]

Parameters:
  • features (dragon.Tensor) – The input tensor.
  • name (str, optional) – The operation name.
Returns:
  • dragon.Tensor – The output tensor.
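
Example (a minimal usage sketch; it assumes the module is imported under the conventional alias `tf` and that `tf.constant` is available in Dragon's TensorFlow-compatible namespace):

import dragon.vm.tensorflow as tf

# Inputs below 0 are clipped to 0; inputs above 6 are clipped to 6.
x = tf.constant([-1.0, 2.0, 8.0], dtype='float32')
y = tf.nn.relu6(x)  # expected values: [0.0, 2.0, 6.0]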