ReLU6
class dragon.vm.torch.nn.ReLU6(inplace=False)

Apply the clipped-6 rectified linear unit. [Krizhevsky, 2010].
The ReLU-6 function is defined as:
\[\text{ReLU-6}(x) = \begin{cases} \min(x, 6), & \text{if } x \geq 0 \\ 0, & \text{otherwise} \end{cases}\]

Examples:
m = torch.nn.ReLU6()
x = torch.tensor([-2, 0, 2, 4, 6, 8], 'float32')
y = m(x)  # [0., 0., 2., 4., 6., 6.]
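For reference, the same piecewise rule can be reproduced with plain element-wise operations. Below is a minimal NumPy sketch (the relu6 helper here is illustrative only, not part of the dragon API):

import numpy as np

def relu6(x):
    # Zero out negatives, then cap the positive side at 6,
    # matching the piecewise definition above.
    return np.minimum(np.maximum(x, 0.0), 6.0)

print(relu6(np.array([-2., 0., 2., 4., 6., 8.])))  # [0. 0. 2. 4. 6. 6.]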