ReLU

class dragon.vm.torch.nn.ReLU(inplace=False)

Apply the rectified linear unit. [Nair & Hinton, 2010].
The ReLU function is defined as:

\[\text{ReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ 0, & \text{otherwise} \end{cases}\]

Examples:
import dragon.vm.torch as torch

m = torch.nn.ReLU()
x = torch.randn(2, 3)
y = m(x)
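The piecewise definition above can be sketched in plain NumPy, independently of Dragon, to illustrate both the math and the `inplace` option (the `relu` helper below is an assumption for illustration, not part of the library API):

```python
import numpy as np

def relu(x, inplace=False):
    # ReLU(x) = x if x >= 0, else 0 -- matching the definition above.
    if inplace:
        # Overwrite the input buffer instead of allocating a new array,
        # mirroring what inplace=True is meant to save in the module.
        np.maximum(x, 0, out=x)
        return x
    return np.maximum(x, 0)

x = np.array([-1.5, 0.0, 2.0])
print(relu(x))            # negatives are clamped to zero
```

Setting `inplace=True` trades the original values of `x` for reduced memory traffic, which is why frameworks expose it as an option rather than a default.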