netket.nn.activation.reim_relu

netket.nn.activation.reim_relu(x)

relu applied separately to the real and imaginary parts of its input.
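
Concretely, a complex input has its real and imaginary parts rectified independently and then recombined. A minimal sketch of this behaviour, using only jax.nn.relu (an illustration, not NetKet's actual implementation):

>>> import jax
>>> import jax.numpy as jnp
>>> z = jnp.array([1.0 - 2.0j, -0.5 + 0.25j])
>>> y = jax.nn.relu(z.real) + 1j * jax.nn.relu(z.imag)
>>> bool(jnp.allclose(y, jnp.array([1.0 + 0.0j, 0.0 + 0.25j])))
True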

The docstring of the original function follows.

Rectified linear unit activation function.

Computes the element-wise function:

\[\mathrm{relu}(x) = \max(x, 0)\]

except under differentiation, we take:

\[\nabla \mathrm{relu}(0) = 0\]

For more information see Numerical influence of ReLU’(0) on backpropagation.
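
This convention can be checked directly, since the derivative at zero is pinned to 0 rather than 1:

>>> import jax
>>> float(jax.grad(jax.nn.relu)(0.0))
0.0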

Args:
    x: input array

Returns:
    An array.

Example:
>>> import jax
>>> jax.nn.relu(jax.numpy.array([-2., -1., -0.5, 0, 0.5, 1., 2.]))
Array([0. , 0. , 0. , 0. , 0.5, 1. , 2. ], dtype=float32)

See also:

relu6()

Return type:

Array