netket.nn.activation.reim_selu

netket.nn.activation.reim_selu()

selu applied separately to the real and imaginary parts of its input.

The docstring to the original function follows.

Scaled exponential linear unit activation.

Computes the element-wise function:

\[\begin{split}\mathrm{selu}(x) = \lambda \begin{cases} x, & x > 0\\ \alpha e^x - \alpha, & x \le 0 \end{cases}\end{split}\]

where \(\lambda = 1.0507009873554804934193349852946\) and \(\alpha = 1.6732632423543772848170429916717\).

For more information, see Self-Normalizing Neural Networks.

Args:

x : input array

Returns:

An array.

See also:

elu()

Return type:

Array
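As a minimal sketch (assuming NetKet and JAX are installed, and that reim_selu can be called as a plain function on a complex array, as the signature above suggests), the example below checks that reim_selu(z) agrees with selu applied independently to the real and imaginary parts:

```python
import jax.numpy as jnp
from jax.nn import selu
from netket.nn.activation import reim_selu

# A small complex input array.
z = jnp.array([0.5 - 1.0j, -2.0 + 0.3j, 1.5 + 2.0j])

# reim_selu applies selu to the real and imaginary parts independently.
out = reim_selu(z)

# Manual construction for comparison: selu on the real part plus
# i times selu on the imaginary part.
expected = selu(z.real) + 1j * selu(z.imag)

print(jnp.allclose(out, expected))  # expected: True
```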