class netket.models.FastARNNSequential[source]#

Bases: ARNNSequential

Implementation of a fast ARNN that sequentially calls its layers and activation function.

Subclasses must implement activation as a field or a method, and assign a list of fast ARNN layers to self._layers in setup.

The fast autoregressive sampling scheme is described in Ramachandran et al. To generate one sample with an autoregressive network, the network must be evaluated N times, where N is the number of input sites. However, each evaluation changes only one input site, and by the autoregressive property not all intermediate results depend on the changed site, so unchanged intermediate results can be cached to avoid repeated computation.

This optimization is particularly useful for convolutional neural networks (CNN) and recurrent neural networks (RNN), where each output site of a layer depends only on a small number of input sites; it is much less useful for densely connected layers, where every output depends on every input.
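The caching idea can be illustrated with a minimal NumPy sketch, independent of NetKet: a causal (autoregressive) 1D convolution with kernel size k, where changing one input site affects at most k cached outputs, so a single-site update costs O(k) instead of a full O(N·k) re-evaluation. All names and shapes here are illustrative, not NetKet's API.

```python
import numpy as np

def causal_conv(x, w):
    """Full causal 1D convolution: out[i] = sum_j w[j] * x[i - j],
    treating x as zero-padded on the left. Cost O(N * k)."""
    N, k = len(x), len(w)
    out = np.zeros(N)
    for i in range(N):
        for j in range(k):
            if i - j >= 0:
                out[i] += w[j] * x[i - j]
    return out

def update_site(x, w, out, i, new_value):
    """Incrementally refresh the cached outputs after changing x[i].

    By causality, only out[i] .. out[i + k - 1] depend on x[i], so at
    most k cache entries are recomputed instead of all N. Cost O(k).
    """
    k = len(w)
    delta = new_value - x[i]
    x = x.copy()
    x[i] = new_value
    out = out.copy()
    for j in range(k):
        if i + j < len(out):
            out[i + j] += w[j] * delta
    return x, out
```

A fast ARNN layer plays the same trick per layer during sampling: it keeps the cached activations and, when one site is fixed, touches only the entries inside the changed site's receptive field.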

hilbert: HomogeneousHilbert#

The Hilbert space. Only homogeneous unconstrained Hilbert spaces are supported.


__call__(inputs)[source]#

Computes the log wave-functions for input configurations.

Parameters:

inputs (Union[ndarray, Array]) – configurations with dimensions (batch, Hilbert.size).

Return type:

Union[ndarray, Array]

Returns:

The log psi with dimension (batch,).
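For a real, positive autoregressive ansatz, the log-amplitude decomposes as log ψ(x) = ½ Σᵢ log p(xᵢ | x₍<ᵢ₎), and because each conditional is individually normalized, |ψ|² sums to one over all configurations. The following toy check (a hypothetical conditional, not NetKet code) verifies this for four binary sites:

```python
import itertools
import numpy as np

def toy_conditional(x_prev, n_local=2):
    """Hypothetical conditional p(x_i | x_<i): a softmax whose logits
    depend only on the already-fixed sites x_prev."""
    logits = np.arange(n_local) * (1.0 + np.asarray(x_prev, dtype=float).sum())
    p = np.exp(logits - logits.max())
    return p / p.sum()

def log_psi(x):
    """log psi(x) = 1/2 * sum_i log p(x_i | x_<i) for a real, positive ARNN."""
    total = 0.0
    for i in range(len(x)):
        p = toy_conditional(x[:i])
        total += np.log(p[x[i]])
    return 0.5 * total

# |psi|^2 summed over all 2^4 configurations of 4 binary sites equals 1,
# because each conditional distribution is normalized (telescoping product).
norm = sum(np.exp(2.0 * log_psi(x)) for x in itertools.product([0, 1], repeat=4))
```

This built-in normalization is what allows direct (rejection-free) autoregressive sampling instead of Markov-chain methods.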

conditional(inputs, index)[source]#

Computes the conditional probabilities for one site to take each value. See AbstractARNN.conditional.

Return type:

Union[ndarray, Array]
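conditional can be read as one step of sequential sampling: sites are fixed left to right, querying the distribution over local values at each index. A minimal NumPy sketch of that loop, with a made-up conditional standing in for the network (the weights parameter and function names are illustrative, not NetKet's API):

```python
import numpy as np

def toy_conditional(x, index, weights):
    """Illustrative stand-in for conditional(): a normalized distribution
    over the local values at site `index`, depending only on the sites
    x[:index] that are already fixed (the autoregressive property)."""
    n_local = weights.shape[1]
    logits = weights[index] + np.arange(n_local) * x[:index].sum()
    p = np.exp(logits - logits.max())
    return p / p.sum()

def autoregressive_sample(weights, rng):
    """Generate one configuration with N conditional evaluations,
    fixing one site per step."""
    n_sites, n_local = weights.shape
    x = np.zeros(n_sites)
    for i in range(n_sites):
        p = toy_conditional(x, i, weights)
        x[i] = rng.choice(n_local, p=p)
    return x
```

The fast layers stored in self._layers make each of these N conditional evaluations cheap by reusing cached intermediate results from the previous step.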