netket.experimental.models.FastRNN#
- class netket.experimental.models.FastRNN[source]#
Bases: FastARNNSequential
Base class for recurrent neural networks with fast sampling.
The fast autoregressive sampling is described in Ramachandran et al. To generate one sample from an autoregressive network, we need to evaluate the network N times, where N is the number of input sites. However, only one input site changes at each step, and because of the autoregressive property not all intermediate results depend on the changed input, so we can cache the unchanged intermediate results and avoid repeated computation.
This optimization is particularly useful for RNNs, where each output site of a layer only depends on a small number of input sites. In the slow RNN, each AR sampling step requires running N RNN steps in every layer; in the fast RNN, we cache the relevant hidden memories of each layer from the previous AR sampling step and only run one RNN step to incorporate the changed input.
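As an illustration of the caching idea (not NetKet's internal implementation), the following hypothetical sketch contrasts slow and fast sampling for a toy RNN cell: the slow version re-runs the recurrence over all previous sites at every sampling step, while the fast version carries the cached hidden state over from the previous step and performs a single cell update.

```python
# Hypothetical toy example illustrating the caching idea; the cell and
# function names are not part of NetKet's API.
import jax
import jax.numpy as jnp


def cell(h, x, W, U):
    """One toy RNN step: update hidden state h from input x."""
    return jnp.tanh(W @ h + U * x)


def slow_hidden(xs, W, U):
    """Recompute the hidden state from scratch over all sites seen so far."""
    h = jnp.zeros(W.shape[0])
    for x in xs:          # N cell evaluations per sampling step
        h = cell(h, x, W, U)
    return h


def fast_hidden(h_cache, x_new, W, U):
    """Reuse the cached hidden state; only one cell evaluation per step."""
    return cell(h_cache, x_new, W, U)


key = jax.random.PRNGKey(0)
W = jax.random.normal(key, (4, 4)) / 4
U = jnp.ones(4)
xs = jnp.array([1.0, -1.0, 1.0])

h_slow = slow_hidden(xs, W, U)

h_fast = jnp.zeros(4)
for x in xs:
    h_fast = fast_hidden(h_fast, x, W, U)

# Same result, but the fast path needs only one cell call per new site.
assert jnp.allclose(h_slow, h_fast)
```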
See netket.experimental.models.RNN for an explanation of the arguments related to the autoregressive order.
- Attributes
- graph: AbstractGraph | None = None# graph of the physical system.
- inv_reorder_idx: HashableArray | None = None# indices to transform the inputs from ordered to unordered. See netket.models.AbstractARNN.reorder() for details.
- prev_neighbors: HashableArray | None = None# previous neighbors of each site.
- reorder_idx: HashableArray | None = None# indices to transform the inputs from unordered to ordered. See netket.models.AbstractARNN.reorder() for details.
- features: Iterable[int] | int# output feature density in each layer. If a single number is given, all layers except the last one will have the same number of features.
- hilbert: HomogeneousHilbert# the Hilbert space. Only homogeneous unconstrained Hilbert spaces are supported.
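Given the attributes above, a minimal usage sketch might look like the following. It assumes the concrete subclass netket.experimental.models.FastLSTMNet (a fast RNN built on this base class) accepts the hilbert, layers, features, and graph arguments shown, and that it is paired with netket.sampler.ARDirectSampler; check the NetKet documentation for the exact signatures.

```python
# Hedged sketch: the argument names and the FastLSTMNet/ARDirectSampler
# pairing are assumptions based on the attributes documented above.
import netket as nk
import netket.experimental as nkx

# 4x4 square lattice of spin-1/2 sites (homogeneous, unconstrained Hilbert space)
graph = nk.graph.Square(4)
hilbert = nk.hilbert.Spin(s=1 / 2, N=graph.n_nodes)

# A fast-sampling RNN wavefunction; the graph is used to determine the
# autoregressive ordering and neighbor structure (assumption; see the RNN docs).
model = nkx.models.FastLSTMNet(
    hilbert=hilbert,
    layers=2,
    features=10,
    graph=graph,
)

# Autoregressive direct sampler: draws exact samples without Markov chains.
sampler = nk.sampler.ARDirectSampler(hilbert)
vstate = nk.vqs.MCState(sampler, model, n_samples=256)
print(vstate.samples.shape)
```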
- Methods
- inverse_reorder(inputs, axis=0)[source]#
Transforms an array from ordered to unordered. See reorder.
- reorder(inputs, axis=0)[source]#
Transforms an array from unordered to ordered.
We call a 1D array ‘unordered’ if we need non-trivial indexing to access its elements in the autoregressive order, e.g., a[0], a[1], a[3], a[2] for the snake order. Otherwise, we call it ‘ordered’.
The inputs of conditionals_log_psi, conditionals, conditional, and __call__ are assumed to have unordered layout, and those inputs are always transformed through reorder before evaluating the network.
Subclasses may override reorder and inverse_reorder together to define this transformation.
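As a concrete illustration of the snake-order example above, the following hypothetical sketch shows how reorder and inverse_reorder act as a permutation and its inverse on a 1D array; the explicit index arrays are written out for illustration only and would normally come from the graph or the reorder_idx / inv_reorder_idx attributes.

```python
# Hypothetical illustration of the (inverse_)reorder semantics; the index
# array below corresponds to the snake order a[0], a[1], a[3], a[2] on a
# 2x2 lattice and is not taken from NetKet itself.
import jax.numpy as jnp

reorder_idx = jnp.array([0, 1, 3, 2])       # unordered -> ordered
inv_reorder_idx = jnp.argsort(reorder_idx)  # ordered -> unordered

a_unordered = jnp.array([10.0, 11.0, 12.0, 13.0])

# reorder: read the elements in the autoregressive (snake) order
a_ordered = a_unordered[reorder_idx]        # [10., 11., 13., 12.]

# inverse_reorder: undo the permutation and recover the original layout
a_back = a_ordered[inv_reorder_idx]
assert jnp.array_equal(a_back, a_unordered)
```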