netket.nn#
This sub-module extends flax.linen with layers and tools that are useful for applications in quantum physics. Read more about the design goals of this module in its README.
Linear Modules#
DenseSymm: Implements a projection onto a symmetry group.

DenseEquivariant: A group convolution operation that is equivariant over a symmetry group.
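For instance, a symmetric dense layer can be constructed from the symmetry group of a lattice. The following is a minimal sketch, assuming the v3-style API in which symmetries accepts a PermutationGroup and inputs carry an explicit feature dimension:

```python
import jax
import jax.numpy as jnp
import netket as nk

# The translation group of a periodic chain supplies the symmetries.
graph = nk.graph.Chain(length=8, pbc=True)
layer = nk.nn.DenseSymm(symmetries=graph.translation_group(), features=4)

# The input shape (batch, in_features, n_sites) is an assumption of this sketch.
x = jnp.ones((2, 1, graph.n_nodes))
params = layer.init(jax.random.PRNGKey(0), x)
y = layer.apply(params, x)  # one output channel per symmetry element
```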
The following modules can be used in autoregressive neural networks; see AbstractARNN.
MaskedDense1D: 1D linear transformation module with mask for autoregressive NN.

MaskedConv1D: 1D convolution module with mask for autoregressive NN.

MaskedConv2D: 2D convolution module with mask for autoregressive NN.

FastMaskedDense1D: 1D linear transformation module with mask for fast autoregressive NN.

FastMaskedConv1D: 1D convolution module with mask for fast autoregressive NN.

FastMaskedConv2D: 2D convolution module with mask for fast autoregressive NN.
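As an illustration, a masked dense layer makes output i depend only on inputs at earlier sites. A minimal sketch of using such a layer directly; the (batch, length, features) input shape and the exclusive flag (which also masks the diagonal, so output i sees only inputs j < i) are assumptions about the API:

```python
import jax
import jax.numpy as jnp
import netket as nk

# exclusive=True masks the diagonal too: output i depends only on j < i.
layer = nk.nn.MaskedDense1D(features=4, exclusive=True)

x = jnp.ones((2, 8, 1))  # assumed shape (batch, n_sites, in_features)
params = layer.init(jax.random.PRNGKey(0), x)
y = layer.apply(params, x)  # (batch, n_sites, features)
```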
Activation functions#
reim: Modifies a non-linearity to act separately on the real and imaginary parts.

reim_relu: relu applied separately to the real and imaginary parts of its input.

reim_selu: selu applied separately to the real and imaginary parts of its input.

log_cosh: Logarithm of the hyperbolic cosine, implemented in a more stable way.

log_sinh: Logarithm of the hyperbolic sine.

log_tanh: Logarithm of the hyperbolic tangent.
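These are ordinary functions on (complex) arrays, and reim lifts any real non-linearity to such a function. A small sketch (treating reim as a function wrapper, which is an assumption of this example):

```python
import jax.numpy as jnp
import flax.linen as nn
import netket.nn as nknn

z = jnp.array([1.0 + 2.0j, -0.5 - 0.3j])

nknn.reim_relu(z)      # relu(Re z) + 1j * relu(Im z)
nknn.log_cosh(z)       # numerically stable log(cosh(z))
nknn.reim(nn.gelu)(z)  # lift a real non-linearity to complex inputs
```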
Miscellaneous Functions#
binary_encoding: Encodes the array x into a set of binary-encoded variables described by the shape of a Hilbert space.
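A sketch of the call, assuming the signature binary_encoding(hilbert, x); the width of the encoding is set by the local dimension of the Hilbert space:

```python
import jax
import netket as nk

hi = nk.hilbert.Fock(n_max=3, N=4)             # 4 modes with occupations 0..3
x = hi.random_state(jax.random.PRNGKey(0), 2)  # batch of 2 basis states

# Each local occupation number is expanded into binary variables, with the
# number of variables determined by the local Hilbert-space dimension.
encoded = nk.nn.binary_encoding(hi, x)
```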
Utility functions#
Blocks#
MLP: A Multi-Layer Perceptron with hidden layers.

DeepSetMLP: Implements the DeepSets architecture, which is permutation invariant and is suitable for the encoding of bosonic systems.

SymmExpSum: A flax module symmetrizing the log-wavefunction \(\log\psi_\theta(\sigma)\) encoded into another flax module.
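As an example, SymmExpSum can wrap an existing ansatz so that \(\psi_{\text{symm}}(\sigma) \propto \sum_g \exp(\log\psi_\theta(g\sigma))\), summing over the elements \(g\) of a symmetry group. A sketch, assuming the fields module and symm_group:

```python
import netket as nk

graph = nk.graph.Chain(length=6, pbc=True)
hilbert = nk.hilbert.Spin(s=1/2, N=graph.n_nodes)

# Symmetrize an RBM over lattice translations.
ansatz = nk.nn.blocks.SymmExpSum(
    module=nk.models.RBM(alpha=2),
    symm_group=graph.translation_group(),
)
```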
Experimental#
Recurrent Neural Network cells#
The following are RNN layers (in Flax these would be called an RNN), which can be stacked within a netket.experimental.models.RNN.
RNNLayer: Recurrent neural network layer that maps inputs at N sites to outputs at N sites.

FastRNNLayer: Recurrent neural network layer with fast sampling.
The following are recurrent cells that can be used with netket.experimental.nn.rnn.RNNLayer.
RNNCell: Recurrent neural network cell that updates the hidden memory at each site.

LSTMCell: Long short-term memory cell.

GRU1DCell: Gated recurrent unit cell.
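In practice these cells are rarely assembled by hand; a stacked RNN wavefunction can be instantiated directly from the experimental models. A sketch, assuming the netket.experimental.models.LSTMNet signature with hilbert, layers, and features fields:

```python
import netket as nk
import netket.experimental as nkx

hilbert = nk.hilbert.Spin(s=1/2, N=8)

# Two stacked RNN layers built from LSTM cells, 16 hidden features each.
model = nkx.models.LSTMNet(hilbert=hilbert, layers=2, features=16)

# Autoregressive models support direct sampling without a Markov chain.
sampler = nk.sampler.ARDirectSampler(hilbert)
vstate = nk.vqs.MCState(sampler, model, n_samples=512)
```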
The following are utility functions to build up custom autoregressive orderings.
check_reorder_idx: Check that the reordering indices determining the autoregressive order of an RNN are correctly declared.

ensure_prev_neighbors: Deduce the missing arguments among reorder_idx, inv_reorder_idx, and prev_neighbors from the ones specified.

get_snake_inv_reorder_idx: A helper function to generate the inverse reorder indices in the snake order for a 2D graph.
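For example, on a 2D lattice the snake (boustrophedon) ordering is the usual choice: even rows are traversed left-to-right and odd rows right-to-left. A sketch, assuming this helper is importable from netket.experimental.nn.rnn:

```python
import netket as nk
from netket.experimental.nn.rnn import get_snake_inv_reorder_idx

# Inverse reorder indices for snake order on a 4x4 square lattice.
graph = nk.graph.Square(length=4)
inv_reorder_idx = get_snake_inv_reorder_idx(graph)
```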