netket.models#
This sub-module contains several pre-built models to be used as neural quantum states.
Generic models#
This section lists some simple variational architectures.
- LogStateVector: Exact ansatz storing the logarithm of the full, exponentially large wavefunction coefficients.
- RBM: A Restricted Boltzmann Machine, equivalent to a 2-layer FFNN with a nonlinear activation function in between.
- RBMModPhase: A fully connected Restricted Boltzmann Machine (RBM) with real-valued parameters.
- RBMMultiVal: A fully connected Restricted Boltzmann Machine (see RBM) suitable for large local Hilbert spaces.
- RBMSymm: A symmetrized RBM using the DenseSymm layer.
- Jastrow: Jastrow wave function \(\Psi(s) = \exp(\sum_{i \neq j} s_i W_{ij} s_j)\), where W is a symmetric matrix.
- NDM: Encodes a positive-definite Neural Density Matrix using the ansatz from Torlai and Melko, PRL 120, 240503 (2018).
- GCNN: Implements a Group Convolutional Neural Network (G-CNN) that outputs a wavefunction invariant under a specified symmetry group.
- DeepSetMLP: Implements the DeepSets architecture, which is permutation invariant.
- MLP: A Multi-Layer Perceptron with hidden layers.
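Any of these models can be used as the ansatz of a variational Monte Carlo calculation. A minimal sketch pairing an RBM with a local-update Metropolis sampler (the system size, alpha and n_samples below are illustrative choices):

```python
import netket as nk

# Hilbert space of 8 spins-1/2 (illustrative size)
hi = nk.hilbert.Spin(s=1 / 2, N=8)

# RBM ansatz; alpha sets the hidden-unit density
model = nk.models.RBM(alpha=2)

# Single spin-flip Metropolis sampler
sampler = nk.sampler.MetropolisLocal(hi)

# Variational state estimating expectation values from 512 samples
vstate = nk.vqs.MCState(sampler, model, n_samples=512)
```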
Autoregressive models#
The following autoregressive models can be directly sampled using ARDirectSampler (see the example at the end of this section).
The abstract classes below can be subclassed to build a custom autoregressive model.
- ARNN: Base class for autoregressive neural networks.
- ARNNSequential: Implementation of an ARNN that sequentially applies its layers and, optionally, an activation function.
- FastARNNSequential: Implementation of a fast ARNN that sequentially applies its layers and activation function.
The following are default implementations of dense and convolutional autoregressive neural networks, built using masked dense and masked convolutional layers.
- ARNNDense: Autoregressive neural network with dense layers.
- ARNNConv1D: Autoregressive neural network with 1D convolution layers.
- ARNNConv2D: Autoregressive neural network with 2D convolution layers.
- FastARNNConv1D: Fast autoregressive neural network with 1D convolution layers.
- FastARNNConv2D: Fast autoregressive neural network with 2D convolution layers.
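Because these networks expose normalized conditional probabilities, they can be sampled exactly, without a Markov chain. A minimal sketch using ARNNDense (layers, features and n_samples are illustrative choices):

```python
import netket as nk

hi = nk.hilbert.Spin(s=1 / 2, N=8)

# Dense ARNN with 2 masked layers of 16 features each
model = nk.models.ARNNDense(hilbert=hi, layers=2, features=16)

# Direct (exact) autoregressive sampler: every sample is independent
sampler = nk.sampler.ARDirectSampler(hi)

vstate = nk.vqs.MCState(sampler, model, n_samples=512)
```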
Continuous degrees of freedom#
The following models are particularly suited for systems with continuous degrees of freedom (Particle).
- Gaussian: Multivariate Gaussian function with mean 0 and parametrised covariance matrix \(\Sigma_{ij}\).
- DeepSetRelDistance: Implements an equivariant version of the DeepSets architecture given by https://arxiv.org/abs/1703.06114.
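A minimal sketch using the Gaussian ansatz for particles in a periodic box, assuming a NetKet version where Particle accepts the L and pbc keyword arguments (box size, sigma and n_samples are illustrative choices):

```python
import netket as nk

# 5 particles in a one-dimensional periodic box of length 10
hi = nk.hilbert.Particle(N=5, L=(10.0,), pbc=True)

# Gaussian ansatz with a learnable covariance matrix
model = nk.models.Gaussian()

# Metropolis sampler with Gaussian moves of width sigma
sampler = nk.sampler.MetropolisGaussian(hi, sigma=0.1)

vstate = nk.vqs.MCState(sampler, model, n_samples=512)
```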
Tensor Networks#
The following models are tensor networks that can be used as variational ansatzes:
- MPSOpen: An open Matrix Product State (MPS) for a quantum state of discrete degrees of freedom.
- MPSPeriodic: A periodic Matrix Product State (MPS) for a quantum state of discrete degrees of freedom.
- MPDOOpen: A Matrix Product Density Operator (MPDO) with open boundary conditions for a quantum mixed state of discrete degrees of freedom.
- MPDOPeriodic: A Matrix Product Density Operator (MPDO) with periodic boundary conditions for a quantum mixed state of discrete degrees of freedom.
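A minimal sketch with the periodic MPS, assuming the classes live under netket.models.tensor_networks and accept a bond_dim argument (chain length and bond dimension are illustrative):

```python
import netket as nk

hi = nk.hilbert.Spin(s=1 / 2, N=10)

# Periodic MPS with bond dimension 4
model = nk.models.tensor_networks.MPSPeriodic(hilbert=hi, bond_dim=4)

sampler = nk.sampler.MetropolisLocal(hi)
vstate = nk.vqs.MCState(sampler, model, n_samples=512)
```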
Experimental models#
The following models are experimental, meaning that their interfaces may change at some point; we are actively seeking feedback and opinions on their usage and APIs.
Fermionic models#
The following models are for second-quantised fermionic Hilbert spaces (SpinOrbitalFermions).
- Slater2nd: A Slater determinant ansatz for second-quantised spinless or spin-full fermions.
- MultiSlater2nd: A Slater determinant ansatz for second-quantised spinless or spin-full fermions with a sum of determinants.
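A minimal sketch using the Slater determinant for spinless fermions on a chain, assuming Slater2nd takes the Hilbert space as its first argument (system sizes are illustrative):

```python
import netket as nk
import netket.experimental as nkx

# 4 spinless fermions in 8 orbitals
hi = nkx.hilbert.SpinOrbitalFermions(8, n_fermions=4)

model = nkx.models.Slater2nd(hi)

# Exchange moves conserve the particle number
g = nk.graph.Chain(8)
sampler = nk.sampler.MetropolisExchange(hi, graph=g)

vstate = nk.vqs.MCState(sampler, model, n_samples=512)
```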
Recurrent Neural Networks (RNN)#
The following are abstract models for recurrent neural networks (and their fast-sampling versions).
- RNN: Base class for recurrent neural networks.
- FastRNN: Base class for recurrent neural networks with fast sampling.
The following are concrete, ready-to-use recurrent neural networks:
- LSTMNet: Long short-term memory network.
- FastLSTMNet: Long short-term memory network with fast sampling.
- GRUNet1D: Gated recurrent unit network.
- FastGRUNet1D: Gated recurrent unit network with fast sampling.
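Like the autoregressive models above, these networks support direct sampling. A minimal sketch using the LSTM wavefunction, assuming LSTMNet accepts hilbert, layers and features arguments (all values are illustrative):

```python
import netket as nk
import netket.experimental as nkx

hi = nk.hilbert.Spin(s=1 / 2, N=8)

# LSTM wavefunction with 2 recurrent layers of 16 features
model = nkx.models.LSTMNet(hilbert=hi, layers=2, features=16)

# RNN wavefunctions are autoregressive, so they support direct sampling
sampler = nk.sampler.ARDirectSampler(hi)

vstate = nk.vqs.MCState(sampler, model, n_samples=512)
```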