netket.models#

This sub-module contains several pre-built models to be used as neural quantum states.

Generic models#

This section lists some simple variational architectures.

LogStateVector

_Exact_ ansatz storing the logarithm of the full, exponentially large vector of wavefunction coefficients.

RBM

A Restricted Boltzmann Machine, equivalent to a 2-layer FFNN with a nonlinear activation function in between.
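To make the FFNN correspondence concrete, here is a minimal pure-Python sketch (not the NetKet implementation; the function name and parameter layout are illustrative) of the log-amplitude an RBM with visible biases `a`, hidden biases `b`, and weights `W` assigns to a spin configuration `s`:

```python
import math

def rbm_log_psi(s, a, b, W):
    """Illustrative RBM log-amplitude: a linear visible term plus a
    log-cosh "hidden" term, i.e. a 2-layer FFNN with log-cosh activation."""
    # visible-bias term: sum_i a_i s_i
    visible = sum(ai * si for ai, si in zip(a, s))
    # hidden units: sum_j log(2 cosh(b_j + sum_i W_ij s_i))
    hidden = 0.0
    for j, bj in enumerate(b):
        theta = bj + sum(W[i][j] * si for i, si in enumerate(s))
        hidden += math.log(2 * math.cosh(theta))
    return visible + hidden

# Example: 2 visible spins, 2 hidden units, all parameters zero,
# so each hidden unit contributes log(2 cosh(0)) = log 2.
s = [1, -1]
a = [0.0, 0.0]
b = [0.0, 0.0]
W = [[0.0, 0.0], [0.0, 0.0]]
print(rbm_log_psi(s, a, b, W))  # 2 * log(2) ≈ 1.3863
```

In the actual model the parameters may be complex-valued, in which case the real part of the output encodes the amplitude and the imaginary part the phase.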

RBMModPhase

A fully connected Restricted Boltzmann Machine (RBM) with real-valued parameters.

RBMMultiVal

A fully connected Restricted Boltzmann Machine (see netket.models.RBM) suitable for large local hilbert spaces.

RBMSymm

A symmetrized RBM using the netket.nn.DenseSymm() layer internally.

Jastrow

Jastrow wave function \(\Psi(s) = \exp(\sum_{i \neq j} s_i W_{ij} s_j)\), where W is a symmetric matrix.
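The formula above is a simple pairwise sum, which the following pure-Python sketch evaluates directly (the function name `jastrow_log_psi` is hypothetical, chosen for illustration):

```python
def jastrow_log_psi(s, W):
    """Illustrative Jastrow log-amplitude: sum over ordered pairs i != j
    of s_i W_ij s_j, with W assumed symmetric."""
    n = len(s)
    return sum(s[i] * W[i][j] * s[j]
               for i in range(n) for j in range(n) if i != j)

s = [1, -1, 1]
W = [[0.0, 0.5, 0.1],
     [0.5, 0.0, 0.2],
     [0.1, 0.2, 0.0]]
print(jastrow_log_psi(s, W))  # ≈ -1.2
```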

MPSPeriodic

A periodic Matrix Product State (MPS) for a quantum state of discrete degrees of freedom, wrapped as Jax machine.

NDM

Encodes a Positive-Definite Neural Density Matrix using the ansatz from Torlai and Melko, PRL 120, 240503 (2018).

GCNN

Implements a Group Convolutional Neural Network (G-CNN) that outputs a wavefunction invariant under a specified symmetry group.

DeepSetMLP

Implements the DeepSets architecture, which is permutation invariant.

MLP

A Multi-Layer Perceptron with hidden layers.

Autoregressive models#

The following autoregressive models can be directly sampled using ARDirectSampler.

The following abstract classes can be inherited from in order to build an autoregressive model.

AbstractARNN

Base class for autoregressive neural networks.

ARNNSequential

Implementation of an ARNN that sequentially calls its layers, and optionally an activation function.

FastARNNSequential

Implementation of a fast ARNN that sequentially calls its layers and activation function.

The following are default implementations of dense and convolutional autoregressive neural networks, built using masked dense and masked convolutional layers.
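The masking idea behind these layers can be sketched as follows (a simplified, single-feature, pure-Python illustration, not the NetKet implementation): the weight matrix of a dense layer is restricted to its strictly lower-triangular part, so output `i` depends only on inputs with index less than `i`, which preserves the autoregressive ordering.

```python
def masked_dense(x, W, b):
    """Illustrative masked dense layer: output i uses only inputs j < i."""
    n = len(x)
    out = []
    for i in range(n):
        acc = b[i]
        # the mask zeroes out W[i][j] for j >= i
        for j in range(i):
            acc += W[i][j] * x[j]
        out.append(acc)
    return out

# out[0] never depends on x; out[i] depends only on x[0..i-1].
print(masked_dense([1.0, 2.0, 3.0],
                   [[1.0] * 3 for _ in range(3)],
                   [0.0] * 3))  # [0.0, 1.0, 3.0]
```

This dependency structure is what allows the conditional probabilities to be evaluated in a single pass and the model to be sampled directly.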

ARNNDense

Autoregressive neural network with dense layers.

ARNNConv1D

Autoregressive neural network with 1D convolution layers.

ARNNConv2D

Autoregressive neural network with 2D convolution layers.

FastARNNConv1D

Fast autoregressive neural network with 1D convolution layers.

FastARNNConv2D

Fast autoregressive neural network with 2D convolution layers.

Continuous degrees of freedom#

The following models are particularly suited for systems with continuous degrees of freedom (Particle).

Gaussian

Multivariate Gaussian function with mean 0 and parametrised covariance matrix \(\Sigma_{ij}\).
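As a rough illustration of a parametrised-covariance Gaussian (an assumption-laden sketch, not the NetKet implementation or its sign conventions): the log-amplitude is a quadratic form in the coordinates, with the matrix built as `A Aᵀ` so it stays positive semi-definite during optimisation.

```python
def gaussian_log_psi(x, A):
    """Illustrative Gaussian log-amplitude -1/2 x^T (A A^T) x,
    computed as -1/2 ||A^T x||^2."""
    n = len(x)
    # y = A^T x
    y = [sum(A[i][k] * x[i] for i in range(n)) for k in range(len(A[0]))]
    return -0.5 * sum(v * v for v in y)

x = [1.0, 2.0]
A = [[1.0, 0.0],
     [0.0, 1.0]]
print(gaussian_log_psi(x, A))  # -0.5 * (1 + 4) = -2.5
```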

DeepSetRelDistance

Implements an equivariant version of the DeepSets architecture (https://arxiv.org/abs/1703.06114).

Experimental models#

The following models are experimental, meaning that they may change in the future, and we are actively seeking feedback and opinions on their usage and APIs.

Fermionic models#

The following models are for second-quantised fermionic Hilbert spaces (netket.experimental.hilbert.SpinOrbitalFermions).

Slater2nd

A Slater determinant ansatz for second-quantised spinless or spinful fermions.

Recurrent Neural Networks (RNN)#

The following are abstract base classes for recurrent neural networks (and their fast-sampling versions).

RNN

Base class for recurrent neural networks.

FastRNN

Base class for recurrent neural networks with fast sampling.

The following are concrete, ready-to-use recurrent neural networks.

LSTMNet

Long short-term memory network.

FastLSTMNet

Long short-term memory network with fast sampling.

GRUNet1D

Gated recurrent unit network.

FastGRUNet1D

Gated recurrent unit network with fast sampling.