netket.driver.VMC#

class netket.driver.VMC[source]#

Bases: AbstractVariationalDriver

Energy minimization using Variational Monte Carlo (VMC).

Inheritance
Inheritance diagram of netket.driver.VMC
__init__(hamiltonian, optimizer, *, variational_state, preconditioner=<netket.optimizer.preconditioner.IdentityPreconditioner object>)[source]#

Initializes the driver class.

Parameters:
  • hamiltonian (AbstractOperator) – The Hamiltonian of the system.

  • optimizer (Any) – Determines how optimization steps are performed given the bare energy gradient.

  • variational_state (VariationalState) – The variational state for which the expectation value of the Hamiltonian is minimised.

  • preconditioner (Callable[[VariationalState, Any, Any | None], Any]) – Determines which preconditioner to use for the loss gradient. This must be a tuple of (object, solver) as documented in the preconditioners section of the documentation. The standard preconditioner included with NetKet is Stochastic Reconfiguration. By default, no preconditioner is used and the bare gradient is passed to the optimizer (a construction sketch using the SR preconditioner follows this list).
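
As an illustration, a driver with the SR preconditioner might be constructed as follows. This is a minimal sketch: the transverse-field Ising chain, the RBM model and all hyper-parameters are assumptions chosen for the example, not defaults of this class.

import netket as nk

# Hypothetical problem: a 16-site transverse-field Ising chain.
graph = nk.graph.Chain(length=16, pbc=True)
hilbert = nk.hilbert.Spin(s=1 / 2, N=graph.n_nodes)
hamiltonian = nk.operator.Ising(hilbert, graph, h=1.0)

# Variational state: an RBM sampled with local Metropolis moves.
sampler = nk.sampler.MetropolisLocal(hilbert)
vstate = nk.vqs.MCState(sampler, nk.models.RBM(alpha=1), n_samples=1024)

# SGD optimizer plus the Stochastic Reconfiguration preconditioner;
# omitting `preconditioner` passes the bare gradient to the optimizer.
optimizer = nk.optimizer.Sgd(learning_rate=0.01)
driver = nk.driver.VMC(
    hamiltonian,
    optimizer,
    variational_state=vstate,
    preconditioner=nk.optimizer.SR(diag_shift=0.01),
)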

Attributes
energy#

Return MCMC statistics for the expectation value of the Hamiltonian (the energy) in the current state of the driver.

optimizer#

The optimizer used to update the parameters at every iteration.

preconditioner#

The preconditioner used to modify the gradient.

This is a function with the following signature

preconditioner(vstate: VariationalState,
               grad: PyTree,
               step: Optional[Scalar] = None)

Here the first argument is a variational state, the second is the PyTree of the gradient to precondition, and the last, optional argument is the step number, which can be used to change some parameters of the preconditioner along the optimisation.

Often, this is taken to be SR(). If it is set to None, then the identity is used.
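
Below is a minimal sketch of a custom callable respecting this signature; the uniform rescaling is purely illustrative and is not a NetKet built-in (a real preconditioner such as SR solves a linear system instead):

import jax

def rescaling_preconditioner(vstate, grad, step=None):
    # Illustrative only: rescale every leaf of the gradient PyTree.
    # A real preconditioner (e.g. SR) would solve a linear system here.
    return jax.tree_util.tree_map(lambda g: 0.1 * g, grad)

# Passed in place of the default identity at construction time:
# nk.driver.VMC(hamiltonian, optimizer, variational_state=vstate,
#               preconditioner=rescaling_preconditioner)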

state#

Returns the variational state that is optimized by this driver.

step_count#

Returns a monotonic integer labelling all the steps performed by this driver. This can be used, for example, to identify the line in a log file.

Methods
advance(steps=1)[source]#

Performs steps optimization steps.

Parameters:

steps (int) – Number of optimization steps to perform (default=1).
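
For example, reusing the driver constructed in the sketch above, a few steps can be performed outside of run:

driver.advance(steps=10)          # ten optimization steps, no logging
print(driver.step_count, driver.energy)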

estimate(observables)[source]#

Return MCMC statistics for the expectation value of observables in the current state of the driver.

Parameters:

observables – A pytree of operators for which statistics should be computed.

Returns:

A pytree of the same structure as the input, containing MCMC statistics for the corresponding operators as leaves.
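
As a sketch (the total-magnetisation observable below is an assumption made for the example), additional observables can be estimated without advancing the optimisation:

# Hypothetical observable: total magnetisation along z.
sz_tot = sum(nk.operator.spin.sigmaz(hilbert, i) for i in range(hilbert.size))

stats = driver.estimate({"Sz": sz_tot})
print(stats["Sz"].mean, stats["Sz"].error_of_mean)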

iter(n_steps, step=1)[source]#

Returns a generator which advances the VMC optimization, yielding after every step optimization steps.

Parameters:
  • n_steps (int) – The total number of steps to perform (this is equivalent to the length of the iterator)

  • step (int) – The number of internal steps the simulation is advanced between yielding from the iterator

Yields:

int – The current step.
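
A possible way to drive the optimisation manually with this generator, printing the energy every 10 internal steps:

for step in driver.iter(n_steps=100, step=10):
    # driver.energy holds the MCMC statistics of the current energy.
    print(f"step {step}: {driver.energy}")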

reset()[source]#

Resets the driver.

Subclasses should make sure to call super().reset() to ensure that the step count is set to 0.

run(n_iter, out=(), obs=None, step_size=1, show_progress=True, save_params_every=50, write_every=50, callback=<function AbstractVariationalDriver.<lambda>>, timeit=False)[source]#

Runs this variational driver, updating the weights of the network stored in this driver for n_iter steps and dumping values of the observables obs in the output logger.

It is possible to control more specifically what quantities are logged, when to stop the optimisation, or to execute arbitrary code at every step by specifying one or more callbacks, which are passed as a list of functions to the keyword argument callback.

Callbacks are functions that follow this signature:

def callback(step, log_data, driver) -> bool:
    ...
    return True  # or False to stop the optimisation

If a callback returns True, the optimisation continues, otherwise it is stopped. The log_data is a dictionary that can be modified in-place to change what is logged at every step. For example, this can be used to log additional quantities such as the acceptance rate of a sampler.
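
For instance, a callback could log an extra quantity and stop early once the error bar on the energy is small enough. The sampler-acceptance attribute used below is an assumption and depends on the sampler in use:

def my_callback(step, log_data, driver):
    # Log an additional quantity (assumed attribute; sampler-dependent).
    log_data["acceptance"] = float(driver.state.sampler_state.acceptance)
    # Keep optimising while the error bar on the energy is still large.
    return driver.energy.error_of_mean > 1e-3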

Loggers are specified as an iterable passed to the keyword argument out. If only a string is specified, this will by default create a nk.logging.JsonLog; see its documentation for the output format. The logger object is also returned at the end of this function, so that you can inspect the results without reading the JSON output.
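
For example, an explicit in-memory logger can be passed instead of a string prefix and inspected afterwards (a sketch; "Energy" is the driver's default loss name):

logger = nk.logging.RuntimeLog()
driver.run(n_iter=100, out=logger)
print(logger.data["Energy"])  # logged energy history, kept in memory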

When running among multiple MPI ranks/Jax devices, the logging logic is executed on all nodes, but only root-rank loggers should write to files or do expensive I/O operations.

Note

Before NetKet 3.15, loggers were automatically 'ignored' on non-root ranks. However, starting with NetKet 3.15 it is the responsibility of a logger to check whether it is executing on a non-root rank, and to 'do nothing' if that is the case.

The change was required to work correctly and efficiently with sharding. It will only affect users that were defining custom loggers themselves.

Parameters:
  • n_iter (int) – the total number of iterations to be performed during this run.

  • out (AbstractLog | Iterable[AbstractLog] | str | None) – A logger object, or an iterable of loggers, to be used to store simulation log and data. If this argument is a string, it will be used as output prefix for the standard JSON logger.

  • obs (dict[str, AbstractObservable] | None) – A dictionary of observables that should be computed at every logged step, stored under the corresponding keys.

  • step_size (int) – Every how many steps observables should be logged to disk (default=1)

  • callback (Callable[[int, dict, AbstractVariationalDriver], bool] | Iterable[Callable[[int, dict, AbstractVariationalDriver], bool]]) – A callable, or iterable of callables, executed at every step; the training is stopped if any of them returns False.

  • show_progress (bool) – If true displays a progress bar (default=True)

  • save_params_every (int) – Every how many steps the parameters of the network should be serialized to disk (ignored if logger is provided)

  • write_every (int) – Every how many steps the json data should be flushed to disk (ignored if logger is provided)

  • timeit (bool) – If True, provide timing information.
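
Putting the pieces from the earlier sketches together (driver, observable sz_tot and callback my_callback are the hypothetical objects defined above), a typical invocation might look like:

log = driver.run(
    n_iter=300,
    out="vmc_test",        # string prefix: creates a JSON logger
    obs={"Sz": sz_tot},
    callback=my_callback,
)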

update_parameters(dp)[source]#

Updates the parameters of the variational state using the optimizer in this driver.

Parameters:

dp – The PyTree containing the updates to the parameters.
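
As a sketch, dp must be a PyTree with the same structure as the variational parameters; the all-zero update below is purely illustrative:

import jax
import jax.numpy as jnp

# Build an update with the same PyTree structure as the parameters
# and push it through the driver's optimizer (zero update, essentially a no-op).
dp = jax.tree_util.tree_map(jnp.zeros_like, driver.state.parameters)
driver.update_parameters(dp)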