netket.optimizer.Sgd

netket.optimizer.Sgd(learning_rate)

Stochastic Gradient Descent optimizer. Stochastic gradient descent (SGD) is one of the most popular optimizers in machine-learning applications. Given a stochastic estimate \(G(\mathbf{p})\) of the gradient of the cost function, it performs the update:

\[p^\prime_k = p_k -\eta G_k(\mathbf{p}),\]

where \(\eta\) is the so-called learning rate. NetKet also implements two extensions to plain SGD: \(L_2\) regularization, and a decay factor \(\gamma \leq 1\) for the learning rate, so that at iteration \(n\) the learning rate is \(\eta \gamma^n\).
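
For concreteness, the update rule (including the decay factor and the \(L_2\) term) can be written in a few lines of plain NumPy. This is only an illustrative sketch; the function name sgd_step and the keyword arguments gamma and l2 below are made up for this example and are not part of NetKet's API.

>>> import numpy as np
>>> def sgd_step(p, grad, eta, n=0, gamma=1.0, l2=0.0):
...     # decayed learning rate eta * gamma**n; the optional L2 term adds l2 * p to the gradient
...     return p - eta * gamma**n * (grad + l2 * p)
>>> new_p = sgd_step(np.array([0.5, -0.3]), np.array([0.1, 0.2]), eta=0.05)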

Parameters:

learning_rate (float) – The learning rate \(\eta\).

Examples

Simple SGD optimizer.

>>> from netket.optimizer import Sgd
>>> op = Sgd(learning_rate=0.05)
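
In a typical NetKet workflow the optimizer is then handed to a variational driver. The sketch below assumes the driver interface exposed as nk.driver.VMC together with nk.vqs.MCState, a transverse-field Ising Hamiltonian, and an RBM model; treat the names and arguments as an outline to adapt to your installed version, not a tested script.

>>> import netket as nk
>>> g = nk.graph.Hypercube(length=8, n_dim=1, pbc=True)
>>> hi = nk.hilbert.Spin(s=1/2, N=g.n_nodes)
>>> ham = nk.operator.Ising(hilbert=hi, graph=g, h=1.0)
>>> sampler = nk.sampler.MetropolisLocal(hi)
>>> vstate = nk.vqs.MCState(sampler, nk.models.RBM(alpha=1), n_samples=512)
>>> vmc = nk.driver.VMC(ham, Sgd(learning_rate=0.05), variational_state=vstate)
>>> vmc.run(n_iter=10)  # short run, for illustration only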