netket.optimizer.RmsProp

netket.optimizer.RmsProp(learning_rate=0.001, beta=0.9, epscut=1e-07, centered=False)

RMSProp optimizer.

RMSProp is a well-known update algorithm proposed by Geoff Hinton in his Neural Networks course notes. It corrects the problem with AdaGrad by using an exponentially weighted moving average over past squared gradients instead of a cumulative sum. After initializing the vector \(\mathbf{s}\) to zero, \(s_k\) and the parameters \(p_k\) are updated as

\[\begin{split}s^\prime_k = \beta s_k + (1-\beta) G_k(\mathbf{p})^2 \\ p^\prime_k = p_k - \frac{\eta}{\sqrt{s^\prime_k}+\epsilon} G_k(\mathbf{p})\end{split}\]
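As a concrete illustration of the two equations above, the following is a minimal NumPy sketch of one RMSProp step for a flat parameter vector. The names rmsprop_step, p, s and grad are illustrative only and not part of NetKet's API.

import numpy as np

def rmsprop_step(p, s, grad, eta=0.001, beta=0.9, eps=1e-7):
    # decay the running average of squared gradients (first equation)
    s = beta * s + (1 - beta) * grad**2
    # scale each gradient component by the inverse root of its running average (second equation)
    p = p - eta / (np.sqrt(s) + eps) * grad
    return p, s

# toy usage: minimize L(p) = 0.5 * ||p||^2, whose gradient is simply p
p = np.array([1.0, -2.0])
s = np.zeros_like(p)   # the accumulator s is initialized to zero
for _ in range(100):
    p, s = rmsprop_step(p, s, grad=p)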

Constructs a new RmsProp optimizer.

Parameters:
  • learning_rate (float) – The learning rate \(\eta\).

  • beta (float) – Exponential decay rate \(\beta\) of the moving average.

  • epscut (float) – Small cutoff value \(\epsilon\) added to the denominator to avoid division by zero.

  • centered (bool) – Whether to center the moving average (see the note after this list).
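When centered=True, a common formulation of the centered variant (used, for instance, by Optax and TensorFlow) additionally tracks a moving average \(g_k\) of the raw gradients and subtracts its square in the denominator, so the step is normalized by an estimate of the gradient variance rather than the raw second moment; NetKet's exact implementation may differ in details:

\[\begin{split}g^\prime_k = \beta g_k + (1-\beta) G_k(\mathbf{p}) \\ p^\prime_k = p_k - \frac{\eta}{\sqrt{s^\prime_k - (g^\prime_k)^2}+\epsilon} G_k(\mathbf{p})\end{split}\]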

Examples

Construct a simple RmsProp optimizer with a custom learning rate:

>>> from netket.optimizer import RmsProp
>>> op = RmsProp(learning_rate=0.02)
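Beyond construction, the optimizer is typically passed to a variational driver. The sketch below assumes NetKet 3's variational Monte Carlo API (nk.hilbert.Spin, nk.operator.Ising, nk.vqs.MCState, nk.driver.VMC); the lattice size, model and hyperparameters are illustrative only.

import netket as nk

hi = nk.hilbert.Spin(s=1/2, N=10)                                  # 10 spin-1/2 sites
ha = nk.operator.Ising(hilbert=hi, graph=nk.graph.Chain(length=10), h=1.0)
vs = nk.vqs.MCState(nk.sampler.MetropolisLocal(hi),
                    nk.models.RBM(alpha=1), n_samples=1000)        # variational state

op = nk.optimizer.RmsProp(learning_rate=0.02)                      # the optimizer from this page
gs = nk.driver.VMC(ha, op, variational_state=vs)                   # ground-state search driver
gs.run(n_iter=300)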