RmsProp

RMSProp is a well-known update algorithm proposed by Geoff Hinton in his Neural Networks course notes. It corrects the problem with AdaGrad by using an exponentially weighted moving average over past squared gradients instead of a cumulative sum. After initializing the vector $\mathbf{s}$ to zero, the parameters are updated as

$$
s^\prime_k = \beta\, s_k + (1-\beta)\, G_k(\mathbf{p})^2
$$
$$
p^\prime_k = p_k - \frac{\eta}{\sqrt{s^\prime_k} + \epsilon}\, G_k(\mathbf{p})
$$

where $G_k(\mathbf{p})$ is the $k$-th component of the gradient, $\eta$ the learning rate, $\beta$ the exponential decay rate, and $\epsilon$ a small cutoff that prevents division by zero.
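
The update rule can be sketched in plain NumPy as follows. This is a minimal illustration of the equations above, not NetKet's internal implementation; the function name and the exact placement of the cutoff are assumptions.

import numpy as np

def rmsprop_step(params, grad, s, learning_rate=0.001, beta=0.9, epscut=1e-07):
    # s is the running average of squared gradients, initialized to zeros.
    # Exponentially weighted moving average over past squared gradients.
    s = beta * s + (1.0 - beta) * grad**2
    # Scale each gradient component by the root of its running average;
    # epscut guards against division by zero.
    params = params - learning_rate * grad / (np.sqrt(s) + epscut)
    return params, s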

Class Constructor

Constructs a new RmsProp optimizer.

Argument      | Type        | Description
------------- | ----------- | ------------------------
learning_rate | float=0.001 | The learning rate.
beta          | float=0.9   | Exponential decay rate.
epscut        | float=1e-07 | Small cutoff value.

Examples

RmsProp optimizer.

>>> from netket.optimizer import RmsProp
>>> op = RmsProp(learning_rate=0.02)

Class Methods

reset

Member function that resets the internal state of the optimizer.
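
Assuming reset takes no arguments (none are documented here), it is called directly on an existing optimizer:

>>> op.reset()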