RMSProp is a well-known update algorithm proposed by Geoff Hinton in his Neural Networks course notes. It corrects the problem with AdaGrad by using an exponentially weighted moving average over past squared gradients instead of a cumulative sum. After initializing the vector $\mathbf{s}$ to zero, the parameters are updated as

$$
s'_k = \beta\, s_k + (1-\beta)\, G_k(\mathbf{p})^2
$$

$$
p'_k = p_k - \frac{\eta}{\sqrt{s'_k + \epsilon}}\, G_k(\mathbf{p})
$$

where $G_k(\mathbf{p})$ is the $k$-th component of the gradient, $\eta$ is the learning rate, $\beta$ the exponential decay rate, and $\epsilon$ a small cutoff that prevents division by zero.
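As a concrete illustration, here is a minimal NumPy sketch of one such update step; the standalone function `rmsprop_update` and its signature are illustrative assumptions, not NetKet's internal implementation:

```python
import numpy as np

def rmsprop_update(p, grad, s, learning_rate=0.001, beta=0.9, epscut=1e-7):
    """One RMSProp step (illustrative sketch, not NetKet's internals)."""
    # s'_k = beta * s_k + (1 - beta) * G_k(p)^2
    s = beta * s + (1 - beta) * grad**2
    # p'_k = p_k - eta / sqrt(s'_k + eps) * G_k(p)
    p = p - learning_rate * grad / np.sqrt(s + epscut)
    return p, s
```

Note that, unlike AdaGrad's cumulative sum, the moving average `s` can also shrink again when gradients become small, so the effective step size does not decay monotonically.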
Constructs a new RmsProp optimizer.
| Argument        | Type / Default  | Description              |
|-----------------|-----------------|--------------------------|
| `learning_rate` | `float = 0.001` | The learning rate.       |
| `beta`          | `float = 0.9`   | Exponential decay rate.  |
| `epscut`        | `float = 1e-07` | Small cutoff value.      |
>>> from netket.optimizer import RmsProp
>>> op = RmsProp(learning_rate=0.02)
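The remaining hyperparameters can be set the same way; a sketch spelling out all three keyword arguments from the table above at their default values:

```python
>>> from netket.optimizer import RmsProp
>>> # Same constructor as the example above; the explicit keywords simply
>>> # restate the defaults listed in the argument table.
>>> op = RmsProp(learning_rate=0.001, beta=0.9, epscut=1e-07)
```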
A member function is also provided to reset the internal state of the optimizer, clearing the accumulated moving average of squared gradients.
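This is useful when reusing one optimizer object across independent runs; a sketch assuming the member function is named `reset`, a name not spelled out above:

```python
>>> op = RmsProp(learning_rate=0.02)
>>> # ... first optimization run ...
>>> op.reset()  # assumed method name; clears the moving average s
>>> # ... second, independent run starts from a fresh state ...
```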