Simple Stochastic Gradient Descent Optimizer. Stochastic Gradient Descent is one of the most popular optimizers in machine learning applications. Given a stochastic estimate of the gradient of the cost function ($G(\mathbf{p})$), it performs the update:

$$p^\prime_k = p_k - \eta G_k(\mathbf{p}),$$

where $\eta$ is the so-called learning rate. NetKet also implements two extensions to the simple SGD: $L_2$ regularization, and an optional decay factor $\gamma \leq 1$ for the learning rate, such that at iteration $n$ the learning rate is $\eta \gamma^n$.
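To make the update rule concrete, here is a minimal NumPy sketch of an optimizer with the same three knobs. It only illustrates the formulas above and is not NetKet's actual implementation; in particular, it assumes the $L_2$ penalty enters the update as an extra gradient term $2\,l_2\,p_k$.

```python
import numpy as np

class SimpleSgd:
    """Illustrative sketch of SGD with L2 regularization and
    learning-rate decay (not NetKet's implementation)."""

    def __init__(self, learning_rate, l2_reg=0.0, decay_factor=1.0):
        self.eta = learning_rate          # base learning rate (eta)
        self.l2_reg = l2_reg              # L2 regularization strength
        self.decay_factor = decay_factor  # decay factor (gamma <= 1)
        self.n = 0                        # iteration counter

    def update(self, grad, pars):
        # Learning rate at iteration n: eta * gamma**n
        eta_n = self.eta * self.decay_factor**self.n
        # Assumption: the L2 penalty adds 2 * l2_reg * p_k to the gradient
        new_pars = np.asarray(pars) - eta_n * (
            np.asarray(grad) + 2.0 * self.l2_reg * np.asarray(pars)
        )
        self.n += 1
        return new_pars

    def reset(self):
        # Forget the iteration count, restoring the initial learning rate
        self.n = 0
```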
Constructs a new `Sgd` optimizer.
|Argument|Type|Description|
|---|---|---|
|`learning_rate`|float|The learning rate $\eta$.|
|`l2_reg`|float=0|The amount of $L_2$ regularization.|
|`decay_factor`|float=1.0|The decay factor $\gamma$.|
Simple SGD optimizer.
```python
>>> from netket.optimizer import Sgd
>>> op = Sgd(learning_rate=0.05)
```
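For illustration, the optional arguments from the table above can be set in the same call; the specific values here are arbitrary:

```python
>>> from netket.optimizer import Sgd
>>> op = Sgd(learning_rate=0.05, l2_reg=0.001, decay_factor=0.99)
```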
The `reset()` member function resets the internal state of the optimizer.
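A minimal continuation of the example above, assuming `reset` takes no arguments:

```python
>>> op.reset()  # clears the optimizer's internal state
```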