netket.optimizer.Adam

netket.optimizer.Adam(learning_rate=0.001, b1=0.9, b2=0.999, eps=1e-08)

Adam optimizer. Adam combines momentum (an exponentially weighted average of the gradients) with per-parameter adaptive step sizes (an exponentially weighted average of the squared gradients).
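For reference, the update rule in its standard form (following Kingma & Ba; the exact bias-correction details of this implementation may differ) is, for gradient \(g_t\) at step \(t\):

\[
\begin{aligned}
m_t &= b_1\, m_{t-1} + (1 - b_1)\, g_t \\
v_t &= b_2\, v_{t-1} + (1 - b_2)\, g_t^2 \\
\hat{m}_t &= \frac{m_t}{1 - b_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1 - b_2^t} \\
\theta_t &= \theta_{t-1} - \eta\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
\end{aligned}
\]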

Parameters
  • learning_rate (float) – Learning rate \(\eta\).

  • b1 (float) – Decay rate for the exponentially weighted average of the gradients.

  • b2 (float) – Decay rate for the exponentially weighted average of the elementwise squared gradients.

  • eps (float) – Term \(\epsilon\) added to the denominator to improve numerical stability.
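To make the role of each parameter concrete, here is a minimal NumPy sketch of a single Adam step with the same defaults as above. The helper `adam_step` is hypothetical and purely illustrative; it is not NetKet's implementation, which operates on the variational parameters of the machine internally.

```python
import numpy as np

def adam_step(params, grad, m, v, t,
              learning_rate=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update in the standard Kingma & Ba form (illustrative sketch)."""
    m = b1 * m + (1 - b1) * grad        # first moment: EWA of gradients (decay b1)
    v = b2 * v + (1 - b2) * grad**2     # second moment: EWA of squared gradients (decay b2)
    m_hat = m / (1 - b1**t)             # bias correction for the zero-initialized averages
    v_hat = v / (1 - b2**t)
    params = params - learning_rate * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 1.0.
x = np.array([1.0])
m, v = np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    grad = 2.0 * x                      # analytic gradient of x^2
    x, m, v = adam_step(x, grad, m, v, t)
```

Note how `eps` only matters when `v_hat` is tiny, and how the bias correction keeps the first steps from being artificially small.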