netket.optimizer.Adam

netket.optimizer.Adam(learning_rate=0.001, b1=0.9, b2=0.999, eps=1e-08)

Adam Optimizer. Adam is a stochastic gradient method with adaptive per-parameter step sizes, computed from exponentially decaying averages of past gradients (first moment) and of past squared gradients (second moment).
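For reference, the standard Adam update rule (Kingma & Ba, 2015) for a parameter \(\theta\) with gradient \(g_t\) at step \(t\) reads as follows; the NetKet implementation may differ in minor details:

\[
m_t = b_1 m_{t-1} + (1 - b_1)\, g_t, \qquad
v_t = b_2 v_{t-1} + (1 - b_2)\, g_t^2,
\]
\[
\hat{m}_t = \frac{m_t}{1 - b_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1 - b_2^t}, \qquad
\theta_t = \theta_{t-1} - \eta\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}.
\]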

Parameters
  • learning_rate (float) – Learning rate \(\eta\).

  • b1 (float) – Exponential decay rate \(b_1\) for the first-moment (mean) estimates of the gradient.

  • b2 (float) – Exponential decay rate \(b_2\) for the second-moment (uncentered variance) estimates of the gradient.

  • eps (float) – Small cutoff value \(\epsilon\) added to the denominator for numerical stability.
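Example (a minimal usage sketch; it assumes the common import alias `nk` for `netket`, and the commented driver line is illustrative only, since driver signatures vary between NetKet versions):

import netket as nk

# Adam with a custom learning rate; b1, b2 and eps keep their default values.
op = nk.optimizer.Adam(learning_rate=0.01)

# The optimizer object is then passed to a variational driver, e.g. VMC:
# vmc = nk.VMC(hamiltonian, op, variational_state=vstate)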