netket.optimizer.SRLazyCG

class netket.optimizer.SRLazyCG(diag_shift=0.01, tol=1e-05, atol=0.0, maxiter=None, M=None, centered=True)[source]

Bases: netket.optimizer.sr.sr_onthefly.SRLazy

Computes x = ⟨S⟩⁻¹⟨F⟩ using an iterative conjugate gradient method.

See the Jax documentation for more information.
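
Example

The following is a minimal usage sketch, not taken from the NetKet documentation: the surrounding setup (graph, Hamiltonian, model, sampler, variational state) and the driver keyword that receives the SR object are assumptions and may differ between NetKet 3 releases.

    import netket as nk

    # Hypothetical setup (module paths are assumptions for recent NetKet 3).
    g = nk.graph.Chain(length=10)
    hi = nk.hilbert.Spin(s=1 / 2, N=g.n_nodes)
    ha = nk.operator.Ising(hilbert=hi, graph=g, h=1.0)

    ma = nk.models.RBM(alpha=1)
    sa = nk.sampler.MetropolisLocal(hi)
    vs = nk.vqs.MCState(sa, ma, n_samples=1000)

    opt = nk.optimizer.Sgd(learning_rate=0.01)

    # Stochastic-reconfiguration preconditioner, solved iteratively with CG.
    sr = nk.optimizer.SRLazyCG(diag_shift=0.01, tol=1e-5, maxiter=200)

    # The keyword receiving the SR object (`sr=` or `preconditioner=`)
    # depends on the NetKet release.
    gs = nk.VMC(ha, opt, variational_state=vs, sr=sr)
    gs.run(n_iter=100)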

__init__(diag_shift=0.01, tol=1e-05, atol=0.0, maxiter=None, M=None, centered=True)

Initialize self. See help(type(self)) for accurate signature.

Parameters
  • diag_shift (float) –

  • tol (float) –

  • atol (float) –

  • maxiter (Optional[int]) –

  • M (Optional[Union[Callable, Any]]) –

  • centered (bool) –

Return type

None

Attributes
M: Optional[Union[Callable, Any]] = None

Preconditioner for A. The preconditioner should approximate the inverse of A. Effective preconditioning dramatically improves the rate of convergence, which implies that fewer iterations are needed to reach a given error tolerance.
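
For illustration, below is a hedged sketch of a simple diagonal (Jacobi) preconditioner in the form the CG solver expects: a callable approximating S⁻¹ applied to a vector. Estimating the diagonal of S is outside the scope of this class and is assumed here; the real parameters are pytrees rather than flat arrays, which this sketch ignores.

    import jax.numpy as jnp

    def make_jacobi_preconditioner(s_diag_estimate, diag_shift=0.01):
        # Hypothetical Jacobi preconditioner: approximate S^{-1} x by
        # element-wise division with an estimate of the shifted diagonal of S.
        inv_diag = 1.0 / (jnp.asarray(s_diag_estimate) + diag_shift)

        def M(x):
            return inv_diag * x

        return M

    # sr = nk.optimizer.SRLazyCG(M=make_jacobi_preconditioner(s_diag_estimate))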

atol: float = 0.0

Absolute tolerance for convergence.

centered: bool = True

Uses S=⟨ΔÔᶜΔÔ⟩ if True (default), S=⟨ÔᶜΔÔ⟩ otherwise. The two forms are mathematically equivalent, but might lead to different results due to numerical precision. The non-centered variant should be approximately 33% faster.
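
As an illustration (plain linear algebra, not NetKet internals): with O the matrix of per-sample log-derivatives and ΔO = O − ⟨O⟩, the two estimators coincide because ⟨ΔO⟩ = 0.

    import jax
    import jax.numpy as jnp

    def s_centered(O):
        # S = ⟨ΔO† ΔO⟩ with ΔO = O - ⟨O⟩
        dO = O - O.mean(axis=0, keepdims=True)
        return dO.conj().T @ dO / O.shape[0]

    def s_noncentered(O):
        # S = ⟨O† ΔO⟩; equal to the centered form since ⟨ΔO⟩ = 0.
        dO = O - O.mean(axis=0, keepdims=True)
        return O.conj().T @ dO / O.shape[0]

    O = jax.random.normal(jax.random.PRNGKey(0), (1000, 16))
    assert jnp.allclose(s_centered(O), s_noncentered(O), atol=1e-5)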

diag_shift: float = 0.01

Diagonal shift added to the S matrix.
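
Conceptually, the solver works with the regularized operator S + diag_shift·I instead of S. A hedged sketch of how such a shift can be applied to a matrix-free matvec (names are illustrative, not NetKet internals):

    def shifted_matvec(s_matvec, diag_shift):
        # Wrap a matrix-free product x -> S @ x so that it implements
        # x -> (S + diag_shift * I) @ x.
        def matvec(x):
            return s_matvec(x) + diag_shift * x

        return matvec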

maxiter: Optional[int] = None

Maximum number of iterations. Iteration will stop after maxiter steps even if the specified tolerance has not been achieved.

tol: float = 1e-05

Relative tolerance for convergence.

Methods
create(*args, **kwargs)
replace(**updates)

Returns a new object replacing the specified fields with new values.

solve_fun()[source]
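
The exact return value of solve_fun is an implementation detail of SRLazy/SRLazyCG. A plausible sketch, assuming it binds this object's options to the JAX conjugate-gradient solver:

    from functools import partial

    import jax

    def solve_fun_sketch(self):
        # Assumption, not the verified implementation: return
        # jax.scipy.sparse.linalg.cg with this object's options bound,
        # ready to be applied to the lazy S matvec and the gradient F.
        return partial(
            jax.scipy.sparse.linalg.cg,
            tol=self.tol,
            atol=self.atol,
            maxiter=self.maxiter,
            M=self.M,
        )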