netket.models.NDM

class netket.models.NDM(dtype=<class 'numpy.float64'>, activation=<function log_cosh>, alpha=1, beta=1, use_hidden_bias=True, use_ancilla_bias=True, use_visible_bias=True, precision=None, kernel_init=<function normal.<locals>.init>, bias_init=<function normal.<locals>.init>, visible_bias_init=<function normal.<locals>.init>, parent=<flax.linen.module._Sentinel object>, name=None)[source]

Bases: flax.linen.module.Module

Encodes a Positive-Definite Neural Density Matrix using the ansatz from Torlai and Melko, PRL 120, 240503 (2018).

Assumes a real dtype. The effect of the feature densities of the pure and mixed parts of the ansatz is discussed in Vicentini et al., PRL 122, 250503 (2019).
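As a sketch of how the feature densities translate into layer sizes, the following helper (a minimal illustration with a hypothetical name, not NetKet's actual implementation) computes the number of hidden and ancilla units for a given number of visible sites:

```python
# Hypothetical helper illustrating how alpha and beta set the layer widths.
# In the NDM ansatz, the pure part uses alpha * N hidden units and the
# mixed part uses beta * N ancilla units, where N = input.shape[-1].

def ndm_layer_widths(n_visible, alpha=1, beta=1):
    """Return (n_hidden, n_ancilla) for an NDM over n_visible sites."""
    n_hidden = int(alpha * n_visible)    # pure-part feature count
    n_ancilla = int(beta * n_visible)    # mixed-part feature count
    return n_hidden, n_ancilla

# Example: 16 spins with alpha=2, beta=1 gives 32 hidden and 16 ancilla units.
print(ndm_layer_widths(16, alpha=2, beta=1))  # -> (32, 16)
```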

Attributes
alpha: Union[float, int] = 1

The feature density for the pure part of the ansatz. The number of features is alpha * input.shape[-1].

beta: Union[float, int] = 1

The feature density for the mixed part of the ansatz. The number of features is beta * input.shape[-1].

precision: Any = None

numerical precision of the computation; see `jax.lax.Precision` for details.

use_ancilla_bias: bool = True

if True uses a bias in the dense layer coupling to the ancilla degrees of freedom (ancilla layer bias).

use_hidden_bias: bool = True

if True uses a bias in the dense layer (hidden layer bias).

use_visible_bias: bool = True

if True adds a bias to the input not passed through the nonlinear layer.

variables

Returns the variables in this module.

Return type

Mapping[str, Mapping[str, Any]]

Methods
activation()
bias_init(shape, dtype=jnp.float32)
kernel_init(shape, dtype=jnp.float32)
visible_bias_init(shape, dtype=jnp.float32)
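The default activation is log_cosh. A numerically stable way to evaluate it (a NumPy sketch for illustration; NetKet's own implementation may differ) uses the identity log cosh(x) = |x| + log1p(exp(-2|x|)) - log(2), which avoids overflow of cosh for large |x|:

```python
import numpy as np

def log_cosh(x):
    """Numerically stable log(cosh(x)).

    Uses log(cosh(x)) = |x| + log1p(exp(-2|x|)) - log(2); the naive
    np.log(np.cosh(x)) overflows already for moderately large |x|.
    """
    ax = np.abs(x)
    return ax + np.log1p(np.exp(-2.0 * ax)) - np.log(2.0)

# For large arguments, log_cosh(x) approaches |x| - log(2):
print(np.allclose(log_cosh(50.0), 50.0 - np.log(2.0)))  # -> True
```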