class netket.models.RBMSymm(symmetries, dtype=<class 'numpy.float64'>, activation=<function log_cosh>, alpha=1, use_hidden_bias=True, use_visible_bias=True, precision=None, kernel_init=<function normal.<locals>.init>, hidden_bias_init=<function normal.<locals>.init>, visible_bias_init=<function normal.<locals>.init>, parent=<flax.linen.module._Sentinel object>, name=None)[source]

Bases: flax.linen.module.Module

A symmetrized RBM using the netket.nn.DenseSymm layer internally.

alpha: Union[float, int] = 1

The feature density. The number of hidden features equals alpha * input.shape[-1].

precision: Any = None

Numerical precision of the computation; see `jax.lax.Precision` for details.

use_hidden_bias: bool = True

If True, the dense layer uses a bias (hidden-layer bias).

use_visible_bias: bool = True

If True, a bias is added directly to the input; this bias is not passed through the nonlinear layer.


variables

Returns the variables in this module.

Return type

Mapping[str, Mapping[str, Any]]

kernel_init(shape, dtype=<class 'jax._src.numpy.lax_numpy.float64'>)
visible_bias_init(shape, dtype=<class 'jax._src.numpy.lax_numpy.float64'>)
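
To illustrate what this model computes, the following is a minimal numpy sketch of a translation-symmetrized RBM log-amplitude on a 1D chain with periodic boundaries. It is not NetKet's implementation (which uses the netket.nn.DenseSymm layer and supports arbitrary permutation groups); the function and parameter names here are purely illustrative. The `alpha` rows of `W` play the role of the feature density, and the visible bias bypasses the nonlinearity, matching the attribute descriptions above.

```python
import numpy as np

def log_cosh(x):
    # Numerically stable log(cosh(x)) = |x| + log(1 + exp(-2|x|)) - log(2)
    ax = np.abs(x)
    return ax + np.log1p(np.exp(-2.0 * ax)) - np.log(2.0)

def rbm_symm_logpsi(sigma, W, hidden_bias, visible_bias):
    """Log-amplitude of a translation-symmetrized RBM (illustrative sketch).

    sigma        : (N,) spin configuration on a periodic 1D chain
    W            : (alpha, N) kernel, one row per symmetry-distinct feature
    hidden_bias  : (alpha,) hidden-layer bias (use_hidden_bias=True)
    visible_bias : scalar visible bias (use_visible_bias=True)
    """
    N = sigma.shape[0]
    # Visible-bias term: added to the input, not passed through log_cosh.
    out = visible_bias * sigma.sum()
    # Symmetrize by summing the features over all N cyclic translations.
    for t in range(N):
        shifted = np.roll(sigma, t)
        theta = W @ shifted + hidden_bias
        out += log_cosh(theta).sum()
    return out
```

Because the features are summed over the whole translation group, the resulting log-amplitude is invariant under translating the input configuration, which is the point of the symmetrization.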