NetKet 3.1 (under development)
Added conversion methods `to_qobj()` to operators and variational states, which produce QuTiP's `Qobj` objects.
`nk.nn.activation.reim` has been added, which transforms a nonlinearity to act separately on the real and imaginary parts.
`reim_relu` has been added.
The default initializer for `netket.models.GCNN` has been changed.
NetKet 3.0 (23 August 2021)
The default initializer for `netket.nn.Dense` layers now matches the default used by `flax.linen`.
The default initializer for `netket.nn.DenseSymm` layers is now chosen in order to give variance 1 to every output channel.
NetKet 3.0b4 (17 August 2021)
`DenseSymm` now accepts a `mode` argument to specify whether the symmetries should be computed with a full dense matrix or with an FFT. The latter method is much faster for sufficiently large systems. Other kwargs have been added to satisfy the interface. The API changes are also reflected in `RBMSymm` and `GCNN`. #792
`MCState` now uses dispatch to select the relevant implementation of its algorithms, which can therefore be expanded and overridden without editing NetKet's source code. #804
`netket.utils.mpi_available` has been moved to `netket.utils.mpi.available` to provide a more consistent API (all MPI-related properties live in the same submodule). #827
`netket.logging.TBLog` has been renamed to `netket.logging.TensorBoardLog` for better readability. A deprecation warning is now issued if the older name is used. #827
When `MCState` initializes a model by calling `model.init`, the call is now jitted. This should speed up initialization for non-trivial models, but might break models that are not jit-invariant. #832
`operator.get_conn_padded` now supports arbitrarily-dimensioned bitstrings as input and reshapes the output accordingly. #834
NetKet's implementation of dataclasses now supports `pytree_node=True/False` on cached properties. #835
Plum version has been bumped to 1.5.1 to avoid broken versions (1.4, 1.5). #856.
Numba version 0.54 is now allowed #857.
NetKet 3.0b3 (published 9 July 2021)
The `utils.group` submodule provides utilities for geometrical and permutation groups.
`Lattice` (and its specialisations like `Grid`) use these to automatically construct the space groups of lattices, as well as their character tables for generating wave functions with broken symmetry. #724
Autoregressive neural networks, samplers, and masked linear layers have been added.
The `graph.Grid` class has been removed. `graph.Grid` will now return an instance of `graph.Lattice`, supporting the same API but with new functionality related to spatial symmetries. The `color_edges` optional keyword argument has been removed without deprecation. #724
`MCState.n_discard` has been renamed to `MCState.n_discard_per_chain` and the old binding has been deprecated. #739
The `QGTOnTheFly` option `centered=True` has been removed, because we are now convinced the two options yielded equivalent results. `QGTOnTheFly` now always behaves as if `centered=False`.
`networkX` has been replaced by `igraph`, yielding a considerable speedup for some graph-related operations. #729
The `netket.hilbert.random` module now uses dispatch (through `netket.utils.dispatch`) to select the correct implementation of `flip_state`. This makes it easy to define new Hilbert states and extend their functionality. #734
The `AbstractHilbert` interface is now much smaller, in order to also support continuous Hilbert spaces. Any functionality specific to discrete Hilbert spaces (what was previously supported) has been moved to a new abstract type, `nk.hilbert.DiscreteHilbert`. Any Hilbert space previously subclassing `nk.hilbert.AbstractHilbert` should be modified to subclass `nk.hilbert.DiscreteHilbert`. #800
With `normalize=False`, do not subtract the logarithm of the maximum value from the state. #705
Autoregressive networks now work with Fock space and give correct errors if the Hilbert space is not supported. #806
Autoregressive networks are now much (10x-100x) faster. #705
Do not throw errors when calling `operator.get_conn_flattened(states)` with a jax array. #764
Fix a bug with the driver progress bar when `step_size != 1`. #747
NetKet 3.0b2 (published 31 May 2021)
Group Equivariant Neural Networks have been added.
A permutation-invariant RBM and a permutation-invariant dense layer have been added.
Added the property `acceptance` to `SamplerState`, computing the MPI-enabled acceptance ratio. #592
Added `StateLog`, a new logger that stores the parameters of the model during the optimization in a folder or in a tar file. #645
A warning is now issued if NetKet detects that it is running under `mpirun` but MPI dependencies are not installed. #631
`operator.LocalOperator`s now do not return a zero matrix element on the diagonal if the whole diagonal is zero. #623
`logger.JSONLog` now automatically flushes at every iteration if it does not consume significant CPU cycles. #599
The interface of Stochastic Reconfiguration has been overhauled and made more modular. You can now specify the solver you wish to use, NetKet provides some dense solvers out of the box, and there are 3 different ways to compute the Quantum Geometric Tensor. Read the documentation to learn more about it. #674
Unless you specify the QGT implementation you wish to use with SR, we use an automatic heuristic based on your model and the solver to pick one. This might affect SR performance. #674
For all samplers, `n_chains` now sets the total number of chains across all MPI ranks. This is a breaking change compared to the old API, where `n_chains` would set the number of chains on a single MPI rank. It is still possible to set the number of chains per MPI rank by specifying `n_chains_per_rank` instead of `n_chains`. This change, while breaking, allows us to be consistent with the interface of `variational.MCState`, where `n_samples` is the total number of samples across MPI nodes.
`MetropolisSampler.reset_chain` has been renamed to `MetropolisSampler.reset_chains`. Likewise in the constructors of all samplers.
Briefly during development releases, `MetropolisSamplerState.acceptance_ratio` returned the percentage (not the ratio) of acceptance. `acceptance_ratio` is now deprecated in favour of the correct `acceptance`.
`models.Jastrow` now internally symmetrizes the matrix before computing its value. #644
`MCState.evaluate` has been renamed to `MCState.log_value`.
`nk.optimizer.SR` no longer accepts keyword arguments relative to the sparse solver; those should be passed inside the closure or via `functools.partial`. `nk.optimizer.sr.SRLazyGMRES` and the other lazy SR solvers have been deprecated and will soon be removed.
Parts of the `Lattice` API have been overhauled, with deprecations of several methods in favor of a consistent usage of `Lattice.position` for the real-space location of sites and `Lattice.basis_coords` for the location of sites in terms of basis vectors. `Lattice.sites` has been added, which provides a sequence of `LatticeSite` objects combining all site properties. Furthermore, `Lattice` now provides lookup of sites from their position via `id_from_position`, using a hashing scheme that works across periodic boundaries. #703 #715
`nk.variational` has been renamed to `nk.vqs`; the old name will be removed in a future release.
Fix `operator.BoseHubbard` usage under jax Hamiltonian sampling. #662
Fix `R->C` models with non-homogeneous parameters. #661
Fix an MPI compilation deadlock when computing expectation values. #655
Fix a bug preventing the creation of a `hilbert.Spin` Hilbert space with odd sites and even total spin.
NetKet 3.0b1 (published beta release)
Hilbert space constructors do not store the lattice graph anymore. As a consequence, the constructor does not accept the graph anymore.
`operator.LocalOperator` now defaults to real-valued matrix elements, except if you construct it with a complex-valued matrix. This also holds for the pre-defined operator constructors.
When performing algebraic operations (`*`, `-`, `+`) on pairs of `operator.LocalOperator`, the dtype of the result is computed using standard numpy promotion logic.
Doing an in-place operation (`+=`, `-=`, `*=`) on a real-valued operator will now fail if the other operand is complex. While this might seem annoying, it's useful to ensure that smaller dtypes such as `complex64` are preserved if the user desires to do so.
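These promotion and in-place rules mirror NumPy's own casting behaviour, which can be demonstrated directly with NumPy arrays (a minimal sketch; the arrays stand in for operators and are not NetKet's API):

```python
import numpy as np

# Standard NumPy promotion: combining a real and a complex dtype
# yields the wider complex dtype.
print(np.promote_types(np.float64, np.complex64))  # complex128

# In-place assignment cannot silently widen the left-hand dtype,
# mirroring the new behaviour of in-place operator arithmetic:
a = np.zeros(3, dtype=np.float64)
b = np.ones(3, dtype=np.complex64)
try:
    a += b  # real += complex
except TypeError:
    print("in-place real += complex rejected")
```

The analogy: a fresh `a + b` would promote to complex, but `a += b` must write back into the real-valued storage, so it is rejected rather than silently dropping the imaginary part.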
`AbstractMachine` has been removed. Its functionality is now split among the model itself, which is defined by the user, and `variational.MCState` for pure states or `variational.MCMixedState` for mixed states.
The model, in general, is composed of two functions, or an object with two functions: an `init(rng, sample_val)` function, accepting a `jax.random.PRNGKey` object and an input, and returning the parameters and the state of the model for that particular sample shape; and an `apply(params, samples, **kwargs)` function, evaluating the model for the given parameters and inputs.
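Schematically, such a pair might look like the following (an illustrative, framework-free sketch: the `init`/`apply` names follow the text above, but the body is a toy linear model using NumPy in place of jax, and an integer seed stands in for a `jax.random.PRNGKey`):

```python
import numpy as np

def init(seed, sample_val):
    # `sample_val` is used only for its trailing feature dimension,
    # so parameters match the shape of the inputs.
    n_features = sample_val.shape[-1]
    rng = np.random.default_rng(seed)
    params = {"w": 0.01 * rng.normal(size=(n_features,)), "b": 0.0}
    model_state = {}  # e.g. batch statistics; empty for this toy model
    return params, model_state

def apply(params, samples):
    # Evaluate the (log-)amplitude of each sample in the batch.
    return samples @ params["w"] + params["b"]

params, state = init(0, np.ones((4, 8)))
print(apply(params, np.ones((4, 8))).shape)  # (4,)
```

A Flax `linen.Module` provides exactly this `init`/`apply` pair, which is why models written in Flax plug in directly.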
Some models (previously machines), such as the RBM (Restricted Boltzmann Machine), NDM (Neural Density Matrix) or MPS (Matrix Product State ansatz), are available as pre-built models.
Machines, now called models, should be written using Flax or another jax framework.
Serialization and deserialization functionality has now been moved to `variational.MCState`, which supports the standard Flax interface through MsgPack. See the Flax docs for more information.
The `AbstractMachine.init_random_parameters` functionality has now been absorbed into `netket.vqs.VariationalState.init_parameters()`, which however has a different syntax.
Samplers now require the Hilbert space upon which they sample to be passed to the constructor. Also note that several keyword arguments of the samplers have changed, and new ones are available.
It's now possible to change a sampler's dtype, which controls the type of the output. By default they use double-precision samples (`np.float64`). Be wary of type-promotion issues with your models.
Samplers no longer take a machine as an argument.
Samplers are now immutable (frozen) `flax.struct.dataclass`es that only hold the sampling parameters. As a consequence, it is no longer possible to change their settings, such as `n_sweeps`, without creating a new sampler. If you wish to update only one parameter, you can construct the new sampler with the updated value by using the sampler's `replace` method.
Samplers are no longer stateful objects. Instead, they can construct an immutable state object with `sampler.init_state`, which can be passed to sampling functions such as `sampler.sample`, which now also return the updated state. However, unless you have particular use-cases, we advise you to use the variational state `MCState` instead.
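The frozen-dataclass pattern behind the new samplers can be illustrated with the standard library's `dataclasses` (a stand-in for `flax.struct.dataclass`; the `Sampler` class here is hypothetical, not NetKet's actual sampler):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Sampler:
    # Only sampling parameters are stored; no mutable chain state.
    n_chains: int = 16
    n_sweeps: int = 1

s = Sampler()
# Direct mutation fails on a frozen dataclass:
#   s.n_sweeps = 4  ->  raises FrozenInstanceError
# Instead, build an updated copy with one field changed:
s2 = replace(s, n_sweeps=4)
print(s.n_sweeps, s2.n_sweeps)  # 1 4
```

`flax.struct.dataclass` instances expose an analogous `.replace(...)` method, which is the mechanism the text above refers to for updating a single parameter.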
The Optimizer module has been overhauled, and now only re-exports the flax optim module. We advise not to use NetKet's optimizers, but instead to use optax.
The SR object is now only a set of options used to compute the SR matrix. The SR matrix, now called the `quantum_geometric_tensor`, can be obtained by calling `variational.MCState.quantum_geometric_tensor()`. Depending on the settings, this can be a lazy object.
`netket.Vmc` has been renamed to `netket.VMC`.
`netket.models.RBM` replaces the old `RBM` machine, but has real parameters by default.
As we rely on Jax, using dtypes such as `dtype=complex`, which are weak types, will sometimes lead to loss of precision because they might be converted to single precision. Use `np.complex128` instead if you want double precision when defining your models.