Public API: netket package

Graph

netket.graph.AbstractGraph

Abstract class for NetKet graph objects.

netket.graph.Graph

A simple implementation of Graph based on an external graph library.

netket.graph.Edgeless

Construct a set graph (collection of unconnected vertices).

netket.graph.Hypercube

Constructs a hypercubic lattice with equal side length in all dimensions.

netket.graph.Lattice

A lattice built by periodic arrangement of a given unit cell.

netket.graph.lattice.LatticeSite

Contains information about a single Lattice site.

netket.graph.Chain

Constructs a chain of length sites.

netket.graph.Grid

Constructs a hypercubic lattice given its extent in all dimensions.

Hilbert

netket.hilbert.AbstractHilbert

Abstract class for NetKet Hilbert objects.

netket.hilbert.CustomHilbert

A custom Hilbert space with discrete local quantum numbers.

netket.hilbert.DoubledHilbert

Superoperatorial Hilbert space for states living in the tensorised space \(\hat{H}\otimes \hat{H}\), encoded according to Choi’s isomorphism.

netket.hilbert.Fock

Hilbert space obtained as tensor product of local Fock basis.

netket.hilbert.Qubit

Hilbert space obtained as tensor product of local qubit states.

netket.hilbert.Spin

Hilbert space obtained as tensor product of local spin states.

Operators

netket.operator.AbstractOperator

Abstract class for quantum Operators.

netket.operator.DiscreteOperator

This class is the base class for operators defined on a discrete Hilbert space.

netket.operator.BoseHubbard

An extended Bose-Hubbard model Hamiltonian operator, containing both on-site interactions and nearest-neighbor density-density interactions.

netket.operator.GraphOperator

A graph-based quantum operator.

netket.operator.LocalOperator

A custom local operator.

netket.operator.Ising

The Transverse-Field Ising Hamiltonian \(-h\sum_i \sigma_i^{(x)} +J\sum_{\langle i,j\rangle} \sigma_i^{(z)}\sigma_j^{(z)}\).

netket.operator.Heisenberg

The Heisenberg hamiltonian on a lattice.

netket.operator.PauliStrings

A Hamiltonian consisting of the sum of products of Pauli operators.

netket.operator.LocalLiouvillian

LocalLiouvillian super-operator, acting on the DoubledHilbert (tensor product) space ℋ⊗ℋ.

Pre-defined operators

netket.operator.boson.create

Builds the boson creation operator \(\hat{a}^\dagger\) acting on the site-th site of the Hilbert space hilbert.

netket.operator.boson.destroy

Builds the boson destruction operator \(\hat{a}\) acting on the site-th site of the Hilbert space hilbert.

netket.operator.boson.number

Builds the number operator \(\hat{a}^\dagger\hat{a}\) acting on the site-th site of the Hilbert space hilbert.

netket.operator.boson.proj

Builds the projector operator \(|n\rangle\langle n |\) acting on the site-th site of the Hilbert space hilbert, projecting onto the state with n bosons.

netket.operator.spin.sigmax

Builds the \(\sigma^x\) operator acting on the site-th site of the Hilbert space hilbert.

netket.operator.spin.sigmay

Builds the \(\sigma^y\) operator acting on the site-th site of the Hilbert space hilbert.

netket.operator.spin.sigmaz

Builds the \(\sigma^z\) operator acting on the site-th site of the Hilbert space hilbert.

netket.operator.spin.sigmap

Builds the \(\sigma^{+} = \frac{1}{2}(\sigma^x + i \sigma^y)\) operator acting on the site-th site of the Hilbert space hilbert.

netket.operator.spin.sigmam

Builds the \(\sigma^{-} = \frac{1}{2}(\sigma^x - i \sigma^y)\) operator acting on the site-th site of the Hilbert space hilbert.

Exact solvers

netket.exact.full_ed

Computes all eigenvalues and, optionally, eigenvectors of a Hermitian operator by full diagonalization.

netket.exact.lanczos_ed

Computes the first_n smallest eigenvalues and, optionally, eigenvectors of a Hermitian operator using scipy.sparse.linalg.eigsh().

netket.exact.steady_state

Computes the numerically exact steady state of a Lindblad master equation.

Sampler

Generic API

These functions can be used to interact with samplers.

netket.sampler.sampler_state(sampler, …)

Creates the structure holding the state of the sampler.

netket.sampler.reset(sampler, machine, …)

Resets the state of the sampler.

netket.sampler.sample_next(sampler, machine, …)

Samples the next state in the Markov chain.

netket.sampler.sample(sampler, machine, …)

Samples chain_length elements along the chains.

netket.sampler.samples(sampler, machine, …)

Returns a generator sampling chain_length elements along the chains.

List of Samplers

This is a list of all available samplers. Please note that samplers with Numpy in their name are implemented in Numpy rather than in pure jax, and convert the state between numpy and jax at every sampling step. If you are using GPUs, this conversion can be very costly. On CPUs the conversion is cheap, but the dispatch cost of jax is considerable for small systems.

In general those samplers have the same asymptotic cost as the Jax samplers, but a much higher overhead for small to moderate (for GPUs) system sizes.

This is because it is not possible to implement all transition rules in Jax.

netket.sampler.Sampler

Abstract base class for all samplers.

netket.sampler.ExactSampler

This sampler generates i.i.d. samples from the exact distribution.

netket.sampler.MetropolisSampler

Metropolis-Hastings sampler for a Hilbert space according to a specific transition rule.

netket.sampler.MetropolisSamplerNumpy

Metropolis-Hastings sampler for a Hilbert space according to a specific transition rule, executed on CPU through Numpy.

netket.sampler.MetropolisPtSampler

Metropolis-Hastings with Parallel Tempering sampler.

netket.sampler.MetropolisLocal

Sampler acting on one local degree of freedom.

netket.sampler.MetropolisExchange

This sampler acts locally only on two local degrees of freedom \(s_i\) and \(s_j\), and proposes a new state: \(s_1 \dots s^\prime_i \dots s^\prime_j \dots s_N\), where in general \(s^\prime_i \neq s_i\) and \(s^\prime_j \neq s_j\).

netket.sampler.MetropolisHamiltonian

Sampling based on the off-diagonal elements of a Hamiltonian (or a generic Operator).

netket.sampler.MetropolisLocalPt

Sampler acting on one local degree of freedom.

netket.sampler.MetropolisExchangePt

This sampler acts locally only on two local degrees of freedom \(s_i\) and \(s_j\), and proposes a new state: \(s_1 \dots s^\prime_i \dots s^\prime_j \dots s_N\), where in general \(s^\prime_i \neq s_i\) and \(s^\prime_j \neq s_j\).

netket.sampler.ARDirectSampler

Direct sampler for autoregressive neural networks.

Transition Rules

Those are the transition rules that can be used with the Metropolis Sampler. Rules with Numpy in their name can only be used with netket.sampler.MetropolisSamplerNumpy.

netket.sampler.MetropolisRule(*args[, …])

Base class for Transition rules of Metropolis, such as Local, Exchange, Hamiltonian and several others.

netket.sampler.rules.LocalRule()

A transition rule acting on the local degree of freedom.

netket.sampler.rules.ExchangeRule(*[, …])

A Rule exchanging the state on a random couple of sites, chosen from a list of possible couples (clusters).

netket.sampler.rules.HamiltonianRule(operator)

Rule proposing moves according to the terms in an operator.

netket.sampler.rules.HamiltonianRuleNumpy(…)

Rule for Numpy sampler backend proposing moves according to the terms in an operator.

netket.sampler.rules.CustomRuleNumpy(operator)

A custom transition rule for the Numpy sampler backend.

Internal State

These structures hold the state of the sampler.

netket.sampler.SamplerState(*args[, …])

Base class holding the state of a sampler.

netket.sampler.MetropolisSamplerState(*args)

State for a Metropolis sampler.

Pre-built models

This sub-module contains several pre-built models to be used as neural quantum states.

netket.models.RBM

A Restricted Boltzmann Machine, equivalent to a 2-layer FFNN with a nonlinear activation function in between.

netket.models.RBMModPhase

A fully connected Restricted Boltzmann Machine (RBM) with real-valued parameters.

netket.models.RBMMultiVal

A fully connected Restricted Boltzmann Machine (see netket.models.RBM) suitable for large local Hilbert spaces.

netket.models.RBMSymm

A symmetrized RBM using the netket.nn.DenseSymm layer internally.

netket.models.Jastrow

Jastrow wave function \(\Psi(s) = \exp(\sum_{ij} s_i W_{ij} s_j)\).

netket.models.MPSPeriodic

A periodic Matrix Product State (MPS) for a quantum state of discrete degrees of freedom, wrapped as Jax machine.

netket.models.NDM

Encodes a Positive-Definite Neural Density Matrix using the ansatz from Torlai and Melko, PRL 120, 240503 (2018).

netket.models.GCNN

Implements a Group Convolutional Neural Network (G-CNN) that outputs a wavefunction that is invariant over a specified symmetry group.

netket.models.AbstractARNN

Base class for autoregressive neural networks.

netket.models.ARNNDense

Autoregressive neural network with dense layers.

netket.models.ARNNConv1D

Autoregressive neural network with 1D convolution layers.

netket.models.ARNNConv2D

Autoregressive neural network with 2D convolution layers.

netket.models.FastARNNConv1D

Fast autoregressive neural network with 1D convolution layers.

netket.models.FastARNNConv2D

Fast autoregressive neural network with 2D convolution layers.

Model tools

This sub-module wraps and re-exports flax.nn. Read more about the design goals of this module in the Flax README.

netket.nn.Module

Base class for all neural network modules.

Linear Modules

netket.nn.Dense

A linear transformation applied over the last dimension of the input.

netket.nn.DenseGeneral

A linear transformation with flexible axes.

netket.nn.DenseSymm

Implements a projection onto a symmetry group.

netket.nn.DenseEquivariant

A group convolution operation that is equivariant over a symmetry group.

netket.nn.Conv

Convolution Module wrapping lax.conv_general_dilated.

netket.nn.Embed

Embedding Module.

netket.nn.MaskedDense1D

1D linear transformation module with mask for autoregressive NN.

netket.nn.MaskedConv1D

1D convolution module with mask for autoregressive NN.

netket.nn.MaskedConv2D

2D convolution module with mask for autoregressive NN.

Activation functions

celu(x[, alpha])

Continuously-differentiable exponential linear unit activation.

elu(x[, alpha])

Exponential linear unit activation function.

gelu(x[, approximate])

Gaussian error linear unit activation function.

glu(x[, axis])

Gated linear unit activation function.

log_sigmoid(x)

Log-sigmoid activation function.

log_softmax(x[, axis])

Log-Softmax function.

relu(x)

Rectified linear unit activation function.

sigmoid(x)

Sigmoid activation function.

soft_sign(x)

Soft-sign activation function.

softmax(x[, axis])

Softmax function.

softplus(x)

Softplus activation function.

swish(x)

SiLU activation function.

log_cosh(x)

Logarithm of the hyperbolic cosine, implemented in a numerically stable way.

reim_relu(x)

relu applied separately to the real and imaginary parts of the input.

reim_selu(x)

selu applied separately to the real and imaginary parts of the input.

Variational State Interface

netket.vqs.VariationalState

Abstract class for variational states representing either pure states or mixed quantum states.

netket.vqs.MCState

Variational State for a Variational Neural Quantum State.

netket.vqs.MCMixedState

Variational State for a Mixed Variational Neural Quantum State.

Optimizer Module

This module provides some optimisers, implementations of the {ref}`Quantum Geometric Tensor <QGT_and_SR>` and preconditioners such as SR.

Optimizers

Optimizers in NetKet are simple wrappers of optax optimizers. If you want to write a custom optimizer or use more advanced ones, we suggest you have a look at the optax documentation.

Check it out for up-to-date information on available optimisers.

Warning

Even though the optimisers in netket.optimizer are optax optimisers, they have slightly different names (they are capitalised) and their arguments have been rearranged and renamed. This was done in order not to break our API from previous versions.

In general, we advise you to use optax directly, as it is much more powerful, provides more optimisers, and makes it extremely easy to use step-dependent schedulers.

netket.optimizer.Adam

Adam Optimizer.

netket.optimizer.AdaGrad

AdaGrad Optimizer.

netket.optimizer.Sgd

Stochastic Gradient Descent Optimizer.

netket.optimizer.Momentum

Momentum-based Optimizer.

netket.optimizer.RmsProp

RMSProp optimizer.

Preconditioners

This module also provides an implementation of the Stochastic Reconfiguration/Natural gradient preconditioner.

netket.optimizer.SR

Construct the structure holding the parameters for using the Stochastic Reconfiguration/Natural gradient method.

Quantum Geometric Tensor

It also provides the following implementation of the quantum geometric tensor:

netket.optimizer.qgt.QGTAuto

Automatically selects the ‘best’ Quantum Geometric Tensor computing format according to a rather untested heuristic.

netket.optimizer.qgt.QGTOnTheFly

Lazy representation of an S Matrix computed by performing 2 jvp and 1 vjp products, using the variational state’s model, the samples that have already been computed, and the vector.

netket.optimizer.qgt.QGTJacobianPyTree

Semi-lazy representation of an S Matrix where the Jacobian O_k is precomputed and stored as a PyTree.

netket.optimizer.qgt.QGTJacobianDense

Semi-lazy representation of an S Matrix where the Jacobian O_k is precomputed and stored as a dense matrix.

Dense solvers

And the following dense solvers for Stochastic Reconfiguration:

netket.optimizer.solver.svd

Solve the linear system using Singular Value Decomposition.

netket.optimizer.solver.cholesky

Solve the linear system using a Cholesky decomposition.

netket.optimizer.solver.LU

Solve the linear system using an LU decomposition.

netket.optimizer.solver.solve

Solve the linear system with a generic dense solver.

Optimization drivers

These are the optimization drivers already implemented in NetKet:

netket.driver.AbstractVariationalDriver

Abstract base class for NetKet Variational Monte Carlo drivers.

netket.driver.VMC

Energy minimization using Variational Monte Carlo (VMC).

netket.driver.SteadyState

Steady-state driver minimizing L^†L.

Logging output

These are the loggers that can be used with the optimization drivers.

netket.logging.RuntimeLog

Runtime logger that can be passed with the keyword argument logger to Monte Carlo drivers in order to serialize the output data of the simulation.

netket.logging.JsonLog

JSON logger that can be passed with the keyword argument logger to Monte Carlo drivers in order to serialize the output data of the simulation.

netket.logging.StateLog

A logger which serializes the variables of the variational state during a run.

netket.logging.TensorBoardLog

Creates a tensorboard logger using tensorboardX’s SummaryWriter.

Utils

Utility functions and classes.

netket.utils.HashableArray

This class wraps a numpy or jax array in order to make it hashable and equality comparable (which is necessary since a well-defined hashable object needs to satisfy hash(obj1) == hash(obj2) whenever obj1 == obj2).

Callbacks

These callbacks can be used with the optimisation drivers.

netket.callbacks.EarlyStopping

A simple callback to stop NetKet if there are no more improvements in the training.

netket.callbacks.Timeout

A simple callback to stop NetKet after some time has passed.