# Public API: netket package

## Graph

- `netket.graph.AbstractGraph`: Abstract class for NetKet graph objects.
- `netket.graph.NetworkX`: Wrapper for a networkx graph.
- `netket.graph.Edgeless`: Construct a set graph (a collection of unconnected vertices).
- `netket.graph.Hypercube`: A hypercube lattice of side L in d dimensions.
- `netket.graph.Lattice`: A lattice built by translating a unit cell and adding edges between nearest-neighbour sites.
- `netket.graph.Chain`: A chain of L sites.
- `netket.graph.Grid`: A grid lattice of d dimensions, with possibly different sizes in each dimension.
- `netket.graph.SymmGroup`: Collection of symmetry operations acting on the sites of a graph (graph automorphisms).

## Hilbert

- `netket.hilbert.AbstractHilbert`: Abstract class for NetKet Hilbert-space objects.
- `netket.hilbert.Qubit`: Hilbert space obtained as the tensor product of local qubit states.
- `netket.hilbert.Spin`: Hilbert space obtained as the tensor product of local spin states.
- `netket.hilbert.CustomHilbert`: A custom Hilbert space with discrete local quantum numbers.
- `netket.hilbert.DoubledHilbert`: Superoperatorial Hilbert space for states living in the tensorised space ℋ⊗ℋ, encoded according to Choi's isomorphism.

## Operators

- `netket.operator.AbstractOperator`: Abstract class for quantum operators.
- `netket.operator.BoseHubbard`: An extended Bose-Hubbard model Hamiltonian, containing both on-site interactions and nearest-neighbour density-density interactions.
- `netket.operator.GraphOperator`: A graph-based quantum operator.
- `netket.operator.LocalOperator`: A custom local operator.
- `netket.operator.Ising`: The transverse-field Ising Hamiltonian $$-h\sum_i \sigma_i^{(x)} +J\sum_{\langle i,j\rangle} \sigma_i^{(z)}\sigma_j^{(z)}$$.
- `netket.operator.Heisenberg`: The Heisenberg Hamiltonian on a lattice.
- `netket.operator.PauliStrings`: A Hamiltonian consisting of a sum of products of Pauli operators.
- `netket.operator.LocalLiouvillian`: LocalLiouvillian super-operator acting on the DoubledHilbert (tensor-product) space ℋ⊗ℋ.

### Pre-defined operators

- `netket.operator.boson.create`: Builds the boson creation operator $$\hat{a}^\dagger$$ acting on the site-th site of the Hilbert space hilbert.
- `netket.operator.boson.destroy`: Builds the boson destruction operator $$\hat{a}$$ acting on the site-th site of the Hilbert space hilbert.
- `netket.operator.boson.number`: Builds the number operator $$\hat{a}^\dagger\hat{a}$$ acting on the site-th site of the Hilbert space hilbert.
- `netket.operator.boson.proj`: Builds the projector $$|n\rangle\langle n |$$ acting on the site-th site of the Hilbert space hilbert, collapsing onto the state with n bosons.
- `netket.operator.spin.sigmax`: Builds the $$\sigma^x$$ operator acting on the site-th site of the Hilbert space hilbert.
- `netket.operator.spin.sigmay`: Builds the $$\sigma^y$$ operator acting on the site-th site of the Hilbert space hilbert.
- `netket.operator.spin.sigmaz`: Builds the $$\sigma^z$$ operator acting on the site-th site of the Hilbert space hilbert.
- `netket.operator.spin.sigmap`: Builds the $$\sigma^{+} = \sigma^x + i \sigma^y$$ operator acting on the site-th site of the Hilbert space hilbert.
- `netket.operator.spin.sigmam`: Builds the $$\sigma^{-} = \sigma^x - i \sigma^y$$ operator acting on the site-th site of the Hilbert space hilbert.

## Exact solvers

- `netket.exact.full_ed`: Computes all eigenvalues and, optionally, eigenvectors of a Hermitian operator by full diagonalization.
- `netket.exact.lanczos_ed`: Computes the first_n smallest eigenvalues and, optionally, eigenvectors of a Hermitian operator using scipy.sparse.linalg.eigsh.
- `netket.exact.steady_state`: Computes the numerically exact steady state of a Lindblad master equation.

## Sampler

### Generic API

These functions can be used to interact with samplers.

- `netket.sampler.sampler_state(sampler, …)`: Creates the structure holding the state of the sampler.
- `netket.sampler.reset(sampler, machine, …)`: Resets the state of the sampler.
- `netket.sampler.sample_next(sampler, machine, …)`: Samples the next state in the Markov chain.
- `netket.sampler.sample(sampler, machine, …)`: Samples chain_length elements along the chains.
- `netket.sampler.samples(sampler, machine, …)`: Returns a generator sampling chain_length elements along the chains.

### List of Samplers

This is a list of all available samplers. Please note that samplers with Numpy in their name are implemented in Numpy rather than in pure jax, and they convert the state between numpy and jax at every sampling step. If you are using GPUs, this conversion can be very costly. On CPUs the conversion is cheap, but the dispatch cost of jax is considerable for small systems.

In general those samplers have the same asymptotic cost as the Jax samplers, but a much higher overhead for small to moderate (for GPUs) system sizes.

This is because it is not possible to implement all transition rules in Jax.

- `netket.sampler.Sampler`: Abstract base class for all samplers.
- `netket.sampler.ExactSampler`: This sampler generates i.i.d. samples from the exact distribution.
- `netket.sampler.MetropolisSampler`: Metropolis-Hastings sampler for a Hilbert space according to a specific transition rule.
- `netket.sampler.MetropolisSamplerNumpy`: Metropolis-Hastings sampler for a Hilbert space according to a specific transition rule, executed on CPU through Numpy.
- `netket.sampler.MetropolisPtSampler`: Metropolis-Hastings sampler with parallel tempering.
- `netket.sampler.MetropolisLocal`: Sampler acting on one local degree of freedom.
- `netket.sampler.MetropolisExchange`: This sampler acts locally on two local degrees of freedom $$s_i$$ and $$s_j$$, and proposes a new state $$s_1 \dots s^\prime_i \dots s^\prime_j \dots s_N$$, where in general $$s^\prime_i \neq s_i$$ and $$s^\prime_j \neq s_j$$.
- `netket.sampler.MetropolisHamiltonian`: Sampling based on the off-diagonal elements of a Hamiltonian (or a generic operator).
- `netket.sampler.MetropolisLocalPt`: Sampler acting on one local degree of freedom, with parallel tempering.
- `netket.sampler.MetropolisExchangePt`: This sampler acts locally on two local degrees of freedom $$s_i$$ and $$s_j$$ as MetropolisExchange does, with parallel tempering.

### Transition Rules

These are the transition rules that can be used with the Metropolis samplers. Rules with Numpy in their name can only be used with netket.sampler.MetropolisSamplerNumpy.

- Base class for transition rules of Metropolis samplers, such as Local, Exchange, Hamiltonian and several others.
- A transition rule acting on the local degree of freedom.
- A rule exchanging the state on a random couple of sites, chosen from a list of possible couples (clusters).
- A rule proposing moves according to the terms in an operator.
- A rule for the Numpy sampler backend proposing moves according to the terms in an operator.

### Internal State

These structures hold the state of the sampler.

- Base class holding the state of a sampler.
- State for a Metropolis sampler.

## Pre-built models

This sub-module contains several pre-built models to be used as neural quantum states.

- `netket.models.RBM`: A Restricted Boltzmann Machine, equivalent to a 2-layer FFNN with a nonlinear activation function in between.
- `netket.models.RBMModPhase`: A fully connected Restricted Boltzmann Machine (RBM) with real-valued parameters.
- `netket.models.RBMMultiVal`: A fully connected Restricted Boltzmann Machine (see netket.models.RBM) suitable for large local Hilbert spaces.
- `netket.models.RBMSymm`: A symmetrized RBM using the netket.nn.DenseSymm layer internally.
- `netket.models.Jastrow`: Jastrow wave function $$\Psi(s) = \exp(\sum_{ij} s_i W_{ij} s_j)$$.
- `netket.models.MPSPeriodic`: A periodic Matrix Product State (MPS) for a quantum state of discrete degrees of freedom, wrapped as a Jax machine.
- `netket.models.NDM`: Encodes a positive-definite neural density matrix using the ansatz from Torlai and Melko, PRL 120, 240503 (2018).
- `netket.models.GCNN`: Implements a Group Convolutional Neural Network (G-CNN) that outputs a wavefunction invariant under a specified symmetry group.

## Model tools

This sub-module wraps and re-exports flax.nn. Read more about the design goals of this module in the Flax README.

- `netket.nn.Module`: Base class for all neural network modules.

### Linear Modules

- `netket.nn.Dense`: A linear transformation applied over the last dimension of the input.
- `netket.nn.DenseGeneral`: A linear transformation with flexible axes.
- `netket.nn.DenseSymm`: A symmetrized linear transformation applied over the last dimension of the input.
- `netket.nn.DenseEquivariant`: Implements a G-convolution that acts on a feature map of symmetry poses of shape [batch_size, n_symm*in_features] and returns a feature map of poses of shape [batch_size, n_symm*out_features].
- `netket.nn.Conv`: Convolution module wrapping lax.conv_general_dilated.
- `netket.nn.Embed`: Embedding module.

### Activation functions

- `celu(x[, alpha])`: Continuously-differentiable exponential linear unit activation.
- `elu(x[, alpha])`: Exponential linear unit activation function.
- `gelu(x[, approximate])`: Gaussian error linear unit activation function.
- `glu(x[, axis])`: Gated linear unit activation function.
- `log_sigmoid(x)`: Log-sigmoid activation function.
- `log_softmax(x[, axis])`: Log-Softmax function.
- `relu(x)`: Rectified linear unit activation function.
- `sigmoid(x)`: Sigmoid activation function.
- `soft_sign(x)`: Soft-sign activation function.
- `softmax(x[, axis])`: Softmax function.
- `softplus(x)`: Softplus activation function.
- `silu(x)`: SiLU activation function.

## Variational State Interface

- `netket.variational.VariationalState`: Abstract class for variational states representing either pure states or mixed quantum states.
- `netket.variational.MCState`: Variational state for a variational neural quantum state.
- `netket.variational.MCMixedState`: Variational state for a mixed variational neural quantum state.

## Optimizer

This module provides the following functionalities:

- `netket.optimizer.SR`: Constructs the structure holding the parameters for the Stochastic Reconfiguration / natural-gradient method.
- `netket.optimizer.SRLazyCG`: Computes x = ⟨S⟩⁻¹⟨F⟩ using an iterative conjugate-gradient method.
- `netket.optimizer.SRLazyGMRES`: Computes x = ⟨S⟩⁻¹⟨F⟩ using an iterative GMRES method.
- `netket.optimizer.sr.LazySMatrix`: Lazy representation of an S matrix behaving like a linear operator.

This module also provides some optimisers from optax. Check it out for up-to-date information on the available optimisers.

Warning

Even though the optimisers in netket.optimizer are optax optimisers, they have slightly different names (they are capitalised) and their arguments have been rearranged and renamed. This was chosen in order not to break our API from previous versions.

In general, we advise you to use optax directly, as it is much more powerful, provides more optimisers, and makes it extremely easy to use step-dependent schedulers.

- `netket.optimizer.Adam`: Adam optimizer.
- `netket.optimizer.AdaGrad`: AdaGrad optimizer.
- `netket.optimizer.Sgd`: Stochastic gradient descent optimizer.
- `netket.optimizer.Momentum`: Momentum-based optimizer.
- `netket.optimizer.RmsProp`: RMSProp optimizer.

## Optimization drivers

These are the optimization drivers already implemented in NetKet:

- `netket.driver.AbstractVariationalDriver`: Abstract base class for NetKet Variational Monte Carlo drivers.
- `netket.driver.VMC`: Energy minimization using Variational Monte Carlo (VMC).
- `netket.driver.SteadyState`: Steady-state driver minimizing $$L^\dagger L$$.

## Logging output

These are the loggers that can be used with the optimization drivers.

- `netket.logging.JsonLog`: JSON logger that can be passed with the keyword argument logger to Monte Carlo drivers in order to serialize the output data of the simulation.
- `netket.logging.RuntimeLog`: Runtime logger that can be passed with the keyword argument logger to Monte Carlo drivers in order to serialize the output data of the simulation.
- `netket.logging.TBLog`: Creates a tensorboard logger using tensorboardX's SummaryWriter.

## Utils

Utility functions and classes.

- `netket.utils.HashableArray`: Wraps a numpy or jax array to make it hashable and equality comparable (necessary since a well-defined hashable object must satisfy hash(obj1) == hash(obj2) whenever obj1 == obj2).

## Callbacks

These callbacks can be used with the optimisation drivers.

- `netket.callbacks.EarlyStopping`: A simple callback to stop NetKet if there are no more improvements in the training.
- `netket.callbacks.Timeout`: A simple callback to stop NetKet after some time has passed.