API
Module-wide re-exports
Turing.jl directly re-exports the entire public API of the following packages:
Please see the individual packages for their documentation.
Individual exports and re-exports
In this API documentation, for the sake of clarity, we have listed the module that actually defines each of the exported symbols. Note, however, that all of the following symbols are exported unqualified by Turing. That means, for example, you can just write
using Turing
@model function my_model() end
sample(my_model(), Prior(), 100)
instead of
DynamicPPL.@model function my_model() end
sample(my_model(), Turing.Inference.Prior(), 100)
even though Prior() is actually defined in the Turing.Inference module and @model in the DynamicPPL package.
Modelling
Exported symbol | Documentation | Description |
---|---|---|
@model | DynamicPPL.@model | Define a probabilistic model |
@varname | AbstractPPL.@varname | Generate a VarName from a Julia expression |
to_submodel | DynamicPPL.to_submodel | Define a submodel |
prefix | DynamicPPL.prefix | Prefix all variable names in a model with a given symbol |
LogDensityFunction | DynamicPPL.LogDensityFunction | A struct containing all information about how to evaluate a model. Mostly for advanced users |
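As a quick illustration of how these fit together, here is a minimal sketch; the model and variable names are purely illustrative:

```julia
using Turing

# @model (from DynamicPPL) turns this function into a probabilistic model;
# arguments passed in at construction time are treated as observations.
@model function gaussian_mean(x)
    μ ~ Normal(0, 1)          # prior on the mean
    for i in eachindex(x)
        x[i] ~ Normal(μ, 1)   # likelihood for each observation
    end
end

model = gaussian_mean([0.1, -0.3, 0.5])

# @varname builds a VarName from a Julia expression, e.g. for use with
# condition or fix further below.
vn = @varname(μ)
```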
Inference
Exported symbol | Documentation | Description |
---|---|---|
sample | StatsBase.sample | Sample from a model |
MCMCThreads | AbstractMCMC.MCMCThreads | Run MCMC using multiple threads |
MCMCDistributed | AbstractMCMC.MCMCDistributed | Run MCMC using multiple processes |
MCMCSerial | AbstractMCMC.MCMCSerial | Run MCMC without parallelism |
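For example, a model can be sampled with a single chain or with one of the parallel ensembles. This is a sketch with an illustrative model; MCMCThreads requires Julia to be started with multiple threads (e.g. julia --threads=4):

```julia
using Turing

@model function demo(x)
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1)
end

model = demo(1.5)

# One chain of 1,000 samples.
chain = sample(model, NUTS(), 1_000)

# Four chains drawn in parallel using threads; MCMCDistributed() and
# MCMCSerial() slot into the same position.
chains = sample(model, NUTS(), MCMCThreads(), 1_000, 4)
```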
Samplers
Exported symbol | Documentation | Description |
---|---|---|
Prior | Turing.Inference.Prior | Sample from the prior distribution |
MH | Turing.Inference.MH | Metropolis–Hastings |
Emcee | Turing.Inference.Emcee | Affine-invariant ensemble sampler |
ESS | Turing.Inference.ESS | Elliptical slice sampling |
Gibbs | Turing.Inference.Gibbs | Gibbs sampling |
HMC | Turing.Inference.HMC | Hamiltonian Monte Carlo |
SGLD | Turing.Inference.SGLD | Stochastic gradient Langevin dynamics |
SGHMC | Turing.Inference.SGHMC | Stochastic gradient Hamiltonian Monte Carlo |
PolynomialStepsize | Turing.Inference.PolynomialStepsize | Returns a function which generates polynomially decaying step sizes |
HMCDA | Turing.Inference.HMCDA | Hamiltonian Monte Carlo with dual averaging |
NUTS | Turing.Inference.NUTS | No-U-Turn Sampler |
IS | Turing.Inference.IS | Importance sampling |
SMC | Turing.Inference.SMC | Sequential Monte Carlo |
PG | Turing.Inference.PG | Particle Gibbs |
CSMC | Turing.Inference.CSMC | The same as PG |
RepeatSampler | Turing.Inference.RepeatSampler | A sampler that runs multiple times on the same variable |
externalsampler | Turing.Inference.externalsampler | Wrap an external sampler for use in Turing |
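Each of these is constructed and then passed to sample. The snippet below is a sketch with illustrative tuning parameters; note that the Gibbs constructor syntax has changed across Turing releases, so consult the linked documentation for the version you are using.

```julia
using Turing

mh   = MH()            # random-walk Metropolis–Hastings with a default proposal
hmc  = HMC(0.05, 10)   # leapfrog step size 0.05, 10 leapfrog steps
nuts = NUTS(0.65)      # target acceptance rate 0.65
pg   = PG(20)          # particle Gibbs with 20 particles

# Gibbs combines samplers for different blocks of variables. Recent releases
# take variable => sampler pairs; older releases instead took samplers tagged
# with symbols, e.g. Gibbs(HMC(0.05, 10, :μ), PG(20, :z)).
gibbs = Gibbs(:μ => NUTS(), :z => PG(20))

# Any of these can then be used as, e.g., sample(model, nuts, 1_000).
```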
Variational inference
See the variational inference tutorial for a walkthrough on how to use these.
Exported symbol | Documentation | Description |
---|---|---|
vi | AdvancedVI.vi | Perform variational inference |
ADVI | AdvancedVI.ADVI | Construct an instance of a VI algorithm |
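A minimal sketch, assuming the ADVI interface described in the variational inference tutorial; the AdvancedVI API has changed between releases, so the tutorial is the authoritative reference:

```julia
using Turing

@model function demo(x)
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1)
end

model = demo(1.5)

# ADVI(n, iters): n Monte Carlo samples per gradient step, iters optimisation steps.
advi = ADVI(10, 1_000)

# vi returns a variational approximation to the posterior that can be sampled from.
q = vi(model, advi)
posterior_draws = rand(q, 1_000)
```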
Automatic differentiation types
These are used to specify the automatic differentiation backend to use. See the AD guide for more information.
Exported symbol | Documentation | Description |
---|---|---|
AutoForwardDiff | ADTypes.AutoForwardDiff | ForwardDiff.jl backend |
AutoReverseDiff | ADTypes.AutoReverseDiff | ReverseDiff.jl backend |
AutoMooncake | ADTypes.AutoMooncake | Mooncake.jl backend |
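Gradient-based samplers accept one of these backends through their adtype keyword argument, for example (reusing a model like the ones sketched above):

```julia
using Turing

# ForwardDiff is the default backend in most configurations.
chain_fd = sample(model, NUTS(; adtype=AutoForwardDiff()), 1_000)

# ReverseDiff with a compiled tape, often faster for models with many parameters.
chain_rd = sample(model, NUTS(; adtype=AutoReverseDiff(; compile=true)), 1_000)
```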
Debugging
Turing.setprogress! — Function
setprogress!(progress::Bool)
Enable progress logging in Turing if progress is true, and disable it otherwise.
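For example, to silence the progress bar in non-interactive scripts or CI logs:

```julia
using Turing

Turing.setprogress!(false)
```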
Distributions
These distributions are defined in Turing.jl, but not in Distributions.jl.
Turing.Flat — Type
Flat()
The flat distribution is the improper distribution of real numbers that has the improper probability density function
\[f(x) = 1.\]
Turing.FlatPos — Type
FlatPos(l::Real)
The positive flat distribution with real-valued parameter l is the improper distribution of real numbers that has the improper probability density function
\[f(x) = \begin{cases} 0 & \text{if } x \leq l, \\ 1 & \text{otherwise}. \end{cases}\]
Turing.BinomialLogit — Type
BinomialLogit(n, logitp)
The Binomial distribution with logit parameterization characterizes the number of successes in a sequence of independent trials.
It has two parameters: n, the number of trials, and logitp, the logit of the probability of success in an individual trial, with the distribution
\[P(X = k) = {n \choose k}{(\text{logistic}(logitp))}^k (1 - \text{logistic}(logitp))^{n-k}, \quad \text{ for } k = 0,1,2, \ldots, n.\]
See also: Binomial
Turing.OrderedLogistic — Type
OrderedLogistic(η, c::AbstractVector)
The ordered logistic distribution with real-valued parameter η and cutpoints c has the probability mass function
\[P(X = k) = \begin{cases} 1 - \text{logistic}(\eta - c_1) & \text{if } k = 1, \\ \text{logistic}(\eta - c_{k-1}) - \text{logistic}(\eta - c_k) & \text{if } 1 < k < K, \\ \text{logistic}(\eta - c_{K-1}) & \text{if } k = K, \end{cases}\]
where K = length(c) + 1.
Turing.LogPoisson — Type
LogPoisson(logλ)
The Poisson distribution with logarithmic parameterization of the rate parameter describes the number of independent events occurring within a unit time interval, given the average rate of occurrence $\exp(\log\lambda)$.
The distribution has the probability mass function
\[P(X = k) = \frac{e^{k \cdot \log\lambda}}{k!} e^{-e^{\log\lambda}}, \quad \text{ for } k = 0,1,2,\ldots.\]
See also: Poisson
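These can be used like any other distribution on the right-hand side of a ~ statement. The model below is a hypothetical sketch using two of them; the cutpoints and variable names are illustrative.

```julia
using Turing

@model function example(y_ord, y_count)
    # Ordinal outcomes with fixed cutpoints, giving K = length(c) + 1 = 4 categories.
    η ~ Normal(0, 1)
    c = [-1.0, 0.0, 1.0]
    for i in eachindex(y_ord)
        y_ord[i] ~ OrderedLogistic(η, c)
    end

    # Count outcomes parameterised on the log scale.
    logλ ~ Normal(0, 1)
    for i in eachindex(y_count)
        y_count[i] ~ LogPoisson(logλ)
    end
end
```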
Tools to work with distributions
Exported symbol | Documentation | Description |
---|---|---|
I | LinearAlgebra.I | Identity matrix |
filldist | DistributionsAD.filldist | Create a product distribution from a distribution and integers |
arraydist | DistributionsAD.arraydist | Create a product distribution from an array of distributions |
NamedDist | DynamicPPL.NamedDist | A distribution that carries the name of the variable |
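For example, filldist and arraydist both produce a single product distribution, which allows a vector-valued ~ statement instead of a loop (a sketch with illustrative parameters):

```julia
using Turing

# Product of 10 iid Normal(0, 1) variables.
d_iid = filldist(Normal(0, 1), 10)

# Product of three Normals with different standard deviations.
d_arr = arraydist([Normal(0, σ) for σ in 1:3])

@model function demo_vec()
    x ~ filldist(Normal(0, 1), 10)   # x is sampled as a length-10 vector
end
```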
Predictions
Exported symbol | Documentation | Description |
---|---|---|
predict | StatsAPI.predict | Generate samples from the posterior predictive distribution |
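A typical pattern is to fit a model on observed data and then call predict on a copy of the model whose outcomes are missing. The sketch below assumes placeholder data x_train, y_train, and x_test:

```julia
using Turing

@model function linreg(x, y)
    β ~ Normal(0, 1)
    σ ~ truncated(Normal(0, 1); lower=0)
    for i in eachindex(x)
        y[i] ~ Normal(β * x[i], σ)
    end
end

# Fit on the training data.
chain = sample(linreg(x_train, y_train), NUTS(), 1_000)

# Predict at new inputs by passing missing targets of the right length.
y_missing = Vector{Union{Missing,Float64}}(missing, length(x_test))
predictions = predict(linreg(x_test, y_missing), chain)
```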
Querying model probabilities and quantities
Please see the generated quantities and probability interface guides for more information.
Exported symbol | Documentation | Description |
---|---|---|
returned | DynamicPPL.returned | Calculate additional quantities defined in a model |
pointwise_loglikelihoods | DynamicPPL.pointwise_loglikelihoods | Compute log likelihoods for each sample in a chain |
logprior | DynamicPPL.logprior | Compute log prior probability |
logjoint | DynamicPPL.logjoint | Compute log joint probability |
condition | AbstractPPL.condition | Condition a model on data |
decondition | AbstractPPL.decondition | Remove conditioning on data |
conditioned | DynamicPPL.conditioned | Return the conditioned values of a model |
fix | DynamicPPL.fix | Fix the value of a variable |
unfix | DynamicPPL.unfix | Unfix the value of a variable |
OrderedDict | OrderedCollections.OrderedDict | An ordered dictionary |
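The sketch below shows a few of these together on a toy model (names are illustrative); logprior and logjoint here are evaluated at explicit parameter values passed as a NamedTuple.

```julia
using Turing

@model function demo()
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1)
    return (twice_mean = 2μ,)   # an extra quantity recoverable via returned
end

model = demo()

# Condition on observed data; decondition(cond_model) reverses this.
cond_model = condition(model, (x = 1.5,))

logprior(cond_model, (μ = 0.2,))   # log p(μ = 0.2)
logjoint(cond_model, (μ = 0.2,))   # log p(μ = 0.2, x = 1.5)

# Quantities computed in the model's return statement, one per posterior sample.
chain = sample(cond_model, NUTS(), 500)
returned(cond_model, chain)
```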
Point estimates
See the mode estimation tutorial for more information.
Exported symbol | Documentation | Description |
---|---|---|
maximum_a_posteriori | Turing.Optimisation.maximum_a_posteriori | Find a maximum a posteriori (MAP) estimate for a model |
maximum_likelihood | Turing.Optimisation.maximum_likelihood | Find a maximum likelihood estimate (MLE) for a model |
MAP | Turing.Optimisation.MAP | Type to use with Optim.jl for MAP estimation |
MLE | Turing.Optimisation.MLE | Type to use with Optim.jl for MLE estimation |
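As a sketch on a toy model; see the mode estimation tutorial for how to inspect and post-process the results:

```julia
using Turing

@model function demo(x)
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1)
end

model = demo(1.5)

# Optimisation-based point estimates instead of sampling.
map_estimate = maximum_a_posteriori(model)
mle_estimate = maximum_likelihood(model)
```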