Getting Started

Installation

To use Turing, you need to install Julia first and then install Turing.

Install Julia

You will need to install Julia 1.3 or greater, which you can get from the official Julia website.

Install Turing.jl

Turing is an officially registered Julia package, so you can install a stable version of Turing by running the following in the Julia REPL:

using Pkg
Pkg.add("Turing")
You can check that all tests pass by running Pkg.test("Turing") (this might take a long time).
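If you prefer to keep Turing and its dependencies isolated from your global environment, you can also install it into a project-specific environment. A minimal sketch using the standard Pkg API (the project name here is just an example):

using Pkg
Pkg.activate("MyTuringProject")  # create/activate a local project environment
Pkg.add("Turing")                # install Turing into that environment only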
Example
Here’s a simple example showing Turing in action.
First, we can load the Turing and StatsPlots modules:
using Turing
using StatsPlots
Then, we define a simple Normal model with unknown mean and variance:
@model function gdemo(x, y)
    s² ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s²))
    x ~ Normal(m, sqrt(s²))
    return y ~ Normal(m, sqrt(s²))
end
gdemo (generic function with 2 methods)
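Before fitting, it can be helpful to sanity-check the model by sampling from the prior alone. A minimal sketch, using Turing's Prior sampler:

prior_chn = sample(gdemo(1.5, 2), Prior(), 1000)  # draws of s² and m from their priors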
Then we can run a sampler to collect results. In this case, it is a Hamiltonian Monte Carlo sampler:
chn = sample(gdemo(1.5, 2), NUTS(), 1000, progress=false)
┌ Info: Found initial step size
└ ϵ = 1.6
Chains MCMC chain (1000×14×1 Array{Float64, 3}):

Iterations        = 501:1:1500
Number of chains  = 1
Samples per chain = 1000
Wall duration     = 6.58 seconds
Compute duration  = 6.58 seconds
parameters        = s², m
internals         = lp, n_steps, is_accept, acceptance_rate, log_density, hamiltonian_energy, hamiltonian_energy_error, max_hamiltonian_energy_error, tree_depth, numerical_error, step_size, nom_step_size

Summary Statistics
  parameters      mean       std      mcse    ess_bulk   ess_tail      rhat   e ⋯
      Symbol   Float64   Float64   Float64     Float64    Float64   Float64     ⋯

          s²    1.9820    1.5319    0.0747    397.7761   511.5426    1.0017     ⋯
           m    1.1316    0.8406    0.0330    704.1545   388.9917    1.0036     ⋯
                                                                 1 column omitted

Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5%
      Symbol   Float64   Float64   Float64   Float64   Float64

          s²    0.6068    1.1141    1.5865    2.3275    5.8262
           m   -0.5652    0.6489    1.1367    1.6368    2.8209
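Rather than reading the printed table, you can also query the chain object directly. A short sketch, assuming the usual MCMCChains accessors:

mean(chn[:m])   # posterior mean of m as a plain number
mean(chn[:s²])  # posterior mean of s²
quantile(chn)   # quantile table for all parameters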
We can plot the results:
plot(chn)
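If you want to keep the figure, savefig from Plots (re-exported by StatsPlots) writes the current plot to disk; the file name here is just an example:

savefig("gdemo-plot.png")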
In this case, because we use the normal-inverse gamma distribution as a conjugate prior, we can compute its updated mean as follows:
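For reference, the standard normal-inverse-gamma update that the next two snippets implement (with prior mean \mu_0 = 0, as in the model above, and prior scale \lambda, which the code reads from m.σ) is

\mu_n = \frac{\lambda \mu_0 + N \bar{x}}{\lambda + N}, \qquad
\alpha_n = \alpha + \frac{N}{2}, \qquad
\beta_n = \beta + \frac{1}{2} \sum_{i=1}^{N} (x_i - \bar{x})^2 + \frac{N \lambda}{\lambda + N} \cdot \frac{(\bar{x} - \mu_0)^2}{2},

so the posterior mean of m is \mu_n and the posterior mean of s² is \beta_n / (\alpha_n - 1).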
s² = InverseGamma(2, 3)
m = Normal(0, 1)
data = [1.5, 2]
x_bar = mean(data)
N = length(data)

mean_exp = (m.σ * m.μ + N * x_bar) / (m.σ + N)
1.1666666666666667
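(Check: with μ₀ = 0, λ = 1, N = 2, and x̄ = 1.75, this is (0 + 2 · 1.75)/(1 + 2) = 3.5/3 ≈ 1.1667, consistent with the HMC estimate of m ≈ 1.13 above.)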
We can also compute the updated variance:
updated_alpha = shape(s²) + (N / 2)
updated_beta =
    scale(s²) +
    (1 / 2) * sum((data[n] - x_bar)^2 for n in 1:N) +
    (N * m.σ) / (N + m.σ) * ((x_bar)^2) / 2
variance_exp = updated_beta / (updated_alpha - 1)
2.0416666666666665
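(Check: α_n = 2 + 1 = 3 and β_n = 3 + 0.0625 + (2/3) · 1.75²/2 ≈ 4.0833, so β_n/(α_n − 1) ≈ 4.0833/2 ≈ 2.0417, consistent with the HMC estimate of s² above.)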
Finally, we can check if these expectations align with our HMC approximations from earlier. We can compute samples from a normal-inverse gamma following the equations given here.
function sample_posterior(alpha, beta, mean, lambda, iterations)
    samples = []
    for i in 1:iterations
        sample_variance = rand(InverseGamma(alpha, beta), 1)
        sample_x = rand(Normal(mean, sqrt(sample_variance[1]) / lambda), 1)
        samples = append!(samples, sample_x)
    end
    return samples
end
analytical_samples = sample_posterior(updated_alpha, updated_beta, mean_exp, 2, 1000);
density(analytical_samples; label="Posterior (Analytical)")
density!(chn[:m]; label="Posterior (HMC)")
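Beyond eyeballing the overlaid densities, you can compare summary statistics directly; both values below should land near the analytical posterior mean of about 1.17:

mean(analytical_samples)  # mean of the analytical posterior samples
mean(chn[:m])             # mean of the HMC samples of m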