
Bayesian inference with probabilistic programming

Expressive

Turing models are easy to write and communicate, with syntax that is close to the mathematical specification of the model.

General-purpose

Turing supports models with discrete parameters and stochastic control flow.
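
As a minimal sketch (not taken from the Turing documentation; the model and names are illustrative), the snippet below mixes a discrete latent variable with control flow that branches on its value, and samples it with particle Gibbs, since gradient-based samplers cannot handle discrete parameters:

using Turing

@model function coin_switch(y)
    z ~ Bernoulli(0.5)          # discrete latent variable
    p = z == 1 ? 0.9 : 0.1      # control flow depends on the random draw
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)
    end
end

# Particle Gibbs (PG) handles discrete parameters; NUTS would not.
chain = sample(coin_switch([true, true, false, true]), PG(20), 500)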

Composable

Turing is written entirely in Julia and is interoperable with Julia's powerful package ecosystem.
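
For example (a sketch assuming MCMCChains' Tables.jl support and the StatsPlots recipes; `center` and `demo` are illustrative names), ordinary Julia functions can be called inside a model, and the resulting chain flows straight into DataFrames.jl and StatsPlots.jl:

using Turing, LinearAlgebra
using DataFrames, StatsPlots

# Any plain Julia function can be used inside a model.
center(v) = v .- sum(v) / length(v)

@model function demo(x, y)
    β ~ Normal(0, 1)
    σ ~ truncated(Normal(0, 1); lower=0)
    y ~ MvNormal(β .* center(x), σ^2 * I)
end

chain = sample(demo(randn(20), randn(20)), NUTS(), 500)
df = DataFrame(chain)   # chains implement the Tables.jl interface
plot(chain)             # StatsPlots recipes for MCMC chains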

Get Started
using Turing
using LinearAlgebra   # the identity matrix I used in the likelihood
using StatsPlots      # plotting recipes for the sampled chains

@model function linear_regression(x)
    # Priors
    α ~ Normal(0, 1)
    β ~ Normal(0, 1)
    σ² ~ truncated(Cauchy(0, 3); lower=0)

    # Likelihood
    μ = α .+ β .* x
    y ~ MvNormal(μ, σ² * I)
end
# Data
x_data = rand(10)
y_data = rand(10)

# Condition model on data
model = linear_regression(x_data) | (; y = y_data)
# Sample from the prior
prior = sample(model, Prior(), 100)
# Run inference using NUTS
chain = sample(model, NUTS(), 1000)
# Analyze the posterior
describe(chain)
plot(chain)

The Bayesian Workflow

From model definition to posterior analysis, Turing.jl provides a seamless experience.

Mathematical Specification

\[
\begin{align*}
\sigma^2 &\sim \text{InverseGamma}(3, 4/10) \\
\gamma &\sim \mathcal{N}(0, \sqrt{10}) \\
\beta &\sim \mathcal{N}(0, I) \\
y_i &\sim \log\mathcal{N}(\beta \cdot x_i + \gamma, \sigma)
\end{align*}
\]

linear_regression.jl

using Turing, LinearAlgebra

@model function linear_regression(x)
    # x is a d × n matrix: one column of covariates per observation
    d = size(x, 1)

    # Priors
    σ² ~ InverseGamma(3, 4 / 10)
    γ ~ Normal(0, √10)
    β ~ MvNormal(zeros(d), I)

    # Likelihood: one log-normal observation per column of x
    y ~ MvLogNormal(x' * β .+ γ, σ² * I)
end

model = linear_regression(x_data)   # x_data: d × n matrix of covariates

Prior Checks

# params_gen: a NamedTuple of known parameter values (σ², γ, β) to simulate from
model_gen = fix(model, params_gen)                  # fix the parameters
(; y) = rand(model_gen)                             # simulate data from the fixed model
chain_gen = sample(model | (y = y,), NUTS(), 1000)  # check the fixed values are recovered

Condition

model_conditioned = model | (y = y_data,)

Determinability

[Determinability plot]

Inference

chain = sample(model_conditioned, NUTS(), 1000)

Posterior Analysis

Summary Statistics
parameters    mean      std      mcse
σ²            0.0543    0.0032   0.0000
γ             7.7252    0.3688   0.0066
β[1]          -0.2270   0.1247   0.0022
β[2]          0.0133    0.1229   0.0022

[Posterior density plot]

Posterior Predictions

[Posterior prediction plot]
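
A hedged sketch of how these outputs can be produced from the chain above (`x_new` is a hypothetical matrix of new covariates, not defined on this page):

using StatsPlots   # plotting recipes for MCMC chains

# Summary statistics (the table above) and trace/density plots
summarystats(chain)
plot(chain)

# Posterior predictive draws: run the model on new covariates without
# conditioning on y, then sample y under the posterior draws.
predictions = predict(linear_regression(x_new), chain)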

Start Your Journey

Whether you’re new to Bayesian modeling or an experienced researcher, find the resources you need.

New to Turing?

Begin with the basics. Our step-by-step tutorials will guide you from installation to your first probabilistic models.

Get Started · Beginner's Walkthrough
For Researchers

Dive into advanced models, explore the rich package ecosystem, and learn how to cite Turing.jl in your work.

Explore Ecosystem · Cite Turing.jl
For Developers

Join our community, contribute to the project on GitHub, and connect with fellow developers on Slack.

View on GitHub · Join the Slack Channel
Research Papers on Turing.jl →

Core Packages

The Turing ecosystem is built on a foundation of powerful, composable packages.

DynamicPPL.jl

A domain-specific language and backend for probabilistic programming languages, used by Turing.jl.

JuliaBUGS.jl

A modern implementation of the BUGS probabilistic programming language in Julia.

TuringGLM.jl

Bayesian generalized linear models specified with @formula syntax, returning an instantiated Turing model.
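
A minimal sketch of that interface, assuming a DataFrame `df` whose column names (`y`, `x1`, `x2`) are purely illustrative:

using Turing, TuringGLM, DataFrames

# Hypothetical data: outcome y and predictors x1, x2
df = DataFrame(x1 = randn(100), x2 = randn(100), y = randn(100))

# Build a Turing model from a formula, then sample it as usual
model = turing_model(@formula(y ~ x1 + x2), df)
chain = sample(model, NUTS(), 1000)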

AdvancedHMC.jl

A robust, modular, and efficient implementation of advanced HMC algorithms.

Explore Ecosystem →

News & Updates

Read the latest from the Turing team.

Turing.jl Newsletter 15

The fortnightly newsletter for the Turing.jl probabilistic programming language

The TuringLang team Nov 7, 2025
1 min
Turing.jl Newsletter 14

The fortnightly newsletter for the Turing.jl probabilistic programming language

The TuringLang team Oct 24, 2025
2 min
Turing.jl Newsletter 13

The fortnightly newsletter for the Turing.jl probabilistic programming language

The TuringLang team Oct 10, 2025
1 min
Turing.jl Newsletter 12

The fortnightly newsletter for the Turing.jl probabilistic programming language

The TuringLang team Sep 26, 2025
2 min
GSoC Report for DoodleBUGS: a Browser-Based Graphical Interface for Drawing Probabilistic Graphical Models

Shravan Goswami's GSoC 2025 final report: goals, architecture, progress vs proposal, and how to try it.

Shravan Goswami Sep 1, 2025
10 min
Turing.jl Newsletter 11

The fortnightly newsletter for the Turing.jl probabilistic programming language

The TuringLang team Jul 25, 2025
2 min
View more →

Featured Tutorials

A selection of tutorials to get you started.

Get Started with Turing.jl

Our step-by-step tutorials will guide you from installation to your first probabilistic models.

Basics · Getting Started
Introduction: Coin Flipping

Learn the basic concepts of Bayesian modeling by working through a simple coin-flipping example.

Basics · Modeling
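
A flavour of that example (a sketch, not the tutorial's exact code): infer a coin's bias p from simulated flips.

using Turing

flips = rand(Bernoulli(0.7), 100)   # simulated flips from a biased coin

@model function coinflip(flips)
    p ~ Beta(1, 1)                  # uniform prior on the bias
    for i in eachindex(flips)
        flips[i] ~ Bernoulli(p)     # each flip is an independent draw
    end
end

chain = sample(coinflip(flips), NUTS(), 1000)
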
Core Functionality

This article provides an overview of the core functionality in Turing.jl that is likely to be used across a wide range of models.

Basics · Features
View all tutorials →

Turing.jl is an MIT Licensed Open Source Project

If you use Turing.jl in your research, please consider citing our papers.

Explore
Get Started · Tutorials · FAQ · Libraries · News · Team
Connect
GitHub · Twitter · Slack · Discourse
Supported by leading researchers

Turing.jl is developed by researchers and engineers at the following research institutions.

University of Cambridge · The Alan Turing Institute

Turing was created by Hong Ge and is maintained by a core team of developers and contributors.
© 2025 The Turing Project Contributors. MIT License.

Website Source