NestedSamplers.jl


A Julian implementation of single- and multi-ellipsoidal nested sampling algorithms using the AbstractMCMC interface.

This package was heavily influenced by nestle, dynesty, and NestedSampling.jl.

Installation

To use the nested samplers, first install this library:

julia> ]add NestedSamplers

Usage

The samplers are built using the AbstractMCMC interface. To use them, we first need to create a NestedModel.

using Random
using AbstractMCMC
AbstractMCMC.setprogress!(false)
Random.seed!(8452);

[ Info: progress logging is disabled globally

using Distributions
using LinearAlgebra
using NestedSamplers
using StatsFuns: logaddexp

# Gaussian mixture model
σ = 0.1
μ1 = ones(2)
μ2 = -ones(2)
inv_σ = diagm(0 => fill(1 / σ^2, 2))

function logl(x)
    dx1 = x .- μ1
    dx2 = x .- μ2
    f1 = -dx1' * (inv_σ * dx1) / 2
    f2 = -dx2' * (inv_σ * dx2) / 2
    return logaddexp(f1, f2)
end
priors = [
    Uniform(-5, 5),
    Uniform(-5, 5)
]
# or equivalently
prior_transform(X) = 10 .* X .- 5
# create the model
# or model = NestedModel(logl, prior_transform)
model = NestedModel(logl, priors);
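As a quick sanity check before sampling, the prior transform should map the unit cube onto the Uniform(-5, 5) box, and the log-likelihood should peak near 0 at either mixture mean. A standalone sketch repeating the definitions above (with logaddexp inlined so it needs no extra packages):

```julia
using LinearAlgebra

# same model definitions as above, repeated so this snippet is self-contained
σ = 0.1
μ1 = ones(2)
μ2 = -ones(2)
inv_σ = diagm(0 => fill(1 / σ^2, 2))

# logaddexp written out inline, equivalent to StatsFuns.logaddexp
logaddexp(a, b) = max(a, b) + log1p(exp(-abs(a - b)))

function logl(x)
    dx1 = x .- μ1
    dx2 = x .- μ2
    f1 = -dx1' * (inv_σ * dx1) / 2
    f2 = -dx2' * (inv_σ * dx2) / 2
    return logaddexp(f1, f2)
end
prior_transform(X) = 10 .* X .- 5

prior_transform([0.0, 1.0])  # unit-cube corners map to [-5.0, 5.0]
logl(μ1)                     # ≈ 0 at either mode, strongly negative elsewhere
```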

Now we set up our sampling using StatsBase.

Important: sample returns the sampler's state in addition to the chain.

using StatsBase: sample, Weights

# create our sampler
# 2 parameters, 1000 active points, multi-ellipsoid bounds. See the docstring for more options
spl = Nested(2, 1000)
# by default, sampling stops when the dlogz criterion is met. Set the keyword args here
# currently Chains and Array are the supported chain_types
chain, state = sample(model, spl; dlogz=0.2, param_names=["x", "y"])
# optionally resample the chain using the weights
chain_res = sample(chain, Weights(vec(chain["weights"])), length(chain));
Chains MCMC chain (9360×3×1 Array{Float64, 3}):

Iterations        = 1:9360
Number of chains  = 1
Samples per chain = 9360
parameters        = y, x
internals         = weights

Summary Statistics
  parameters      mean       std   naive_se      mcse         ess      rhat
      Symbol   Float64   Float64    Float64   Float64     Float64   Float64

           x    0.0946    1.0043     0.0104    0.0088   8657.6848    1.0002
           y    0.0935    1.0010     0.0103    0.0089   8680.0644    1.0002

Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5%
      Symbol   Float64   Float64   Float64   Float64   Float64

           x   -1.1679   -0.9890    0.8681    1.0145    1.1781
           y   -1.1534   -0.9872    0.8573    1.0149    1.1724
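Resampling is just one option: posterior expectations can equally be computed directly from the nested-sampling weights. A minimal sketch, with hypothetical toy values standing in for `vec(chain["x"])` and `vec(chain["weights"])`:

```julia
# hypothetical toy stand-ins for vec(chain["x"]) and vec(chain["weights"]),
# purely for illustration
xs = [-1.0, -0.9, 0.9, 1.0]
ws = [0.25, 0.25, 0.25, 0.25]

# weighted posterior mean, computed without resampling
weighted_mean(x, w) = sum(w .* x) / sum(w)
weighted_mean(xs, ws)  # → 0.0 for this symmetric toy set
```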

Let's take a look at the resampled posteriors:

using StatsPlots
density(chain_res)
# analytical posterior maxima
vline!([-1, 1], c=:black, ls=:dash, subplot=1)
vline!([-1, 1], c=:black, ls=:dash, subplot=2)

Finally, we compare our estimate of the Bayesian (log-)evidence to the analytical value:

analytic_logz = log(4π * σ^2 / 100)
# within 2-sigma
@assert isapprox(analytic_logz, state.logz, atol=2state.logzerr)
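Where does the analytical value come from? Each unnormalized Gaussian mode integrates to 2πσ², and the uniform prior density on the [-5, 5]² box is 1/100, so Z = 4πσ²/100. A brute-force grid-integration check of this (a standalone sketch, not part of the package API):

```julia
# Brute-force evidence Z = ∫ L(θ) π(θ) dθ for the two-Gaussian likelihood,
# with prior density π(θ) = 1/100 on the [-5, 5]² box
σ = 0.1
L(x, y) = exp(-((x - 1)^2 + (y - 1)^2) / (2σ^2)) +
          exp(-((x + 1)^2 + (y + 1)^2) / (2σ^2))

xs = range(-5, 5; length=2001)
h = step(xs)
Z = sum(L(x, y) for x in xs, y in xs) * h^2 / 100
log(Z)  # matches analytic_logz = log(4π * σ^2 / 100)
```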

Contributing

Primary Author: Miles Lucas (@mileslucas)

Contributions are always welcome! Take a look at the issues for open problems. To discuss ideas or plan contributions, open a discussion.