API

AbstractMCMC defines an interface for sampling Markov chains.

Model

AbstractMCMC.LogDensityModel (Type)
LogDensityModel <: AbstractMCMC.AbstractModel

Wrapper around something that implements the LogDensityProblems.jl interface.

Note that this does not implement the LogDensityProblems.jl interface itself; it is simply useful for indicating to sample and other AbstractMCMC methods that the wrapped object implements the LogDensityProblems.jl interface.

Fields

  • logdensity: The object that implements the LogDensityProblems.jl interface.
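
For example, a minimal sketch of wrapping a target that implements the LogDensityProblems.jl interface (the StandardNormalTarget type is hypothetical, not part of AbstractMCMC):

using AbstractMCMC, LogDensityProblems

# Hypothetical target: a standard normal log density.
struct StandardNormalTarget end

LogDensityProblems.logdensity(::StandardNormalTarget, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::StandardNormalTarget) = 1
function LogDensityProblems.capabilities(::Type{StandardNormalTarget})
    return LogDensityProblems.LogDensityOrder{0}()
end

# Wrap it so sample and other AbstractMCMC methods know it
# supports the LogDensityProblems.jl interface.
model = AbstractMCMC.LogDensityModel(StandardNormalTarget())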

Sampler

AbstractMCMC.AbstractSampler (Type)
AbstractSampler

The AbstractSampler type is intended to be inherited from when implementing a custom sampler. Any persistent state information should be saved in a subtype of AbstractSampler.

When defining a new sampler, you should also overload the function transition_type, which tells the sample function what type of parameter it should expect to receive.


Sampling a single chain

StatsBase.sample (Method)
sample(
    rng::Random.AbstractRNG=Random.default_rng(),
    model::AbstractModel,
    sampler::AbstractSampler,
    N_or_isdone;
    kwargs...,
)

Sample from the model with the Markov chain Monte Carlo sampler and return the samples.

If N_or_isdone is an Integer, exactly N_or_isdone samples are returned.

Otherwise, sampling is performed until a convergence criterion N_or_isdone returns true. The convergence criterion has to be a function with the signature

isdone(rng, model, sampler, samples, state, iteration; kwargs...)

where state and iteration are the current state and iteration of the sampler, respectively. It should return true when sampling should end, and false otherwise.
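
For example, a minimal sketch of a convergence criterion that stops after 1000 iterations (model and sampler are assumed to be defined):

# Stop sampling once 1000 iterations have been performed.
isdone(rng, model, sampler, samples, state, iteration; kwargs...) = iteration >= 1000

chain = sample(model, sampler, isdone)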

Keyword arguments

See https://turinglang.org/AbstractMCMC.jl/dev/api/#Common-keyword-arguments for common keyword arguments.

StatsBase.sample (Method)
sample(
    rng::Random.AbstractRNG=Random.default_rng(),
    logdensity,
    sampler::AbstractSampler,
    N_or_isdone;
    kwargs...,
)

Wrap the logdensity function in a LogDensityModel, and call sample with the resulting model instead of logdensity.

The logdensity function has to support the LogDensityProblems.jl interface.
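
For example, reusing the hypothetical StandardNormalTarget from above, the unwrapped object can be passed directly (a sketch; sampler is assumed to be a suitable AbstractSampler):

# `sample` wraps the target in a `LogDensityModel` internally.
samples = sample(StandardNormalTarget(), sampler, 1000)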


Iterator

AbstractMCMC.steps (Method)
steps(
    rng::Random.AbstractRNG=Random.default_rng(),
    model::AbstractModel,
    sampler::AbstractSampler;
    kwargs...,
)

Create an iterator that returns samples from the model with the Markov chain Monte Carlo sampler.

Examples

julia> struct MyModel <: AbstractMCMC.AbstractModel end

julia> struct MySampler <: AbstractMCMC.AbstractSampler end

julia> function AbstractMCMC.step(rng, ::MyModel, ::MySampler, state=nothing; kwargs...)
           # all samples are zero
           return 0.0, state
       end

julia> iterator = steps(MyModel(), MySampler());

julia> collect(Iterators.take(iterator, 10)) == zeros(10)
true
AbstractMCMC.steps (Method)
steps(
    rng::Random.AbstractRNG=Random.default_rng(),
    logdensity,
    sampler::AbstractSampler;
    kwargs...,
)

Wrap the logdensity function in a LogDensityModel, and call steps with the resulting model instead of logdensity.

The logdensity function has to support the LogDensityProblems.jl interface.


Transducer

AbstractMCMC.Sample (Method)
Sample(
    rng::Random.AbstractRNG=Random.default_rng(),
    model::AbstractModel,
    sampler::AbstractSampler;
    kwargs...,
)

Create a transducer that returns samples from the model with the Markov chain Monte Carlo sampler.

Examples

julia> struct MyModel <: AbstractMCMC.AbstractModel end

julia> struct MySampler <: AbstractMCMC.AbstractSampler end

julia> function AbstractMCMC.step(rng, ::MyModel, ::MySampler, state=nothing; kwargs...)
           # all samples are zero
           return 0.0, state
       end

julia> transducer = Sample(MyModel(), MySampler());

julia> collect(transducer(1:10)) == zeros(10)
true
AbstractMCMC.Sample (Method)
Sample(
    rng::Random.AbstractRNG=Random.default_rng(),
    logdensity,
    sampler::AbstractSampler;
    kwargs...,
)

Wrap the logdensity function in a LogDensityModel, and call Sample with the resulting model instead of logdensity.

The logdensity function has to support the LogDensityProblems.jl interface.


Sampling multiple chains in parallel

StatsBase.sample (Method)
sample(
    rng::Random.AbstractRNG=Random.default_rng(),
    model::AbstractModel,
    sampler::AbstractSampler,
    parallel::AbstractMCMCEnsemble,
    N::Integer,
    nchains::Integer;
    kwargs...,
)

Sample nchains Markov chains from the model with the Markov chain Monte Carlo sampler in parallel using the parallel algorithm, and combine them into a single chain.

Keyword arguments

See https://turinglang.org/AbstractMCMC.jl/dev/api/#Common-keyword-arguments for common keyword arguments.

StatsBase.sample (Method)
sample(
    rng::Random.AbstractRNG=Random.default_rng(),
    logdensity,
    sampler::AbstractSampler,
    parallel::AbstractMCMCEnsemble,
    N::Integer,
    nchains::Integer;
    kwargs...,
)

Wrap the logdensity function in a LogDensityModel, and call sample with the resulting model instead of logdensity.

The logdensity function has to support the LogDensityProblems.jl interface.


Two algorithms are provided for parallel sampling with multiple threads (MCMCThreads) and multiple processes (MCMCDistributed), and one allows the user to sample multiple chains in serial with no parallelization (MCMCSerial).
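
For example, a minimal sketch of the threads-based ensemble (model and sampler are assumed to be defined, and Julia must be started with multiple threads to see a speedup):

# Sample 4 chains of 1000 samples each, distributing chains across threads.
chain = sample(model, sampler, MCMCThreads(), 1000, 4)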

Common keyword arguments

Common keyword arguments for regular and parallel sampling are:

  • progress (default: AbstractMCMC.PROGRESS[] which is true initially): toggles progress logging
  • chain_type (default: Any): determines the type of the returned chain
  • callback (default: nothing): if callback !== nothing, then callback(rng, model, sampler, sample, iteration) is called after every sampling step, where sample is the most recent sample of the Markov chain and iteration is the current iteration
  • num_warmup (default: 0): number of "warm-up" steps to take before the first "regular" step, i.e. number of times to call AbstractMCMC.step_warmup before the first call to AbstractMCMC.step.
  • discard_initial (default: num_warmup): number of initial samples that are discarded. Note that if discard_initial < num_warmup, warm-up samples will also be included in the resulting samples.
  • thinning (default: 1): factor by which to thin samples.
  • initial_state (default: nothing): if initial_state !== nothing, the first call to AbstractMCMC.step is passed initial_state as the state argument.
Info

The common keyword arguments progress, chain_type, and callback are not supported by the iterator AbstractMCMC.steps and the transducer AbstractMCMC.Sample.

There is no "official" way to provide initial parameter values yet. However, multiple packages such as EllipticalSliceSampling.jl and AdvancedMH.jl support an initial_params keyword argument for setting the initial values when sampling a single chain. To ensure that sampling multiple chains "just works" when sampling of a single chain is implemented, we decided to support initial_params in the default implementations of the ensemble methods:

  • initial_params (default: nothing): if initial_params isa AbstractArray, then the ith element of initial_params is used as initial parameters of the ith chain. If one wants to use the same initial parameters x for every chain, one can specify e.g. initial_params = FillArrays.Fill(x, nchains).
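
For example, a sketch combining several of these keyword arguments (model, sampler, and the initial value x0 are assumed to be defined, and FillArrays.jl to be installed):

using FillArrays

# 4 chains of 1000 samples each: take 100 warm-up steps first (discarded
# by default), keep every 2nd sample, and start all chains from x0.
chain = sample(model, sampler, MCMCThreads(), 1000, 4;
               num_warmup=100, thinning=2, initial_params=Fill(x0, 4))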

Progress logging can be enabled and disabled globally with AbstractMCMC.setprogress!(progress).

AbstractMCMC.setprogress! (Function)
setprogress!(progress::Bool; silent::Bool=false)

Enable progress logging globally if progress is true, and disable it otherwise. Optionally disable the informational message if silent is true.


Chains

The chain_type keyword argument allows you to set the type of the returned chain. A common choice is to return chains of type Chains from MCMCChains.jl.
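
For example, a sketch assuming MCMCChains.jl is installed and the sampler supports conversion of its samples (model and sampler are assumed to be defined):

using MCMCChains

# Return the samples as an MCMCChains.Chains object instead of a plain vector.
chain = sample(model, sampler, 1000; chain_type=Chains)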

AbstractMCMC defines the abstract type AbstractChains for Markov chains.

AbstractMCMC.AbstractChains (Type)
AbstractChains

AbstractChains is an abstract type for an object that stores parameter samples generated through an MCMC process.


For chains of this type, AbstractMCMC defines the following two methods.

AbstractMCMC.chainscat (Function)
chainscat(c::AbstractChains...)

Concatenate multiple chains.

By default, the chains are concatenated along the third dimension by calling cat(c...; dims=3).

AbstractMCMC.chainsstack (Function)
chainsstack(c::AbstractVector)

Stack chains in c.

By default, the vector of chains is returned unmodified. If eltype(c) <: AbstractChains, then reduce(chainscat, c) is called.


Interacting with states of samplers

To make it a bit easier to interact with an arbitrary sampler's state, we encourage implementations of AbstractSampler to implement the following methods:

AbstractMCMC.getparams (Function)
getparams(state[; kwargs...])

Retrieve the values of parameters from the sampler's state as a Vector{<:Real}.

AbstractMCMC.setparams!! (Function)
setparams!!(state, params)

Set the values of parameters in the sampler's state from a Vector{<:Real}.

This function should follow the BangBang interface: mutate state in-place if possible and return the mutated state. Otherwise, it should return a new state containing the updated parameters.

Although not enforced, it should hold that setparams!!(state, getparams(state)) == state. In other words, the sampler should implement a consistent transformation between its internal representation and the vector representation of the parameter values.

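As a minimal sketch, a hypothetical immutable state type could implement both methods as follows (MyState is illustrative, not part of AbstractMCMC):

# Hypothetical state that stores the current parameter values.
struct MyState
    params::Vector{Float64}
end

AbstractMCMC.getparams(state::MyState; kwargs...) = state.params

# `MyState` is immutable, so return a new state with the updated parameters.
AbstractMCMC.setparams!!(state::MyState, params) = MyState(params)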

These methods can also be useful for implementing samplers that wrap some inner samplers, e.g. a mixture of samplers.

Example: MixtureSampler

In a MixtureSampler we need two things:

  • components: collection of samplers.
  • weights: collection of weights representing the probability of choosing the corresponding sampler.
struct MixtureSampler{W,C} <: AbstractMCMC.AbstractSampler
    components::C
    weights::W
end

To implement the state, we need to keep track of a couple of things:

  • index: the index of the sampler used in this step.
  • states: the current states of all the components.

We need to keep track of the states of all the components rather than just the state of the sampler we used previously. The reason is that many samplers keep track of more than just the previous realizations of the variables; e.g., AdvancedHMC.jl keeps track of the momentum used, the metric used, etc.

struct MixtureState{S}
    index::Int
    states::S
end

The step for a MixtureSampler is defined by the following generative process

\[\begin{aligned} i &\sim \mathrm{Categorical}(w_1, \dots, w_k) \\ X_t &\sim \mathcal{K}_i(\cdot \mid X_{t - 1}) \end{aligned}\]

where $\mathcal{K}_i$ denotes the i-th kernel/sampler, and $w_i$ denotes the weight/probability of choosing the i-th sampler. AbstractMCMC.getparams and AbstractMCMC.setparams!! come into play in defining/computing $\mathcal{K}_i(\cdot \mid X_{t - 1})$ since $X_{t - 1}$ could come from a different sampler.

If we let state be the current MixtureState, i the current component, and i_prev the previous component we sampled from, then this translates into the following piece of code:

# Update the corresponding state, i.e. `state.states[i]`, using
# the state and transition from the previous iteration.
state_current = AbstractMCMC.setparams!!(
    state.states[i], 
    AbstractMCMC.getparams(state.states[i_prev]),
)

# Take a `step` for this sampler using the updated state.
transition, state_current = AbstractMCMC.step(
    rng, model, sampler_current, state_current;
    kwargs...
)

The full AbstractMCMC.step implementation would then be something like:

function AbstractMCMC.step(rng, model::AbstractMCMC.AbstractModel, sampler::MixtureSampler, state; kwargs...)
    # Sample the component to use in this `step`.
    i = rand(Categorical(sampler.weights))
    sampler_current = sampler.components[i]

    # Update the corresponding state, i.e. `state.states[i]`, using
    # the state and transition from the previous iteration.
    i_prev = state.index
    state_current = AbstractMCMC.setparams!!(  
        state.states[i], 
        AbstractMCMC.getparams(state.states[i_prev]),  
    )

    # Take a `step` for this sampler using the updated state.
    transition, state_current = AbstractMCMC.step(
        rng, model, sampler_current, state_current;
        kwargs...
    )

    # Create the new states.
    # NOTE: Code below will result in `states_new` being a `Vector`.
    # If we wanted to allow usage of alternative containers, e.g. `Tuple`,
    # it would be better to use something like `@set states[i] = state_current`
    # where `@set` is from Setfield.jl.
    states_new = map(1:length(state.states)) do j
        if j == i
            # Replace the i-th state with the new one.
            state_current
        else
            # Otherwise we just carry over the previous ones.
            state.states[j]
        end
    end

    # Create the new `MixtureState`.
    state_new = MixtureState(i, states_new)

    return transition, state_new
end

And for the initial AbstractMCMC.step we have:

function AbstractMCMC.step(rng, model::AbstractMCMC.AbstractModel, sampler::MixtureSampler; kwargs...)
    # Initialize every state.
    transitions_and_states = map(sampler.components) do spl
        AbstractMCMC.step(rng, model, spl; kwargs...)
    end

    # Sample the component to use in this `step`.
    i = rand(Categorical(sampler.weights))
    # Extract the corresponding transition.
    transition = first(transitions_and_states[i])
    # Extract states.
    states = map(last, transitions_and_states)
    # Create new `MixtureState`.
    state = MixtureState(i, states)

    return transition, state
end

Suppose we then wanted to use this with some of the packages that implement AbstractMCMC.jl's interface, e.g. AdvancedMH.jl; then we'd simply have to implement getparams and setparams!! for the states of the component samplers.

To use MixtureSampler with two samplers sampler1 and sampler2 from AdvancedMH.jl as components, we'd simply do

sampler = MixtureSampler([sampler1, sampler2], [0.1, 0.9])
transition, state = AbstractMCMC.step(rng, model, sampler)
while ...
    transition, state = AbstractMCMC.step(rng, model, sampler, state)
end