Modifying the Log Probability

Turing accumulates log probabilities in an internal data structure that is accessible through the variable __varinfo__ inside the model definition. To avoid users having to deal with internal data structures, Turing provides the Turing.@addlogprob! macro, which adds a term to the accumulated log probability. For instance, this allows you to include arbitrary terms in the likelihood:

using Turing

myloglikelihood(x, μ) = loglikelihood(Normal(μ, 1), x)

@model function demo(x)
    μ ~ Normal()
    Turing.@addlogprob! myloglikelihood(x, μ)
end
demo (generic function with 2 methods)
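As a usage sketch (the data vector below is made up for illustration), the model can be conditioned on data and sampled with any of Turing's samplers:

x = randn(10)  # hypothetical observations
chain = sample(demo(x), NUTS(), 1000)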

The same mechanism can be used to force a sampler to reject a sample:

using Turing
using LinearAlgebra

@model function demo(x)
    m ~ MvNormal(zero(x), I)
    if dot(m, x) < 0
        Turing.@addlogprob! -Inf
        # Exit the model evaluation early
        return nothing
    end

    x ~ MvNormal(m, I)
    return nothing
end
demo (generic function with 2 methods)
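To see why this forces rejection, note that the accumulated log probability becomes -Inf whenever dot(m, x) < 0, so any such proposal has zero acceptance probability. A minimal sketch of checking this directly, assuming DynamicPPL's logjoint method that accepts a NamedTuple of parameter values:

using DynamicPPL

x = [1.0, 2.0]                           # hypothetical observations
logjoint(demo(x), (m = [1.0, 1.0],))     # finite: dot(m, x) > 0
logjoint(demo(x), (m = [-1.0, -1.0],))   # -Inf: such a proposal is rejected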

Note that Turing.@addlogprob! always modifies the accumulated log probability, regardless of the provided sampling context. For instance, if you want Turing.@addlogprob! to apply only when computing the log likelihood and the log joint probability, but not when evaluating the prior of your model, then you should check the type of the internal variable __context__, as in the following example:

if DynamicPPL.leafcontext(__context__) !== Turing.PriorContext()
    Turing.@addlogprob! myloglikelihood(x, μ)
end
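For completeness, here is a sketch of that check embedded in the earlier model definition, so the extra likelihood term is skipped when only the prior is evaluated:

@model function demo(x)
    μ ~ Normal()
    # Skip the extra likelihood term when only the prior is being evaluated.
    if DynamicPPL.leafcontext(__context__) !== Turing.PriorContext()
        Turing.@addlogprob! myloglikelihood(x, μ)
    end
end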