MCMC Sampling with Julia
Toward Building a Probabilistic Programming Environment for a New Era
2024-07-03
Using Julia and the Turing Ecosystem
司馬 博文
A Blog Entry on Bayesian Computation by an Applied Mathematician
For an overview of MCMC packages in Julia, see also the following post:
AdvancedMH.jl
using AdvancedMH
using Distributions
using MCMCChains
using ForwardDiff
using StructArrays
using LinearAlgebra
using LogDensityProblems
using LogDensityProblemsAD
using Plots
# Define the components of a basic model.
struct LogTargetDensityCauchy
loc::Float64
scale::Float64
end
LogDensityProblems.logdensity(p::LogTargetDensityCauchy, θ) = -log(π) - log(p.scale) - log(1 + ((θ[1] - p.loc)/p.scale)^2)
LogDensityProblems.dimension(p::LogTargetDensityCauchy) = 1
LogDensityProblems.capabilities(::Type{LogTargetDensityCauchy}) = LogDensityProblems.LogDensityOrder{0}()
# Use automatic differentiation to compute gradients
model_with_ad = LogDensityProblemsAD.ADgradient(Val(:ForwardDiff), LogTargetDensityCauchy(0.0, 1.0))
# Set up the sampler with a multivariate Gaussian proposal.
σ² = 0.01
spl = MALA(x -> MvNormal((σ² / 2) .* x, σ² * I))
#spl = RWMH(MvNormal(zeros(1), I))
# Sample from the posterior.
chain = sample(model_with_ad, spl, 2000; initial_params=ones(1), chain_type=StructArray, param_names=["θ"])
# plot
θ_vector = chain.θ
plot(θ_vector, title="Plot of \$\\theta\$ values", xlabel="Index", ylabel="θ", legend=false, color="#78C2AD")
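Before trusting the sampler output, the Cauchy log-density formula fed to `LogDensityProblems.logdensity` above can be sanity-checked in plain Julia (a standalone sketch with no packages; the helper name `cauchy_logpdf` is illustrative):

```julia
# Same formula as in LogDensityProblems.logdensity above, written standalone.
cauchy_logpdf(θ, loc, scale) = -log(π) - log(scale) - log(1 + ((θ - loc) / scale)^2)

# At the mode θ = loc, Cauchy(loc, scale) has density 1 / (π * scale):
exp(cauchy_logpdf(0.0, 0.0, 1.0)) ≈ 1 / π

# A crude Riemann sum over a wide grid should give total mass ≈ 1
# (the heavy Cauchy tails account for the small remainder):
xs = -1000.0:0.01:1000.0
mass = sum(exp.(cauchy_logpdf.(xs, 0.0, 1.0))) * 0.01
```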
AdaptiveMCMC.jl
# Taken from https://mvihola.github.io/docs/AdaptiveMCMC.jl/
# Load the package
using AdaptiveMCMC
# Define a function which returns log-density values:
log_p(x) = -.5*sum(x.^2)
# Run 1k iterations of the Adaptive Metropolis:
out = adaptive_rwm(zeros(2), log_p, 1_000; algorithm=:am)
using MCMCChains, StatsPlots # Assuming MCMCChains & StatsPlots are installed...
c = Chains(out.X[1,:], start=out.params.b, thin=out.params.thin)
p = plot(c)
p
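The `algorithm=:am` flag selects the Adaptive Metropolis of Haario et al., which learns the proposal covariance from the sample history as it runs. The core recursion can be sketched in plain Julia as a toy, self-contained illustration (this is not AdaptiveMCMC.jl's implementation; the function name `toy_am` and all numerical details are made up for exposition):

```julia
# Toy Adaptive Metropolis: random-walk Metropolis whose proposal covariance
# tracks the running covariance of the chain (Haario-style adaptation).
using LinearAlgebra, Random

function toy_am(log_p, x0::Vector{Float64}, n::Int; rng=Random.default_rng())
    d = length(x0)
    x = copy(x0)
    μ = copy(x0)                      # running mean of the chain
    Σ = Matrix{Float64}(I, d, d)      # running covariance estimate
    sc = 2.38^2 / d                   # classical AM scaling factor
    X = zeros(d, n)
    lp = log_p(x)
    for t in 1:n
        # propose from N(x, sc * Σ); tiny jitter keeps the Cholesky stable
        C = cholesky(Symmetric(sc .* Σ + 1e-10 * I)).L
        y = x .+ C * randn(rng, d)
        lpy = log_p(y)
        if log(rand(rng)) < lpy - lp   # Metropolis accept/reject
            x = y
            lp = lpy
        end
        # recursive mean/covariance update with decaying step size
        γ = 1 / (t + 1)
        δ = x .- μ
        μ .+= γ .* δ
        Σ .= (1 - γ) .* (Σ .+ γ .* (δ * δ'))
        X[:, t] .= x
    end
    return X
end

# Same standard-normal target as log_p above:
X = toy_am(x -> -0.5 * sum(x .^ 2), zeros(2), 5_000)
```

The decaying step size γ makes the adaptation diminish over time, which is what keeps this family of samplers ergodic despite the proposal depending on the past.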
KissMCMC.jl
using KissMCMC
# the distribution to sample from,
logpdf(x::T) where {T} = x<0 ? -convert(T,Inf) : -x
# initial point of walker
theta0 = 0.5
# Metropolis MCMC sampler:
sample_prop_normal(theta) = 1.5*randn() + theta # samples the proposal (or jump) distribution
thetas, accept_ratio = metropolis(logpdf, sample_prop_normal, theta0, niter=10^5)
println("Accept ratio Metropolis: $accept_ratio")
# emcee MCMC sampler:
thetase, accept_ratioe = emcee(logpdf, make_theta0s(theta0, 0.1, logpdf, 100), niter=10^5)
# combine all walkers into a single chain
thetase, accept_ratioe = squash_walkers(thetase, accept_ratioe)
println("Accept ratio emcee: $accept_ratioe")
using Plots
histogram(thetas, normalize=true, fillalpha=0.4)
histogram!(thetase, normalize=true, fillalpha=0.1)
plot!(0:0.01:5, map(x->exp(logpdf(x)[1]), 0:0.01:5), lw=3)
Here `where {T}` is shorthand for `where {T<:Any}`: it defines a function `logpdf` that accepts an argument of any type (which is risky). `convert(T, Inf)` converts the value `Inf` to the type `T`, so the returned `-Inf` has the same type as the input. This log-density corresponds to the \(\operatorname{Exp}(1)\) distribution.
Accept ratio Metropolis: 0.40872