Metropolis-Hastings Samplers

Using Julia and the Turing Ecosystem

Categories: Process, Sampling, Julia, MCMC

Author: 司馬 博文

Published: 7/03/2024

Abstract
This post summarizes the Metropolis-Hastings and MALA-related packages available in Julia and how to use them.

For an overview of MCMC packages in Julia, see also the following post:

1 AdvancedMH.jl

The versions of Turing and related packages such as AdvancedMH.jl, AdvancedHMC, MCMCChains, and AbstractMCMC sometimes cannot be updated to their latest releases.

Compatibility with Soss and PolyaGammaSamplers is particularly poor: if either of them is present in the environment, the resolvable versions of the Turing ecosystem appear to be severely constrained.

After removing these packages, running

(@v1.10) pkg> add Turing@0.33.1

should resolve the issue.
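To check which packages are being held back in the first place, Pkg can list outdated dependencies. A minimal sketch, assuming the outdated keyword of Pkg.status (available in recent Julia versions):

using Pkg

# List installed packages that are held back from their latest release,
# to identify the culprits restricting the Turing ecosystem.
Pkg.status(outdated=true)

# Equivalent Pkg-REPL command:
# (@v1.10) pkg> status --outdated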

1.1 Example: MALA on a Cauchy target

using AdvancedMH
using Distributions
using MCMCChains
using ForwardDiff
using StructArrays
using LinearAlgebra
using LogDensityProblems
using LogDensityProblemsAD
using Plots  # needed for the trace plot below

# Define the components of a basic model.
struct LogTargetDensityCauchy
    loc::Float64
    scale::Float64
end

LogDensityProblems.logdensity(p::LogTargetDensityCauchy, θ) = -log(π) - log(p.scale) - log(1 + ((θ[1] - p.loc)/p.scale)^2)
LogDensityProblems.dimension(p::LogTargetDensityCauchy) = 1
LogDensityProblems.capabilities(::Type{LogTargetDensityCauchy}) = LogDensityProblems.LogDensityOrder{0}()

# Use automatic differentiation to compute gradients
model_with_ad = LogDensityProblemsAD.ADgradient(Val(:ForwardDiff), LogTargetDensityCauchy(0.0, 1.0))

# Set up the MALA sampler: the proposal is Gaussian, centered at the current
# state plus the Langevin drift (σ²/2) * ∇log π(θ), with covariance σ² * I.
σ² = 0.01
spl = MALA(x -> MvNormal((σ² / 2) .* x, σ² * I))
#spl = RWMH(MvNormal(zeros(1), I))  # gradient-free alternative: random-walk MH (see the sketch below)

# Sample from the posterior.
chain = sample(model_with_ad, spl, 2000; initial_params=ones(1), chain_type=StructArray, param_names=["θ"])

# plot
θ_vector = chain.θ
plot(θ_vector, title="Plot of \$\\theta\$ values", xlabel="Index", ylabel="θ", legend=false, color="#78C2AD")
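The commented-out RWMH line above is a gradient-free alternative. Here is a minimal sketch of random-walk Metropolis-Hastings on the same Cauchy target via AdvancedMH's DensityModel constructor, collected as an MCMCChains.Chains object for summary statistics; the proposal step size 0.5 is an arbitrary choice for illustration.

# Random-walk MH needs no gradient, so a plain log-density function suffices.
density(θ) = -log(π) - log(1 + θ[1]^2)   # standard Cauchy: loc = 0, scale = 1
model_rw = DensityModel(density)

# Gaussian random-walk proposal (step size 0.5 chosen arbitrarily).
spl_rw = RWMH(MvNormal(zeros(1), 0.5^2 * I))

# Sample and summarize via MCMCChains.
chain_rw = sample(model_rw, spl_rw, 2000; initial_params=ones(1), chain_type=Chains, param_names=["θ"])
describe(chain_rw)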

2 Other Packages

2.1 AdaptiveMCMC.jl

using Pkg
Pkg.add("AdaptiveMCMC")
# Taken from https://mvihola.github.io/docs/AdaptiveMCMC.jl/

# Load the package
using AdaptiveMCMC

# Define a function which returns log-density values:
log_p(x) = -.5*sum(x.^2)

# Run 1,000 iterations of the Adaptive Metropolis:
out = adaptive_rwm(zeros(2), log_p, 1_000; algorithm=:am)

using MCMCChains, StatsPlots # Assuming MCMCChains & StatsPlots are installed...
c = Chains(out.X[1,:], start=out.params.b, thin=out.params.thin)
p = plot(c)
p
savefig(p, "AdaptiveMCMC.svg")
"/Users/hirofumi48/162348.github.io/posts/2024/Julia/AdaptiveMCMC.svg"

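The benefit of adaptation is easier to see on a correlated target. Below is a minimal sketch reusing the same adaptive_rwm call on a strongly correlated bivariate Gaussian; the correlation 0.95 and the iteration count are arbitrary illustration choices, and the layout of out.X (one column per iteration) is inferred from the indexing out.X[1,:] above.

using Statistics

# Log-density of a zero-mean bivariate Gaussian with correlation ρ = 0.95.
ρ = 0.95
log_p_corr(x) = -0.5 * (x[1]^2 - 2ρ*x[1]*x[2] + x[2]^2) / (1 - ρ^2)

# Same call as before; the adaptive Metropolis (:am) learns a proposal
# covariance matching the target's correlation structure during the run.
out_corr = adaptive_rwm(zeros(2), log_p_corr, 10_000; algorithm=:am)

# Empirical covariance of the samples; it should approach [1 ρ; ρ 1].
cov(out_corr.X, dims=2)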
2.2 KissMCMC.jl

GitHub

using Pkg
Pkg.add("KissMCMC")
using KissMCMC
# the distribution to sample from,
logpdf(x::T) where {T} = x<0 ? -convert(T,Inf) : -x
# initial point of walker
theta0 = 0.5

# Metropolis MCMC sampler:
sample_prop_normal(theta) = 1.5*randn() + theta # samples the proposal (or jump) distribution
thetas, accept_ratio = metropolis(logpdf, sample_prop_normal, theta0, niter=10^5)
println("Accept ratio Metropolis: $accept_ratio")

# emcee MCMC sampler:
thetase, accept_ratioe = emcee(logpdf, make_theta0s(theta0, 0.1, logpdf, 100), niter=10^5)
# check convergence using integrated autocorrelation
thetase, accept_ratioe = squash_walkers(thetase, accept_ratioe) # puts all walkers into one
println("Accept ratio emcee: $accept_ratioe")

using Plots
histogram(thetas, normalize=true, fillalpha=0.4)
histogram!(thetase, normalize=true, fillalpha=0.1)
plot!(0:0.01:5, map(x->exp(logpdf(x)[1]), 0:0.01:5), lw=3)
1
where {T} is shorthand for where {T<:Any}: it defines a function logpdf that accepts an argument of any type (which is risky). convert(T, Inf) converts the value Inf to the type T. Altogether this gives the log-density of \(\operatorname{Exp}(1)\): \(\log p(x) = -x\) for \(x \ge 0\), and \(-\infty\) otherwise.
Accept ratio Metropolis: 0.40872
Accept ratio emcee: 0.40872
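As a quick sanity check against the \(\operatorname{Exp}(1)\) target, whose mean and variance are both 1, the moments of the two chains can be compared directly; this minimal sketch uses only the arrays produced above.

using Statistics

# Exp(1) has mean 1 and variance 1; both chains should roughly reproduce these.
println("Metropolis: mean = $(mean(thetas)), var = $(var(thetas))")
println("emcee:      mean = $(mean(thetase)), var = $(var(thetase))")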