Small Markov chain Monte Carlo implementation
Problem
I've written a small Markov chain Monte Carlo function that takes samples from a posterior distribution, based on a prior and a binomial (Bin(N, Z)) distribution. I'd be happy to have it reviewed, especially regarding how to properly pass functions as arguments to functions (as with the function prior_dist() in my code). In this case, I'm passing the function uniform_prior_distribution() shown below, but it's quite likely I'll want to pass other functions, which accept slightly different arguments, in the future. This would require me to rewrite mcmc(), unless there's some smart way around it...

def mcmc(prior_dist, size=100000, burn=1000, thin=10, Z=3, N=10):
    import random
    from scipy.stats import binom
    #Make Markov chain (Monte Carlo)
    mc = [0] #Initialize Markov chain
    while len(mc) < thin*size + burn:
        cand = random.gauss(mc[-1], 1) #Propose candidate
        ratio = (binom.pmf(Z, N, cand)*prior_dist(cand, size)) / (binom.pmf(Z, N, mc[-1])*prior_dist(mc[-1], size))
        if ratio > random.random(): #Acceptance criterion
            mc.append(cand)
        else:
            mc.append(mc[-1])
    #Take sample
    sample = []
    for i in range(len(mc)):
        if i >= burn and (i-burn)%thin == 0:
            sample.append(mc[i])
    sample = sorted(sample)
    #Estimate posterior probability
    post = []
    for p in sample:
        post.append(binom.pmf(Z, N, p) * prior_dist(p, size))
    return sample, post, mc

def uniform_prior_distribution(p, size):
    prior = 1.0/size
    return prior
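For reference, the acceptance step in this code implements the standard Metropolis rule for a symmetric (Gaussian) proposal: a candidate $p'$ is accepted with probability

$$\min\left(1,\ \frac{\mathrm{Bin}(Z \mid N, p')\,\pi(p')}{\mathrm{Bin}(Z \mid N, p)\,\pi(p)}\right)$$

where $p$ is the current state and $\pi$ is the prior; comparing the ratio against random.random() achieves exactly this.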
Solution

def mcmc(prior_dist, size=100000, burn=1000, thin=10, Z=3, N=10):
    import random
    from scipy.stats import binom

Don't import inside functions. It goes against Python convention and is inefficient.
    #Make Markov chain (Monte Carlo)
    mc = [0] #Initialize Markov chain
    while len(mc) < thin*size + burn:
        cand = random.gauss(mc[-1], 1) #Propose candidate
        ratio = (binom.pmf(Z, N, cand)*prior_dist(cand, size)) / (binom.pmf(Z, N, mc[-1])*prior_dist(mc[-1], size))
        if ratio > random.random(): #Acceptance criterion
            mc.append(cand)
        else:
            mc.append(mc[-1])

Why is this in an else clause? It only makes sense to use else clauses if you are using break.
    #Take sample
    sample = []
    for i in range(len(mc)):
        if i >= burn and (i-burn)%thin == 0:
            sample.append(mc[i])

Actually, something like

    sample = mc[burn::thin]

will have the same effect as this loop.

    sample = sorted(sample)
    #Estimate posterior probability
    post = []
    for p in sample:
        post.append(binom.pmf(Z, N, p) * prior_dist(p, size))

I'd suggest

    post = [binom.pmf(Z, N, p) * prior_dist(p, size) for p in sample]

    return sample, post, mc
def uniform_prior_distribution(p, size):
    prior = 1.0/size
    return prior
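The review above doesn't touch the asker's main question about passing priors with different argument lists. One common way around rewriting mcmc() (a sketch of one possible approach, not part of the original answer) is to have mcmc() call the prior with a single argument, prior_dist(p), and let the caller bind any extra parameters up front, e.g. with functools.partial:

from functools import partial

#Hypothetical prior with extra shape parameters a and b (made up for illustration)
def beta_like_prior(p, a, b):
    if 0.0 <= p <= 1.0:
        return p**(a - 1) * (1 - p)**(b - 1)
    return 0.0

#Bind the extra arguments once; the result is callable as prior(p)
prior = partial(beta_like_prior, a=2, b=5)

print(prior(0.3)) #Same one-argument signature as a uniform prior would have

With that convention, uniform_prior_distribution would take only p (with size baked in via partial or a default argument), and mcmc() never needs to change when a new prior is introduced.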
Code Snippets

def mcmc(prior_dist, size=100000, burn=1000, thin=10, Z=3, N=10):
    import random
    from scipy.stats import binom
    #Make Markov chain (Monte Carlo)
    mc = [0] #Initialize Markov chain
    while len(mc) < thin*size + burn:
        cand = random.gauss(mc[-1], 1) #Propose candidate
        ratio = (binom.pmf(Z, N, cand)*prior_dist(cand, size)) / (binom.pmf(Z, N, mc[-1])*prior_dist(mc[-1], size))
        if ratio > random.random(): #Acceptance criterion
            mc.append(cand)
        else:
            mc.append(mc[-1])
    #Take sample
    sample = []
    for i in range(len(mc)):
        if i >= burn and (i-burn)%thin == 0:
            sample.append(mc[i])
    sample = sorted(sample)
    #Estimate posterior probability
    post = []
    for p in sample:
        post.append(binom.pmf(Z, N, p) * prior_dist(p, size))
    return sample, post, mc

def uniform_prior_distribution(p, size):
    prior = 1.0/size
    return prior
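A minimal usage sketch, assuming the definitions above are in scope and scipy is installed (size is kept small here so it runs quickly; note the chain starts at 0, where binom.pmf(Z, N, 0) is zero, so the very first ratio divides by zero, which with scipy's numpy floats yields inf rather than an exception, and the first candidate inside (0, 1) is accepted):

sample, post, mc = mcmc(uniform_prior_distribution, size=1000, burn=100, thin=5)
print(len(sample))            #Number of retained draws
print(sample[len(sample)//2]) #Rough posterior median of p (sample is sorted)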
Context

StackExchange Code Review Q#10851, answer score: 2