Markov Chain Monte Carlo sampling of posterior distribution
NOTE: I recommend using my new GWMCMC sampler which can also be downloaded from the file exchange: http://www.mathworks.com/matlabcentral/fileexchange/49820-the-mcmc-hammer--gwmcmc
A Metropolis sampler
[mmc,logP]=mcmc(initialm,loglikelihood,logmodelprior,stepfunction,mccount,skip)
---------
initialm: starting point for the random walk
loglikelihood: function handle to the log-likelihood function: logL(m)
logmodelprior: function handle to the log model prior probability: logPapriori(m)
stepfunction: function handle with no inputs which returns a random
              step in the random walk. (Note: stepfunction can also be a
              matrix describing the size of a normally distributed step;
              see the sketch after this parameter list.)
mccount: how long should the Markov chain be?
skip: Thin the chain by only storing every N'th step [default=10]
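A minimal sketch of the function-handle form of stepfunction, assuming (as described above) that the handle takes no inputs and returns an additive random step the same size as m; the name stepfun and the step sizes are illustrative, and the other variables are those defined in the example below:
stepfun = @() randn(1,2).*[.2 .5];  %independent normal step for each of the two parameters
%used in place of the matrix form, with minit, loglike, logmodelprior
%as defined in the example below:
m = mcmc(minit, loglike, logmodelprior, stepfun, 10000);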
EXAMPLE USAGE: fit a normal distribution to data
-------------------------------------------
data=randn(100,1)*2+3;                             %synthetic data: mean 3, std 2
logmodelprior=@(m)0;                               %use a flat prior.
loglike=@(m)sum(log(normpdf(data,m(1),m(2))));     %log-likelihood of m=[mu sigma]
minit=[0 1];                                       %starting point for the walk
m=mcmc(minit,loglike,logmodelprior,[.2 .5],10000); %normally distributed steps of size [.2 .5]
m(1:100,:)=[];                                     %crop the initial burn-in drift
plotmatrix(m);                                     %inspect the sampled posterior
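Not part of the original example, but a quick sanity check: summarize the retained samples and compare with the values used to generate the data (mean 3, standard deviation 2).
mean(m)   %posterior mean of [mu sigma]; should be close to [3 2]
std(m)    %posterior spread (uncertainty) of each parameter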
--- Aslak Grinsted 2010
Cite As
Aslak Grinsted (2024). Markov Chain Monte Carlo sampling of posterior distribution (https://www.mathworks.com/matlabcentral/fileexchange/47912-markov-chain-monte-carlo-sampling-of-posterior-distribution), MATLAB Central File Exchange. Retrieved .
Platform Compatibility: Windows, macOS, Linux