Markov Chain Monte Carlo (MCMC)
A class of algorithms for sampling from a probability distribution by constructing a Markov chain whose equilibrium distribution is the desired probability distribution. This means that when the chain has been run for a sufficiently long time, each state is visited with a frequency equal to its probability. (Bousseau 2009)
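The idea can be sketched with the Metropolis algorithm, one common MCMC method. The example below is a minimal illustration, not tied to any particular library: a random-walk chain targets a standard normal density (known only up to a normalizing constant), and after a burn-in period the visited states are kept as samples. The function name, step size, and burn-in length are illustrative choices.

```python
import math
import random

def metropolis_normal(n_samples, step=1.0, burn_in=1000, seed=0):
    """Random-walk Metropolis sketch targeting a standard normal.

    The target density is used only up to a constant, exp(-x^2 / 2),
    so the normalizing factor never needs to be computed; the chain's
    equilibrium distribution is N(0, 1).
    """
    rng = random.Random(seed)
    log_target = lambda x: -0.5 * x * x  # unnormalized log density
    x = 0.0
    samples = []
    for i in range(burn_in + n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        if i >= burn_in:  # discard the burn-in, keep the rest
            samples.append(x)
    return samples

samples = metropolis_normal(50000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because the kept states are visited in proportion to the target density, their empirical mean and variance approach 0 and 1, the moments of N(0, 1), as the chain runs longer; successive samples are correlated, so convergence is slower than for independent draws.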