13.4 Further extensions to MCMC

For the examples in this section we limited the number of iterations to keep the computations feasible. However, we can extend the MCMC approach in a few different ways:

  • One approach is to separate the data into two different sets: one for optimization and one for validation. In this approach the “optimization data” consist of a certain percentage of the original dataset, leaving the remainder to validate the forward forecasts. This is a type of cross-validation approach, and it is generally preferred because it demonstrates the strength of your model against data that were not used in the optimization.
  • We can also run multiple “chains” of optimization, each starting from a different value in parameter space. After running each of these chains, we select the one with the best log-likelihood value and run another MCMC iteration starting from that value. The idea is that we have sampled the parameter space broadly and are hopefully starting near an optimum value.
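The data-splitting idea in the first bullet can be sketched in a few lines. This is a minimal illustration using a hypothetical exponential-decay dataset (the variable names, the 70/30 split, and the simulated data are all assumptions for the example, not part of the text):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical dataset: 100 noisy measurements of exponential decay
t = np.linspace(0, 10, 100)
y = 2.0 * np.exp(-0.3 * t) + rng.normal(0, 0.05, size=t.size)

# The first 70% of observations become the "optimization data" used in
# the MCMC fit; the remaining 30% are held out to validate the forward
# forecasts, following the cross-validation idea described above.
split = int(0.7 * t.size)
t_opt, y_opt = t[:split], y[:split]
t_val, y_val = t[split:], y[split:]
```

Because the held-out points come after the fitted ones in time, comparing forecasts against `t_val, y_val` tests how well the model extrapolates, not just how well it interpolates.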
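The multiple-chain strategy in the second bullet can be sketched with a toy Metropolis sampler. Everything here (the one-parameter log-likelihood, the starting values, the step size, and the helper `metropolis_chain`) is a hypothetical illustration of the two-stage idea, not code from this chapter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy log-likelihood for a single parameter, peaked near theta = 1.5
def log_likelihood(theta):
    return -0.5 * ((theta - 1.5) / 0.2) ** 2

def metropolis_chain(theta0, n_iter=500, step=0.1):
    """Run one Metropolis chain; return the samples and their log-likelihoods."""
    samples = np.empty(n_iter)
    ll = np.empty(n_iter)
    theta, cur_ll = theta0, log_likelihood(theta0)
    for i in range(n_iter):
        prop = theta + step * rng.normal()          # propose a nearby value
        prop_ll = log_likelihood(prop)
        if np.log(rng.uniform()) < prop_ll - cur_ll:  # Metropolis accept/reject
            theta, cur_ll = prop, prop_ll
        samples[i], ll[i] = theta, cur_ll
    return samples, ll

# Stage 1: several short chains, each from a different point in parameter space
starts = [-2.0, 0.0, 3.0, 5.0]
results = [metropolis_chain(s, n_iter=200) for s in starts]

# Stage 2: restart one longer chain from the best value found so far,
# so the final run hopefully begins near an optimum
best_samples, best_ll = max(results, key=lambda r: r[1].max())
best_theta = best_samples[np.argmax(best_ll)]
final_samples, _ = metropolis_chain(best_theta, n_iter=1000)
```

The first stage is cheap exploration; only the second, longer chain needs to run long enough to characterize the distribution around the optimum.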

As you can see, the MCMC algorithm is an extremely powerful technique for parameter estimation. While MCMC may take additional time and programming skill to apply, it is definitely worth it! To learn more about the history of the MCMC algorithm and its other applications, see Richey (2010).

References

Richey, Matthew. 2010. “The Evolution of Markov Chain Monte Carlo Methods.” The American Mathematical Monthly 117 (5): 383–413.