BOOK REVIEWS 431

…chapter for readers who do not have strong backgrounds in stochastic processes. A less advanced audience could skip this chapter on a first reading. Chapters 5 (“Monte Carlo Optimization”), 6 (“The Metropolis-Hastings Algorithm”), and 7 (“The Gibbs Sampler”) are the central parts of MCMC methods, which many students, practitioners, and researchers are attempting to learn about. Unlike most simulation books, the authors introduce the Metropolis-Hastings algorithm first and then treat the Gibbs sampler as a special case. I think this order is more “logical” than the other way round. The authors did a fine job of separating Chapters 8 (“Diagnosing Convergence”) and 9 […]

[…]tion and covariance matrices. This chapter will be of particular interest to practitioners because it focuses on clear algorithms and efficient methods, with more complicated algorithms and specialized methods given as references only. Chapter 4 discusses the generation of random samples, both with and without replacement and with fixed and random sample sizes, and the generation of permutations of datasets. It provides an excellent discussion of why these methods strain the limits of random number generators with finite periods (see p. 124). Finally, it discusses some special topics in the […]
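The ordering the reviewer praises, Metropolis-Hastings first with the Gibbs sampler as a special case, can be illustrated with a minimal sketch (illustrative code, not taken from the book; the target, proposal, and function names are our own):

```python
import math
import random

def metropolis_hastings(log_target, proposal, log_q, x0, n_iter=5000):
    """Generic Metropolis-Hastings sampler.

    log_target : log density of the target, up to an additive constant
    proposal   : function x -> candidate y
    log_q      : log proposal density log q(y | x), called as log_q(y, x)

    Note: when the proposal draws each coordinate from its full
    conditional (the Gibbs sampler), the acceptance ratio is
    identically 1, so Gibbs is the special case in which every
    candidate is accepted.
    """
    x = x0
    samples = []
    for _ in range(n_iter):
        y = proposal(x)
        # log of pi(y) q(x | y) / (pi(x) q(y | x))
        log_alpha = (log_target(y) + log_q(x, y)) - (log_target(x) + log_q(y, x))
        if math.log(random.random()) < log_alpha:
            x = y  # accept the candidate
        samples.append(x)
    return samples

# Toy example: sample a standard normal via a random-walk proposal.
random.seed(1)
log_norm = lambda x: -0.5 * x * x                # N(0, 1) up to a constant
rw = lambda x: x + random.gauss(0.0, 1.0)        # symmetric random walk
log_q = lambda y, x: -0.5 * (y - x) ** 2         # symmetric: cancels in the ratio
draws = metropolis_hastings(log_norm, rw, log_q, x0=0.0, n_iter=20000)
mean = sum(draws) / len(draws)
```

Because the random-walk proposal is symmetric, `log_q` cancels in the acceptance ratio; it is kept here only to show the general form of the algorithm.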
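The point about permutations straining finite-period generators (p. 124) can be made concrete with a back-of-the-envelope sketch (ours, not the book's treatment): a generator holding k bits of state can begin a shuffle in at most 2**k distinct internal states, so it can realize at most 2**k of the n! possible permutations of an n-element dataset.

```python
import math

def max_shuffleable(k_bits):
    """Largest n with log2(n!) <= k_bits, i.e. the largest dataset for
    which a k-bit-state generator could, even in principle, produce
    every permutation."""
    n, log2_fact = 1, 0.0   # invariant: log2_fact == log2(n!)
    while log2_fact + math.log2(n + 1) <= k_bits:
        n += 1
        log2_fact += math.log2(n)
    return n

# A 64-bit-state generator: 20! < 2**64 < 21!, so datasets of more than
# 20 elements already have unreachable permutations.
n64 = max_shuffleable(64)
```

Even the very long period of generators such as the Mersenne Twister (2**19937 - 1) is exhausted by the factorial growth of n! once n reaches a few thousand, which is why the chapter's discussion of these limits matters in practice.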
Technometrics – Taylor & Francis
Published: Nov 1, 2000