Markov chain Monte Carlo (MCMC) methods are now an indispensable
tool in scientific computing. This book discusses recent
developments in MCMC methods, with an emphasis on those making use
of past sample information during simulations. The application
examples are drawn from diverse fields such as bioinformatics,
machine learning, the social sciences, combinatorial optimization,
and computational physics.
Key Features:
* Expanded coverage of the stochastic approximation Monte Carlo
and dynamic weighting algorithms, which are essentially immune to
the local-trap problem.
* A detailed discussion of the Monte Carlo Metropolis-Hastings
algorithm, which can be used for sampling from distributions with
intractable normalizing constants.
* Up-to-date accounts of recent developments of the Gibbs
sampler.
* Comprehensive overviews of population-based MCMC algorithms
and MCMC algorithms with adaptive proposals.
* A supporting website featuring the datasets used in the book,
along with code for some of the simulation examples.
This book can be used as a textbook or a reference book for a
one-semester graduate course in statistics, computational biology,
engineering, and computer science. Applied and theoretical
researchers will also find this book beneficial.
About the Authors
Faming Liang, Associate Professor, Department of Statistics, Texas A&M University.
Chuanhai Liu, Professor, Department of Statistics, Purdue University.
Raymond J. Carroll, Distinguished Professor, Department of Statistics, Texas A&M University.
Contents
Preface xiii
Acknowledgments xvii
Publisher's Acknowledgments xix
1 Bayesian Inference and Markov Chain Monte Carlo 1
1.1 Bayes 1
1.1.1 Specification of Bayesian Models 2
1.1.2 The Jeffreys Priors and Beyond 2
1.2 Bayes Output 4
1.2.1 Credible Intervals and Regions 4
1.2.2 Hypothesis Testing: Bayes Factors 5
1.3 Monte Carlo Integration 8
1.3.1 The Problem 8
1.3.2 Monte Carlo Approximation 9
1.3.3 Monte Carlo via Importance Sampling 9
1.4 Random Variable Generation 10
1.4.1 Direct or Transformation Methods 10
1.4.2 Acceptance-Rejection Methods 11
1.4.3 The Ratio-of-Uniforms Method and Beyond 14
1.4.4 Adaptive Rejection Sampling 18
1.4.5 Perfect Sampling 18
1.5 Markov Chain Monte Carlo 18
1.5.1 Markov Chains 18
1.5.2 Convergence Results 20
1.5.3 Convergence Diagnostics 23
Exercises 24
2 The Gibbs Sampler 27
2.1 The Gibbs Sampler 27
2.2 Data Augmentation 30
2.3 Implementation Strategies and Acceleration Methods 33
2.3.1 Blocking and Collapsing 33
2.3.2 Hierarchical Centering and Reparameterization 34
2.3.3 Parameter Expansion for Data Augmentation 35
2.3.4 Alternating Subspace-Spanning Resampling 43
2.4 Applications 45
2.4.1 The Student-t Model 45
2.4.2 Robit Regression or Binary Regression with the Student-t Link 47
2.4.3 Linear Regression with Interval-Censored Responses 50
Exercises 54
Appendix 2A: The EM and PX-EM Algorithms 56
3 The Metropolis-Hastings Algorithm 59
3.1 The Metropolis-Hastings Algorithm 59
3.1.1 Independence Sampler 62
3.1.2 Random Walk Chains 63
3.1.3 Problems with Metropolis-Hastings Simulations 63
3.2 Variants of the Metropolis-Hastings Algorithm 65
3.2.1 The Hit-and-Run Algorithm 65
3.2.2 The Langevin Algorithm 65
3.2.3 The Multiple-Try MH Algorithm 66
3.3 Reversible Jump MCMC Algorithm for Bayesian Model Selection Problems 67
3.3.1 Reversible Jump MCMC Algorithm 67
3.3.2 Change-Point Identification 70
3.4 Metropolis-Within-Gibbs Sampler for ChIP-chip Data Analysis 75
3.4.1 Metropolis-Within-Gibbs Sampler 75
3.4.2 Bayesian Analysis for ChIP-chip Data 76
Exercises 83
4 Auxiliary Variable MCMC Methods 85
4.1 Simulated Annealing 86
4.2 Simulated Tempering 88
4.3 The Slice Sampler 90
4.4 The Swendsen-Wang Algorithm 91
4.5 The Wolff Algorithm 93
4.6 The Møller Algorithm 95
4.7 The Exchange Algorithm 97
4.8 The Double MH Sampler 98
4.8.1 Spatial Autologistic Models 99
4.9 Monte Carlo MH Sampler 103
4.9.1 Monte Carlo MH Algorithm 103
4.9.2 Convergence 107
4.9.3 Spatial Autologistic Models (Revisited) 110
4.9.4 Marginal Inference 111
4.10 Applications 113
4.10.1 Autonormal Models 114
4.10.2 Social Networks 116
Exercises 121
5 Population-Based MCMC Methods 123
5.1 Adaptive Direction Sampling 124
5.2 Conjugate Gradient Monte Carlo 125
5.3 Sample Metropolis-Hastings Algorithm 126
5.4 Parallel Tempering 127
5.5 Evolutionary Monte Carlo 128
5.5.1 Evolutionary Monte Carlo in Binary-Coded Space 129
5.5.2 Evolutionary Monte Carlo in Continuous Space 132
5.5.3 Implementation Issues 133
5.5.4 Two Illustrative Examples 134
5.5.5 Discussion 139
5.6 Sequential Parallel Tempering for Simulation of High Dimensional Systems 140
5.6.1 Build-up Ladder Construction 141
5.6.2 Sequential Parallel Tempering 142
5.6.3 An Illustrative Example: the Witch's Hat Distribution