Bayesian Statistics is the school of thought that combines prior
beliefs with the likelihood of a hypothesis to arrive at posterior
beliefs. The first edition of Peter Lee's book appeared in
1989, but the subject has moved ever onwards, with increasing
emphasis on Monte Carlo-based techniques.
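To make the opening idea concrete, here is a minimal sketch of the
prior-times-likelihood update for a coin-tossing model with a
conjugate Beta prior; the prior parameters and the data below are
illustrative assumptions, not an example taken from the book.

    # Illustrative beta-binomial update (assumed numbers, Python/NumPy).
    import numpy as np

    a, b = 2.0, 2.0    # Beta(2, 2) prior on P(heads)
    heads, n = 7, 10   # observed data: 7 heads in 10 tosses

    # Numerical route: multiply the prior by the likelihood on a grid,
    # then normalise so the posterior integrates to one.
    theta = np.linspace(0.0, 1.0, 501)
    prior = theta ** (a - 1) * (1 - theta) ** (b - 1)
    likelihood = theta ** heads * (1 - theta) ** (n - heads)
    posterior = prior * likelihood
    posterior /= posterior.sum() * (theta[1] - theta[0])

    # Conjugate route: the posterior is Beta(a + heads, b + n - heads),
    # so the grid above only serves to show the mechanism.
    a_post, b_post = a + heads, b + n - heads
    print(a_post / (a_post + b_post))  # posterior mean, about 0.643

Conjugate families of this kind are the subject of Sections 2.10 and
2.11 in the contents below.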
This new fourth edition looks at recent techniques such as
variational methods, Bayesian importance sampling, approximate
Bayesian computation (ABC) and Reversible Jump Markov Chain Monte
Carlo (RJMCMC), and provides a concise account of how the Bayesian
approach to statistics develops and how it contrasts with the
conventional approach. The theory is built up step by step, and
important notions such as sufficiency are brought out of a
discussion of the salient features of specific examples.
This edition:
* Includes expanded coverage of Gibbs sampling, including more
numerical examples and treatments of OpenBUGS, R2WinBUGS and
R2OpenBUGS (a minimal sampler sketch appears after this list).
* Presents significant new material on recent techniques such as
Bayesian importance sampling, variational Bayes, Approximate
Bayesian Computation (ABC) and Reversible Jump Markov Chain Monte
Carlo (RJMCMC).
* Provides extensive examples throughout the book to complement
the theory presented.
* Is accompanied by a supporting website featuring new material and
solutions.
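As a taste of the Gibbs sampling coverage mentioned in the list above,
here is a minimal sketch of the technique for a normal sample with both
mean and variance unknown, under the reference prior p(mu, sigma^2)
proportional to 1/sigma^2. It is written in plain Python rather than
with the BUGS interfaces the book treats, and the simulated data and
run lengths are illustrative assumptions, not code from the book.

    # Two-block Gibbs sampler for the normal model (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.normal(loc=5.0, scale=2.0, size=40)  # simulated observations
    n, ybar = len(y), y.mean()

    mu, sigma2 = ybar, y.var()                   # starting values
    draws = []
    for _ in range(5000):
        # Full conditional: mu | sigma^2, y ~ N(ybar, sigma^2 / n).
        mu = rng.normal(ybar, np.sqrt(sigma2 / n))
        # Full conditional: sigma^2 | mu, y ~ Inv-Gamma(n / 2, S / 2),
        # with S = sum((y - mu)^2), drawn via a reciprocal Gamma variate.
        S = np.sum((y - mu) ** 2)
        sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / S)
        draws.append((mu, sigma2))

    mu_s, sig2_s = np.array(draws)[1000:].T      # discard burn-in
    print(mu_s.mean(), sig2_s.mean())            # posterior means

Each sweep alternates draws from the two full conditionals; the normal
model with both parameters unknown is treated in Section 2.12 of the
contents below.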
More and more students are realizing that they need to learn
Bayesian statistics to meet their academic and professional goals.
This book is best suited for use as a main text in courses on
Bayesian statistics for third- and fourth-year undergraduates and
postgraduate students.
About the Author
Peter Lee, Department of Mathematics, University of York; formerly Provost of Wentworth College, University of York.
Contents
Preface xix
Preface to the First Edition xxi
1 Preliminaries 1
1.1 Probability and Bayes' Theorem 1
1.1.1 Notation 1
1.1.2 Axioms for probability 2
1.1.3 'Unconditional' probability 5
1.1.4 Odds 6
1.1.5 Independence 7
1.1.6 Some simple consequences of the axioms; Bayes' Theorem 7
1.2 Examples on Bayes' Theorem 9
1.2.1 The Biology of Twins 9
1.2.2 A political example 10
1.2.3 A warning 10
1.3 Random variables 12
1.3.1 Discrete random variables 12
1.3.2 The binomial distribution 13
1.3.3 Continuous random variables 14
1.3.4 The normal distribution 16
1.3.5 Mixed random variables 17
1.4 Several random variables 17
1.4.1 Two discrete random variables 17
1.4.2 Two continuous random variables 18
1.4.3 Bayes' Theorem for random variables 20
1.4.4 Example 21
1.4.5 One discrete variable and one continuous variable 21
1.4.6 Independent random variables 22
1.5 Means and variances 23
1.5.1 Expectations 23
1.5.2 The expectation of a sum and of a product 24
1.5.3 Variance, precision and standard deviation 25
1.5.4 Examples 25
1.5.5 Variance of a sum; covariance and correlation 27
1.5.6 Approximations to the mean and variance of a function of a random variable 28
1.5.7 Conditional expectations and variances 29
1.5.8 Medians and modes 31
1.6 Exercises on Chapter 1 31
2 Bayesian inference for the normal distribution 36
2.1 Nature of Bayesian inference 36
2.1.1 Preliminary remarks 36
2.1.2 Post is prior times likelihood 36
2.1.3 Likelihood can be multiplied by any constant 38
2.1.4 Sequential use of Bayes' Theorem 38
2.1.5 The predictive distribution 39
2.1.6 A warning 39
2.2 Normal prior and likelihood 40
2.2.1 Posterior from a normal prior and likelihood 40
2.2.2 Example 42
2.2.3 Predictive distribution 43
2.2.4 The nature of the assumptions made 44
2.3 Several normal observations with a normal prior 44
2.3.1 Posterior distribution 44
2.3.2 Example 46
2.3.3 Predictive distribution 47
2.3.4 Robustness 47
2.4 Dominant likelihoods 48
2.4.1 Improper priors 48
2.4.2 Approximation of proper priors by improper priors 49
2.5 Locally uniform priors 50
2.5.1 Bayes' postulate 50
2.5.2 Data translated likelihoods 52
2.5.3 Transformation of unknown parameters 52
2.6 Highest density regions 54
2.6.1 Need for summaries of posterior information 54
2.6.2 Relation to classical statistics 55
2.7 Normal variance 55
2.7.1 A suitable prior for the normal variance 55
2.7.2 Reference prior for the normal variance 58
2.8 HDRs for the normal variance 59
2.8.1 What distribution should we be considering? 59
2.8.2 Example 59
2.9 The role of sufficiency 60
2.9.1 Definition of sufficiency 60
2.9.2 Neyman's factorization theorem 61
2.9.3 Sufficiency principle 63
2.9.4 Examples 63
2.9.5 Order statistics and minimal sufficient statistics 65
2.9.6 Examples on minimal sufficiency 66
2.10 Conjugate prior distributions 67
2.10.1 Definition and difficulties 67
2.10.2 Examples 68
2.10.3 Mixtures of conjugate densities 69
2.10.4 Is your prior really conjugate? 71
2.11 The exponential family 71
2.11.1 Definition 71
2.11.2 Examples 72
2.11.3 Conjugate densities 72
2.11.4 Two-parameter exponential family 73
2.12 Normal mean and variance both unknown 73
2.12.1 Formulation of the problem 73
2.12.2 Marginal distribution of the mean 75
2.12.3 Example of the posterior density for the mean 76
2.12.4 Marginal distribution of the variance 77
2.12.5 Example of the posterior density of the variance 77
2.12.6 Conditional density of the mean for given variance 77
2.13 Conjugate joint prior for the normal distribution 78
2.13.1 The form of the conjugate prior 78
2.13.2 Derivation of the posterior 80
2.13.3 Example 81
2.13.4 Concluding remarks 82
2.14 Exercises on Chapter 2 82
…