Bayesian Statistics: An Introduction [NOOK Book]




NOOK Book (eBook): $35.49 (save 42% off the $61.95 list price)
Note: This NOOK Book can be purchased in bulk. Please email us for more information.


Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques.

This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well as how it contrasts with the conventional approach. The theory is built up step by step, and important notions such as sufficiency are brought out of a discussion of the salient features of specific examples.
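As a sketch of the prior-to-posterior updating described above (a standard conjugate example of the kind the book treats, not code taken from the text): with a Beta(a, b) prior on a success probability and k successes observed in n binomial trials, the posterior is Beta(a + k, b + n - k).

```python
# Illustrative prior-to-posterior update: Beta prior, binomial likelihood.
# The Beta(a, b) prior is conjugate to the binomial likelihood, so the
# posterior after observing k successes in n trials is Beta(a + k, b + n - k).

def beta_binomial_update(a, b, k, n):
    """Return the posterior Beta parameters after k successes in n trials."""
    return a + k, b + (n - k)

# Uniform prior Beta(1, 1); observe 7 successes in 10 trials.
a_post, b_post = beta_binomial_update(1, 1, 7, 10)
posterior_mean = a_post / (a_post + b_post)  # 8 / 12
print(a_post, b_post, round(posterior_mean, 3))  # 8 4 0.667
```

The posterior mean, 2/3, sits between the prior mean (1/2) and the observed frequency (7/10), illustrating how prior belief and likelihood combine.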

This edition:

  • Includes expanded coverage of Gibbs sampling, including more numerical examples and treatments of OpenBUGS, R2WinBUGS and R2OpenBUGS.
  • Presents significant new material on recent techniques such as Bayesian importance sampling, variational Bayes, Approximate Bayesian Computation (ABC) and Reversible Jump Markov Chain Monte Carlo (RJMCMC).
  • Provides extensive examples throughout the book to complement the theory presented.
  • Accompanied by a supporting website featuring new material and solutions.
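The Gibbs sampler covered in the expanded Chapter 9 draws from a joint distribution by sampling each variable in turn from its full conditional. A minimal toy sketch (my own illustration, not code from the book or the BUGS packages) for a standard bivariate normal with correlation rho, where x | y ~ N(rho*y, 1 - rho^2) and symmetrically for y | x:

```python
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is itself normal:
        x | y ~ N(rho * y, 1 - rho^2),  y | x ~ N(rho * x, 1 - rho^2),
    so the sampler alternates draws from these two conditionals.
    """
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5
    x, y = 0.0, 0.0
    samples = []
    for i in range(n_samples + burn_in):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        if i >= burn_in:  # discard burn-in draws
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(0.8, 5000)
mean_x = sum(x for x, _ in samples) / len(samples)
print(round(mean_x, 2))  # sample mean of x; close to the true mean of 0
```

Successive draws are autocorrelated, which is why the burn-in is discarded and why effective sample sizes are smaller than the raw draw count.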

More and more students are realizing that they need to learn Bayesian statistics to meet their academic and professional goals. This book is best suited for use as a main text in courses on Bayesian statistics for third and fourth year undergraduates and postgraduate students.


Editorial Reviews

From the Publisher
“As a lifelong non-statistician and sporadic ‘user’ of statistics, I have not come across another advanced statistics book (as I would characterize this one) that offers so much to the non-expert and, I’ll bet, to the expert as well. The book has my highest recommendation.” (Computing Reviews, 7 January 2013)

Product Details

  • ISBN-13: 9781118359778
  • Publisher: Wiley
  • Publication date: 6/25/2012
  • Sold by: Barnes & Noble
  • Format: eBook
  • Edition number: 4
  • Pages: 488
  • File size: 13 MB
  • Note: This product may take a few minutes to download.

Meet the Author

Peter M. Lee is Provost of Wentworth College, University of York, UK.

Table of Contents

Preface xix
Preface to the First Edition xxi

1 Preliminaries 1
  1.1 Probability and Bayes’ Theorem 1
  1.2 Examples on Bayes’ Theorem 9
  1.3 Random variables 12
  1.4 Several random variables 17
  1.5 Means and variances 23
  1.6 Exercises on Chapter 1 31

2 Bayesian inference for the normal distribution 36
  2.1 Nature of Bayesian inference 36
  2.2 Normal prior and likelihood 40
  2.3 Several normal observations with a normal prior 44
  2.4 Dominant likelihoods 48
  2.5 Locally uniform priors 50
  2.6 Highest density regions 54
  2.7 Normal variance 55
  2.8 HDRs for the normal variance 59
  2.9 The role of sufficiency 60
  2.10 Conjugate prior distributions 67
  2.11 The exponential family 71
  2.12 Normal mean and variance both unknown 73
  2.13 Conjugate joint prior for the normal distribution 78
  2.14 Exercises on Chapter 2 82

3 Some other common distributions 85
  3.1 The binomial distribution 85
  3.2 Reference prior for the binomial likelihood 92
  3.3 Jeffreys’ rule 96
  3.4 The Poisson distribution 102
  3.5 The uniform distribution 106
  3.6 Reference prior for the uniform distribution 110
    3.6.1 Lower limit of the interval fixed 110
  3.7 The tramcar problem 113
  3.8 The first digit problem; invariant priors 114
  3.9 The circular normal distribution 117
  3.10 Approximations based on the likelihood 122
  3.11 Reference posterior distributions 128
  3.12 Exercises on Chapter 3 134

4 Hypothesis testing 138
  4.1 Hypothesis testing 138
  4.2 One-sided hypothesis tests 143
  4.3 Lindley’s method 145
  4.4 Point (or sharp) null hypotheses with prior information 146
  4.5 Point null hypotheses for the normal distribution 150
  4.6 The Doogian philosophy 157
  4.7 Exercises on Chapter 4 158

5 Two-sample problems 162
  5.1 Two-sample problems – both variances unknown 162
  5.2 Variances unknown but equal 165
  5.3 Variances unknown and unequal (Behrens–Fisher problem) 168
  5.4 The Behrens–Fisher controversy 171
  5.5 Inferences concerning a variance ratio 173
  5.6 Comparison of two proportions; the 2 × 2 table 176
  5.7 Exercises on Chapter 5 179

6 Correlation, regression and the analysis of variance 182
  6.1 Theory of the correlation coefficient 182
  6.2 Examples on the use of the correlation coefficient 189
  6.3 Regression and the bivariate normal model 190
  6.4 Conjugate prior for the bivariate regression model 197
  6.5 Comparison of several means – the one way model 200
  6.6 The two way layout 209
  6.7 The general linear model 212
  6.8 Exercises on Chapter 6 217

7 Other topics 221
  7.1 The likelihood principle 221
  7.2 The stopping rule principle 226
  7.3 Informative stopping rules 229
  7.4 The likelihood principle and reference priors 232
  7.5 Bayesian decision theory 234
  7.6 Bayes linear methods 240
  7.7 Decision theory and hypothesis testing 243
  7.8 Empirical Bayes methods 245
  7.9 Exercises on Chapter 7 247

8 Hierarchical models 253
  8.1 The idea of a hierarchical model 253
  8.2 The hierarchical normal model 258
  8.3 The baseball example 262
  8.4 The Stein estimator 264
  8.5 Bayesian analysis for an unknown overall mean 268
  8.6 The general linear model revisited 272
  8.7 Exercises on Chapter 8 277

9 The Gibbs sampler and other numerical methods 281
  9.1 Introduction to numerical methods 281
  9.2 The EM algorithm 283
  9.3 Data augmentation by Monte Carlo 291
  9.4 The Gibbs sampler 294
  9.5 Rejection sampling 311
  9.6 The Metropolis–Hastings algorithm 317
  9.7 Introduction to WinBUGS and OpenBUGS 323
  9.8 Generalized linear models 332
  9.9 Exercises on Chapter 9 335

10 Some approximate methods 340
  10.1 Bayesian importance sampling 340
  10.2 Variational Bayesian methods: simple case 345
  10.3 Variational Bayesian methods: general case 353
  10.4 ABC: Approximate Bayesian Computation 356
  10.5 Reversible jump Markov chain Monte Carlo 367
  10.6 Exercises on Chapter 10 369

Appendix A Common statistical distributions 373
  A.1 Normal distribution 374
  A.2 Chi-squared distribution 375
  A.3 Normal approximation to chi-squared 376
  A.4 Gamma distribution 376
  A.5 Inverse chi-squared distribution 377
  A.6 Inverse chi distribution 378
  A.7 Log chi-squared distribution 379
  A.8 Student’s t distribution 380
  A.9 Normal/chi-squared distribution 381
  A.10 Beta distribution 382
  A.11 Binomial distribution 383
  A.12 Poisson distribution 384
  A.13 Negative binomial distribution 385
  A.14 Hypergeometric distribution 386
  A.15 Uniform distribution 387
  A.16 Pareto distribution 388
  A.17 Circular normal distribution 389
  A.18 Behrens’ distribution 391
  A.19 Snedecor’s F distribution 393
  A.20 Fisher’s z distribution 393
  A.21 Cauchy distribution 394
  A.22 The probability that one beta variable is greater than another 395
  A.23 Bivariate normal distribution 395
  A.24 Multivariate normal distribution 396
  A.25 Distribution of the correlation coefficient 397

Appendix B Tables 399
  B.1 Percentage points of the Behrens–Fisher distribution 399
  B.2 Highest density regions for the chi-squared distribution 402
  B.3 HDRs for the inverse chi-squared distribution 404
  B.4 Chi-squared corresponding to HDRs for log chi-squared 406
  B.5 Values of F corresponding to HDRs for log F 408

Appendix C R programs 430

Appendix D Further reading 436
  D.1 Robustness 436
  D.2 Nonparametric methods 436
  D.3 Multivariate estimation 436
  D.4 Time series and forecasting 437
  D.5 Sequential methods 437
  D.6 Numerical methods 437
  D.7 Bayesian networks 437
  D.8 General reading 438

References 439

Index 455

