Scalable Monte Carlo for Bayesian Learning
Hardcover

$64.99 
Overview

A graduate-level introduction to advanced topics in Markov chain Monte Carlo (MCMC), as applied broadly in the Bayesian computational context. The topics covered have emerged over the last decade and include stochastic gradient MCMC, non-reversible MCMC, continuous-time MCMC, and new techniques for convergence assessment. A particular focus is on cutting-edge methods that are scalable with respect to either the amount of data or the data dimension, motivated by emerging high-priority application areas in machine learning and AI. Examples are woven throughout the text to demonstrate how scalable Bayesian learning methods can be implemented. This text could form the basis for a course and is sure to be an invaluable resource for researchers in the field.

Product Details

ISBN-13: 9781009288446
Publisher: Cambridge University Press
Publication date: 06/05/2025
Series: Institute of Mathematical Statistics Monographs
Pages: 247
Product dimensions: 5.98(w) x 9.02(h) x 0.63(d)

About the Author

Paul Fearnhead is Professor of Statistics at Lancaster University, with research interests in Bayesian and computational statistics. He has been awarded Cambridge University's Adams Prize, and the Guy Medals in Bronze and Silver of the Royal Statistical Society. He was elected a Fellow of the International Society for Bayesian Analysis in 2024 and is currently the Editor of Biometrika.

Christopher Nemeth is Professor of Statistics at Lancaster University, working at the interface of Statistics and Machine Learning, with a focus on probabilistic modelling and the development of new computational tools for statistical inference. In 2020, he was awarded a UKRI Turing AI Fellowship to develop new algorithms for probabilistic AI.

Chris J. Oates leads a team working in the areas of computational statistics and probabilistic machine learning at Newcastle University. He was awarded a Leverhulme Prize for Mathematics and Statistics in 2023, and the Guy Medal in Bronze of the Royal Statistical Society in 2024.

Chris Sherlock is Professor of Statistics at Lancaster University. After working in data assimilation, numerical modelling, and software engineering, he was caught up in the excitement of computationally intensive Bayesian statistics, obtaining a PhD in the topic; he now leads a group of like-minded researchers.

Table of Contents

Preface; 1. Background; 2. Reversible MCMC and its Scaling; 3. Stochastic Gradient MCMC Algorithms; 4. Non-Reversible MCMC; 5. Continuous-Time MCMC; 6. Assessing and Improving MCMC; References; Index.