NONLINEAR MIXTURE MODELS: A Bayesian Approach

eBook

$32.49 (list price $43.00; save 24%)

Overview

This book, written by two mathematicians from the University of Southern California, provides a broad introduction to the important subject of nonlinear mixture models from a Bayesian perspective. It contains background material, a brief description of Markov chain theory, and novel algorithms together with their applications. It is self-contained and unified in presentation, which makes it ideal for use as an advanced textbook by graduate students and as a reference for independent researchers. The explanations are detailed enough to capture the interest of the curious reader, and complete enough to provide the background needed to go further into the subject and explore the research literature.

In this book the authors present Bayesian methods of analysis for nonlinear, hierarchical mixture models with a finite, but possibly unknown, number of components. These methods are then applied to various problems, including population pharmacokinetics and gene expression analysis. In population pharmacokinetics, the nonlinear mixture model, based on previous clinical data, becomes the prior distribution for individual therapy. For gene expression data, one application covered in the book is determining which genes should be associated with the same component of the mixture (also known as a clustering problem). The book also contains examples of computer programs written in BUGS. This is the first book of its kind to cover many of the topics in this field.
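To give a flavor of the Gibbs-sampling machinery the book develops (Chapter 2.3), here is a minimal, hypothetical Python sketch of a Gibbs sampler for a two-component normal mixture with known common variance and fixed equal weights. It illustrates the general technique only; all names and simplifying assumptions (flat priors on the means, fixed 50/50 mixing proportions) are this sketch's, not the book's.

```python
import math
import random

def gibbs_two_normal_mixture(data, iters=500, sigma=1.0, seed=0):
    """Toy Gibbs sampler for a two-component normal mixture.

    Assumes a known common standard deviation `sigma`, flat priors
    on the two component means, and fixed 50/50 mixing weights.
    Returns a single posterior draw of the (sorted) component means.
    """
    rng = random.Random(seed)
    mu = [min(data), max(data)]  # crude but well-separated initialization
    for _ in range(iters):
        # Step 1: sample the allocation of each point given current means.
        groups = ([], [])
        for x in data:
            w0 = math.exp(-0.5 * ((x - mu[0]) / sigma) ** 2)
            w1 = math.exp(-0.5 * ((x - mu[1]) / sigma) ** 2)
            p0 = w0 / (w0 + w1)
            groups[0 if rng.random() < p0 else 1].append(x)
        # Step 2: sample each mean from its normal full conditional.
        for k in range(2):
            n = len(groups[k])
            if n == 0:
                continue  # empty component: keep the previous mean
            xbar = sum(groups[k]) / n
            mu[k] = rng.gauss(xbar, sigma / math.sqrt(n))
    return sorted(mu)

# Simulated data from two well-separated components.
rng = random.Random(1)
data = ([rng.gauss(0.0, 1.0) for _ in range(100)]
        + [rng.gauss(8.0, 1.0) for _ in range(100)])
mu_est = gibbs_two_normal_mixture(data)
```

With well-separated components the sampled means land near the true values 0 and 8; the label-switching and trapping issues that arise in harder cases are the subject of Chapter 3.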

Product Details

ISBN-13: 9781783266272
Publisher: Imperial College Press
Publication date: 12/30/2014
Sold by: Barnes & Noble
Format: eBook
Pages: 296
File size: 10 MB

Table of Contents

List of Tables xiii

List of Figures xvii

Acknowledgments xxv

1 Introduction 1

1.1 Bayesian Approach 3

1.2 Review of Applications of Mixture Models in Population Pharmacokinetics 4

1.3 Review of Applications of Mixture Models to Problems in Computational Biology 6

1.3.1 Problems with mixture models 11

1.4 Outline of the Book 11

2 Mathematical Description of Nonlinear Mixture Models 15

2.1 Fundamental Notions of Markov chain Monte Carlo 15

2.2 Nonlinear Hierarchical Models 19

2.2.1 One-component model 19

2.2.2 Mixture models 21

2.2.3 Normal mixture models 23

2.2.4 Nonlinear normal mixture models and pharmacokinetic examples 24

2.2.5 Linear normal mixture models and Eyes example 26

2.2.6 Allocation formulation 29

2.3 Gibbs Sampling 30

2.3.1 Theoretical convergence of the Gibbs sampler 32

2.3.2 Irreducibility and aperiodicity of the hybrid Gibbs-Metropolis chain 33

2.3.3 WinBUGS and JAGS 35

2.4 Prior Distributions: Linear and Nonlinear Cases 38

3 Label Switching and Trapping 43

3.1 Label Switching and Permutation Invariance 43

3.1.1 Label switching 43

3.1.2 Permutation invariance 44

3.1.3 Allocation variables {Zi} 40

3.1.4 An example of label switching 47

3.2 Markov Chain Convergence 48

3.2.1 Trapping states and convergence 54

3.3 Random Permutation Sampler 59

3.3.1 RPS post processing 63

3.4 Re-parametrization 65

3.5 Stephens' Approach: Relabeling Strategies 67

3.5.1 Linear normal mixture: Eyes problem 70

4 Treatment of Mixture Models with an Unknown Number of Components 73

4.1 Introduction 73

4.2 Finding the Optimal Number of Components Using Weighted Kullback-Leibler Distance 74

4.2.1 Distance between K-component model and collapsed (K-1)-component model 75

4.2.2 Weighted Kullback-Leibler distance for the mixture of multivariate normals 76

4.2.3 Weighted Kullback-Leibler distance for the multivariate normal mixture model with diagonal covariance matrix 78

4.2.4 Weighted Kullback-Leibler distance for the one-dimensional mixture of normals 78

4.2.5 Comparison of weighted and un-weighted Kullback-Leibler distances for one-dimensional mixture of normals 79

4.2.6 Weighted Kullback-Leibler distance for a mixture of binomial distributions 80

4.2.7 Weighted Kullback-Leibler distance for a mixture of Poisson distributions 81

4.2.8 Metric and semimetric 82

4.2.9 Determination of the number of components in the Bayesian framework 85

4.3 Stephens' Approach: Birth-Death Markov Chain Monte Carlo 86

4.3.1 Treatment of the Eyes model 89

4.3.2 Treatment of nonlinear normal mixture models 91

4.4 Kullback-Leibler Markov Chain Monte Carlo - A New Algorithm for Finite Mixture Analysis 91

4.4.1 Kullback-Leibler Markov chain Monte Carlo with random permutation sampler 98

5 Applications of BDMCMC, KLMCMC, and RPS 101

5.1 Galaxy Data 101

5.2 Simulated Nonlinear Normal Mixture Model 109

5.3 Linear Normal Mixture Model: Boys and Girls 112

5.4 Nonlinear Pharmacokinetics Model and Selection of Prior Distributions 121

5.5 Nonlinear Mixture Models in Gene Expression Studies 135

5.5.1 Simulated dataset 138

6 Nonparametric Methods 143

6.1 Definition of the Basic Model 144

6.2 Nonparametric Maximum Likelihood 145

6.2.1 Benders decomposition 146

6.2.2 Convex optimization over a fixed grid G 147

6.2.3 Interior point methods 148

6.2.4 Nonparametric adaptive grid algorithm 149

6.2.5 Pharmacokinetic population analysis problem 151

6.3 Nonparametric Bayesian Approach 154

6.3.1 Linear model 158

6.3.2 Dirichlet distribution 159

6.3.3 Dirichlet process for discrete distributions 159

6.4 Gibbs Sampler for the Dirichlet Process 161

6.4.1 Implementation of the Gibbs sampler 162

6.5 Nonparametric Bayesian Examples 164

6.5.1 Binomial/beta model 164

6.5.2 Normal prior and linear model 166

6.5.3 One-dimensional linear case 168

6.5.4 Two-dimensional linear case 169

6.5.5 Plotting the posterior using Gibbs sampling 170

6.5.6 Galaxy dataset 170

6.6 Technical Notes 171

6.6.1 Treatment of multiplicity 172

6.7 Stick-Breaking Priors 175

6.7.1 Truncations of the stick-breaking process 176

6.7.2 Blocked Gibbs algorithm 176

6.8 Examples of Stick-Breaking 178

6.8.1 Binomial/beta: Rolling thumbtacks example 179

6.8.2 Determination of the number of mixture components in the binomial/beta model 181

6.8.3 Poisson/gamma Eye-Tracking example 186

6.8.4 Determination of the number of mixture components in Poisson/gamma case 187

6.8.5 Stephens' relabeling for Poisson/gamma case 191

6.8.6 Pharmacokinetics example 192

6.9 Maximum Likelihood and Stick-Breaking (A Connection Between NPML and NPB Approaches) 196

6.9.1 Galaxy dataset 197

7 Bayesian Clustering Methods 199

7.1 Brief Review of Clustering Methods in Microarray Analysis 199

7.2 Application of KLMCMC to Gene Expression Time-Series Analysis 201

7.3 Kullback-Leibler Clustering 205

7.3.1 Pharmacokinetic example 209

7.4 Simulated Time-Series Data with an Unknown Number of Components (Zhou Model) 213

7.4.1 Model description 214

7.4.2 Reductive stepwise method 215

7.4.3 Zhou model using KLC algorithm 217

7.4.4 Zhou model using KLMCMC 218

7.5 Transcription Start Sites Prediction 222

7.6 Conclusions 228

Appendix A Standard Probability Distributions 231

Appendix B Full Conditional Distributions 233

B.1 Binomial/Beta 235

Appendix C Computation of the Weighted Kullback-Leibler Distance 237

C.1 Weighted Kullback-Leibler Distance for a Mixture of Univariate Normals 237

C.2 Weighted Kullback-Leibler Distance for a Mixture of Multivariate Normals 239

C.3 Weighted Kullback-Leibler Distance for a Mixture of Beta Distributions 242

Appendix D BUGS Codes 245

D.1 BUGS Code for the Eyes Model 245

D.2 BUGS Code for the Boys and Girls Example 248

D.3 BUGS Code for Thumbtack Data 249

D.4 BUGS Code for Eye-Tracking Data 251

D.5 BUGS Code for PK Example, Nonparametric Bayesian Approach with Stick-Breaking Priors 253

Bibliography 255

Index 267
