Mathematical Statistics and Stochastic Processes / Edition 1

by Denis Bosq

ISBN-10: 1848213611
ISBN-13: 9781848213616
Publication date: 05/01/2012
Publisher: Wiley
Original price: $139.00

Product Details

Series: ISTE Series, #688
Pages: 304
Product dimensions: 6.30(w) x 9.30(h) x 0.90(d)

About the Author

Denis Bosq is Professor Emeritus at Université Pierre et Marie Curie (Paris 6), France.

Table of Contents

Preface xiii

Part 1 Mathematical Statistics 1

Chapter 1 Introduction to Mathematical Statistics 3

1.1 Generalities 3

1.2 Examples of statistics problems 4

1.2.1 Quality control 4

1.2.2 Measurement errors 5

1.2.3 Filtering 5

1.2.4 Confidence intervals 6

1.2.5 Homogeneity testing 7

Chapter 2 Principles of Decision Theory 9

2.1 Generalities 9

2.2 The problem of choosing a decision function 11

2.3 Principles of Bayesian statistics 13

2.3.1 Generalities 13

2.3.2 Determination of Bayesian decision functions 14

2.3.3 Admissibility of Bayes' rules 16

2.4 Complete classes 17

2.5 Criticism of decision theory - the asymptotic point of view 18

2.6 Exercises 18

Chapter 3 Conditional Expectation 21

3.1 Definition 21

3.2 Properties and extension 22

3.3 Conditional probabilities and conditional distributions 24

3.3.1 Regular version of the conditional probability 24

3.3.2 Conditional distributions 24

3.3.3 Theorem for integration with respect to the conditional distribution 25

3.3.4 Determination of the conditional distributions in the usual cases 25

3.4 Exercises 26

Chapter 4 Statistics and Sufficiency 29

4.1 Samples and empirical distributions 29

4.1.1 Properties of the empirical distribution and the associated statistics 30

4.2 Sufficiency 31

4.2.1 The factorization theorem 31

4.3 Examples of sufficient statistics - an exponential model 33

4.4 Use of a sufficient statistic 35

4.5 Exercises 36

Chapter 5 Point Estimation 39

5.1 Generalities 39

5.1.1 Definition - examples 39

5.1.2 Choice of a preference relation 40

5.2 Sufficiency and completeness 42

5.2.1 Sufficiency 42

5.2.2 Complete statistics 43

5.3 The maximum-likelihood method 45

5.3.1 Definition 45

5.3.2 Maximum likelihood and sufficiency 47

5.3.3 Calculating maximum-likelihood estimators 47

5.3.3.1 The Newton-Raphson method 48

5.4 Optimal unbiased estimators 49

5.4.1 Unbiased estimation 49

5.4.1.1 Existence of an unbiased estimator 50

5.4.2 Unbiased minimum-dispersion estimator 51

5.4.2.1 Application to an exponential model 52

5.4.2.2 Application to the Gaussian model 54

5.4.2.3 Use of the Lehmann-Scheffé theorem 55

5.4.3 Criticism of unbiased estimators 55

5.5 Efficiency of an estimator 56

5.5.1 The Fréchet-Darmois-Cramér-Rao inequality 56

5.5.1.1 Calculating I(θ) 57

5.5.1.2 Properties of the Fisher information 58

5.5.1.3 The case of a biased estimator 59

5.5.2 Efficiency 60

5.5.3 Extension to Rk 60

5.5.3.1 Properties of the information matrix 62

5.5.3.2 Efficiency 62

5.5.4 The non-regular case 64

5.5.4.1 "Superefficient" estimators 64

5.5.4.2 Cramér-Rao-type inequalities 64

5.6 The linear regression model 65

5.6.1 Generalities 65

5.6.2 Estimation of the parameter - the Gauss-Markov theorem 66

5.7 Exercises 68

Chapter 6 Hypothesis Testing and Confidence Regions 73

6.1 Generalities 73

6.1.1 The problem 73

6.1.2 Use of decision theory 73

6.1.2.1 Preference relation 74

6.1.3 Generalization 74

6.1.3.1 Preference relation 75

6.1.4 Sufficiency 75

6.2 The Neyman-Pearson (NP) lemma 75

6.3 Multiple hypothesis tests (general methods) 80

6.3.1 Testing a simple hypothesis against a composite one 80

6.3.1.1 The γ test 81

6.3.1.2 The λ test 81

6.3.2 General case - unbiased tests 82

6.3.2.1 Relation between unbiased tests and unbiased decision functions 83

6.4 Case where the ratio of the likelihoods is monotonic 84

6.4.1 Generalities 84

6.4.2 Unilateral tests 84

6.4.3 Bilateral tests 85

6.5 Tests relating to the normal distribution 86

6.6 Application to estimation: confidence regions 86

6.6.1 First preference relation on confidence regions 88

6.6.2 Relation to tests 88

6.7 Exercises 90

Chapter 7 Asymptotic Statistics 101

7.1 Generalities 101

7.2 Consistency of the maximum likelihood estimator 103

7.3 The limiting distribution of the maximum likelihood estimator 104

7.4 The likelihood ratio test 106

7.5 Exercises 108

Chapter 8 Non-Parametric Methods and Robustness 113

8.1 Generalities 113

8.2 Non-parametric estimation 114

8.2.1 Empirical estimators 114

8.2.2 Distribution and density estimation 114

8.2.2.1 Convergence of the estimator 115

8.2.3 Regression estimation 116

8.3 Non-parametric tests 117

8.3.1 The χ² test 117

8.3.2 The Kolmogorov-Smirnov test 120

8.3.3 The Cramér-von Mises test 120

8.3.4 Rank test 121

8.4 Robustness 121

8.4.1 An example of a robust test 122

8.4.2 An example of a robust estimator 122

8.4.3 A general definition of a robust estimator 123

8.5 Exercises 124

Part 2 Statistics for Stochastic Processes 131

Chapter 9 Introduction to Statistics for Stochastic Processes 133

9.1 Modeling a family of observations 133

9.2 Processes 134

9.2.1 The distribution of a process 134

9.2.2 Gaussian processes 135

9.2.3 Stationary processes 135

9.2.4 Markov processes 136

9.3 Statistics for stochastic processes 137

9.4 Exercises 138

Chapter 10 Weakly Stationary Discrete-Time Processes 141

10.1 Autocovariance and spectral density 141

10.2 Linear prediction and Wold decomposition 144

10.3 Linear processes and the ARMA model 146

10.3.1 Spectral density of a linear process 147

10.4 Estimating the mean of a weakly stationary process 149

10.5 Estimating the autocovariance 151

10.6 Estimating the spectral density 151

10.6.1 The periodogram 151

10.6.2 Convergent estimators of the spectral density 154

10.7 Exercises 155

Chapter 11 Poisson Processes - A Probabilistic and Statistical Study 163

11.1 Introduction 163

11.2 The axioms of Poisson processes 164

11.3 Interarrival time 166

11.4 Properties of the Poisson process 168

11.5 Notions on generalized Poisson processes 170

11.6 Statistics of Poisson processes 172

11.7 Exercises 177

Chapter 12 Square-Integrable Continuous-Time Processes 183

12.1 Definitions 183

12.2 Mean-square continuity 183

12.3 Mean-square integration 184

12.4 Mean-square differentiation 187

12.5 The Karhunen-Loève theorem 188

12.6 Wiener processes 189

12.6.1 Karhunen-Loève decomposition 192

12.6.2 Statistics of Wiener processes 193

12.7 Notions on weakly stationary continuous-time processes 195

12.7.1 Estimating the mean 196

12.7.2 Estimating the autocovariance 196

12.7.3 The case of a process observed at discrete instants 197

12.8 Exercises 197

Chapter 13 Stochastic Integration and Diffusion Processes 203

13.1 Itô integral 203

13.2 Diffusion processes 206

13.3 Processes defined by stochastic differential equations and stochastic integrals 212

13.4 Notions on statistics for diffusion processes 215

13.5 Exercises 216

Chapter 14 ARMA Processes 219

14.1 Autoregressive processes 219

14.2 Moving average processes 223

14.3 General ARMA processes 224

14.4 Non-stationary models 226

14.4.1 The Box-Cox transformation 226

14.4.2 Eliminating the trend by differentiation 227

14.4.3 Eliminating the seasonality 227

14.4.4 Introducing exogenous variables 228

14.5 Statistics of ARMA processes 228

14.5.1 Identification 228

14.5.2 Estimation 230

14.5.3 Verification 232

14.6 Multidimensional processes 232

14.7 Exercises 233

Chapter 15 Prediction 239

15.1 Generalities 239

15.2 Empirical methods of prediction 240

15.2.1 The empirical mean 240

15.2.2 Exponential smoothing 240

15.2.3 Naive predictors 241

15.2.4 Trend adjustment 241

15.3 Prediction in the ARIMA model 242

15.4 Prediction in continuous time 244

15.5 Exercises 245

Part 3 Supplement 249

Chapter 16 Elements of Probability Theory 251

16.1 Measure spaces: probability spaces 251

16.2 Measurable functions: real random variables 253

16.3 Integrating real random variables 255

16.4 Random vectors 259

16.5 Independence 261

16.6 Gaussian vectors 262

16.7 Stochastic convergence 264

16.8 Limit theorems 265

Appendix Statistical Tables 267

A1.1 Random numbers 267

A1.2 Distribution function of the standard normal distribution 268

A1.3 Density of the standard normal distribution 269

A1.4 Percentiles (tp) of Student's distribution 270

A1.5 Ninety-fifth percentiles of Fisher-Snedecor distributions 271

A1.6 Ninety-ninth percentiles of Fisher-Snedecor distributions 272

A1.7 Percentiles (χ²p) of the χ² distribution with n degrees of freedom 273

A1.8 Individual probabilities of the Poisson distribution 274

A1.9 Cumulative probabilities of the Poisson distribution 275

A1.10 Binomial coefficients Ckn for n ≤ 30 and 0 ≤ k ≤ 7 276

A1.11 Binomial coefficients Ckn for n ≤ 30 and 8 ≤ k ≤ 15 277

Bibliography 279

Index 281
