A First Course in Probability / Edition 6

by Sheldon Ross
ISBN-10: 0130338516
ISBN-13: 9780130338518
Pub. Date: 07/01/2001
Publisher: Prentice Hall
$106.67 (new); $100.00 (used)

  • Condition: Good
    Note: Access code and/or supplemental material are not guaranteed to be included with a used textbook.

This item is available online through Marketplace sellers.

Overview

This market-leading introduction to probability features exceptionally clear explanations of the mathematics of probability theory and explores its many diverse applications through numerous interesting and motivational examples. The outstanding problem sets are a hallmark feature of this book. Key features:

  • Clear, complete explanations of mathematical concepts.
  • Subsections on the probabilistic method and the maximum-minimums identity.
  • Many new examples relating to DNA matching, utility, finance, and applications of the probabilistic method.
  • An intuitive treatment of probability, with intuitive explanations following many examples.
  • The Probability Models Disk, included with each copy of the book, contains six probability models that are referenced in the book and let readers quickly and easily perform calculations and simulations.

Product Details

ISBN-13: 9780130338518
Publisher: Prentice Hall
Publication date: 07/01/2001
Edition description: Older Edition
Pages: 528
Product dimensions: 7.24(w) x 9.30(h) x 0.98(d)

Table of Contents

Preface
1 Combinatorial Analysis
1.1 Introduction
1.2 The Basic Principle of Counting
1.3 Permutations
1.4 Combinations
1.5 Multinomial Coefficients
1.6 The Number of Integer Solutions of Equations
Summary
Problems
Theoretical Exercises
Self-Test Problems and Exercises
2 Axioms of Probability
2.1 Introduction
2.2 Sample Space and Events
2.3 Axioms of Probability
2.4 Some Simple Propositions
2.5 Sample Spaces Having Equally Likely Outcomes
2.6 Probability As a Continuous Set Function
2.7 Probability As a Measure of Belief
Summary
Problems
Theoretical Exercises
Self-Test Problems and Exercises
3 Conditional Probability and Independence
3.1 Introduction
3.2 Conditional Probabilities
3.3 Bayes' Formula
3.4 Independent Events
3.5 P(·|F) Is a Probability
Summary
Problems
Theoretical Exercises
Self-Test Problems and Exercises
4 Random Variables
4.1 Random Variables
4.2 Discrete Random Variables
4.3 Expected Value
4.4 Expectation of a Function of a Random Variable
4.5 Variance
4.6 The Bernoulli and Binomial Random Variables
4.6.1 Properties of Binomial Random Variables
4.6.2 Computing the Binomial Distribution Function
4.7 The Poisson Random Variable
4.7.1 Computing the Poisson Distribution Function
4.8 Other Discrete Probability Distributions
4.8.1 The Geometric Random Variable
4.8.2 The Negative Binomial Random Variable
4.8.3 The Hypergeometric Random Variable
4.8.4 The Zeta (or Zipf) Distribution
4.9 Properties of the Cumulative Distribution Function
Summary
Problems
Theoretical Exercises
Self-Test Problems and Exercises
5 Continuous Random Variables
5.1 Introduction
5.2 Expectation and Variance of Continuous Random Variables
5.3 The Uniform Random Variable
5.4 Normal Random Variables
5.4.1 The Normal Approximation to the Binomial Distribution
5.5 Exponential Random Variables
5.5.1 Hazard Rate Functions
5.6 Other Continuous Distributions
5.6.1 The Gamma Distribution
5.6.2 The Weibull Distribution
5.6.3 The Cauchy Distribution
5.6.4 The Beta Distribution
5.7 The Distribution of a Function of a Random Variable
Summary
Problems
Theoretical Exercises
Self-Test Problems and Exercises
6 Jointly Distributed Random Variables
6.1 Joint Distribution Functions
6.2 Independent Random Variables
6.3 Sums of Independent Random Variables
6.4 Conditional Distributions: Discrete Case
6.5 Conditional Distributions: Continuous Case
6.6 Order Statistics
6.7 Joint Probability Distribution of Functions of Random Variables
6.8 Exchangeable Random Variables
Summary
Problems
Theoretical Exercises
Self-Test Problems and Exercises
7 Properties of Expectation
7.1 Introduction
7.2 Expectation of Sums of Random Variables
7.2.1 Obtaining Bounds from Expectations via the Probabilistic Method
7.2.2 The Maximum-Minimums Identity
7.3 Covariance, Variance of Sums, and Correlations
7.4 Conditional Expectation
7.4.1 Definitions
7.4.2 Computing Expectations by Conditioning
7.4.3 Computing Probabilities by Conditioning
7.4.4 Conditional Variance
7.5 Conditional Expectation and Prediction
7.6 Moment Generating Functions
7.6.1 Joint Moment Generating Functions
7.7 Additional Properties of Normal Random Variables
7.7.1 The Multivariate Normal Distribution
7.7.2 The Joint Distribution of the Sample Mean and Sample Variance
7.8 General Definition of Expectation
Summary
Problems
Theoretical Exercises
Self-Test Problems and Exercises
8 Limit Theorems
8.1 Introduction
8.2 Chebyshev's Inequality and the Weak Law of Large Numbers
8.3 The Central Limit Theorem
8.4 The Strong Law of Large Numbers
8.5 Other Inequalities
8.6 Bounding the Error Probability When Approximating a Sum of Independent Bernoulli Random Variables by a Poisson
Summary
Problems
Theoretical Exercises
Self-Test Problems and Exercises
9 Additional Topics in Probability
9.1 The Poisson Process
9.2 Markov Chains
9.3 Surprise, Uncertainty, and Entropy
9.4 Coding Theory and Entropy
Summary
Theoretical Exercises and Problems
Self-Test Problems and Exercises
References
10 Simulation
10.1 Introduction
10.2 General Techniques for Simulating Continuous Random Variables
10.2.1 The Inverse Transformation Method
10.2.2 The Rejection Method
10.3 Simulating from Discrete Distributions
10.4 Variance Reduction Techniques
10.4.1 Use of Antithetic Variables
10.4.2 Variance Reduction by Conditioning
10.4.3 Control Variates
Summary
Problems
Self-Test Problems and Exercises
References
Appendix A: Answers to Selected Problems
Appendix B: Solutions to Self-Test Problems and Exercises
Index

Preface

"We see that the theory of probability is at bottom only common sense reduced to calculation; it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct, often without being able to account for it .... It is remarkable that this science, which originated in the consideration of games of chance, should have become the most important object of human knowledge .... The most important questions of life are, for the most part, really only problems of probability." So said the famous French mathematician and astronomer (the "Newton of France") Pierre Simon, Marquis de Laplace. Although many people might feel that the famous marquis, who was also one of the great contributors to the development of probability, might have exaggerated somewhat, it is nevertheless true that probability theory has become a tool of fundamental importance to nearly all scientists, engineers, medical practitioners, jurists, and industrialists. In fact, the enlightened individual had learned to ask not "Is it so?" but rather "What is the probability that it is so?"

This book is intended as an elementary introduction to the theory of probability for students in mathematics, statistics, engineering, and the sciences (including computer science, the social sciences, and management science) who possess the prerequisite knowledge of elementary calculus. It attempts to present not only the mathematics of probability theory, but also, through numerous examples, the many diverse possible applications of this subject.

In Chapter 1 we present the basic principles of combinatorial analysis, which are most useful in computing probabilities.
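
To give a taste of that counting approach, here is a minimal Python sketch (ours, not an excerpt from the book) that computes the probability of a full house in poker by dividing favorable hands by the number of equally likely hands:

```python
from math import comb

# Counting argument: choose the rank of the triple and 3 of its 4 suits,
# then the rank of the pair and 2 of its 4 suits; divide by all 5-card hands.
favorable = 13 * comb(4, 3) * 12 * comb(4, 2)
total = comb(52, 5)
print(favorable / total)  # ≈ 0.00144
```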

In Chapter 2 we consider the axioms of probability theory and show how they can be applied to compute various probabilities of interest.
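
For reference (a standard statement, not quoted from the book), the three axioms, for events E of a sample space S, are:

```latex
\text{Axiom 1: } 0 \le P(E) \le 1, \qquad
\text{Axiom 2: } P(S) = 1, \qquad
\text{Axiom 3: } P\Bigl(\bigcup_{i=1}^{\infty} E_i\Bigr) = \sum_{i=1}^{\infty} P(E_i)
\ \text{ for mutually exclusive } E_1, E_2, \ldots
```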

Chapter 3 deals with the extremely important subjects of conditional probability and independence of events. By a series of examples we illustrate how conditional probabilities come into play not only when some partial information is available, but also as a tool to enable us to compute probabilities more easily, even when no partial information is present. This extremely important technique of obtaining probabilities by "conditioning" reappears in Chapter 7, where we use it to obtain expectations.
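
In their simplest two-event forms (standard statements, with F^c denoting the complement of F), computing a probability by conditioning and Bayes' formula read:

```latex
P(E) = P(E \mid F)\,P(F) + P(E \mid F^{c})\,P(F^{c}),
\qquad
P(F \mid E) = \frac{P(E \mid F)\,P(F)}{P(E \mid F)\,P(F) + P(E \mid F^{c})\,P(F^{c})}
```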

In Chapters 4, 5, and 6 we introduce the concept of random variables. Discrete random variables are dealt with in Chapter 4, continuous random variables in Chapter 5, and jointly distributed random variables in Chapter 6. The important concepts of the expected value and the variance of a random variable are introduced in Chapters 4 and 5: These quantities are then determined for many of the common types of random variables.
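
In the discrete and continuous cases, respectively, these quantities take the standard forms:

```latex
E[X] = \sum_{x} x\,p(x) \quad\text{or}\quad E[X] = \int_{-\infty}^{\infty} x\,f(x)\,dx,
\qquad
\operatorname{Var}(X) = E\bigl[(X - E[X])^{2}\bigr] = E[X^{2}] - (E[X])^{2}
```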

Additional properties of the expected value are considered in Chapter 7. Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented. Sections on conditional expectation, including its use in prediction, and moment generating functions are contained in this chapter. In addition, the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution.
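
As an illustration of that linearity result, here is a minimal simulation (ours, not the book's) of the classical matching problem; linearity of expectation gives exactly 1 expected match for every n, which the simulation confirms:

```python
import random

# Expected number of fixed points of a random permutation: position i matches
# with probability 1/n, so by linearity E[matches] = n * (1/n) = 1 for every n.
def average_matches(n: int, trials: int = 100_000) -> float:
    total = 0
    for _ in range(trials):
        perm = random.sample(range(n), n)  # a uniformly random permutation
        total += sum(1 for i, p in enumerate(perm) if i == p)
    return total / trials

print(average_matches(10))  # ≈ 1.0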

In Chapter 8 we present the major theoretical results of probability theory. In particular, we prove the strong law of large numbers and the central limit theorem. Our proof of the strong law is a relatively simple one which assumes that the random variables have a finite fourth moment, and our proof of the central limit theorem assumes Lévy's continuity theorem. Also in this chapter we present such probability inequalities as Markov's inequality, Chebyshev's inequality, and Chernoff bounds. The final section of Chapter 8 gives a bound on the error involved when a probability concerning a sum of independent Bernoulli random variables is approximated by the corresponding probability for a Poisson random variable having the same expected value.
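
The first two of those inequalities, in their standard forms (X nonnegative in the first; μ and σ² the mean and variance of X in the second), are:

```latex
\text{Markov: } P(X \ge a) \le \frac{E[X]}{a} \quad (a > 0),
\qquad
\text{Chebyshev: } P\bigl(|X - \mu| \ge k\bigr) \le \frac{\sigma^{2}}{k^{2}} \quad (k > 0)
```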

Chapter 9 presents some additional topics, such as Markov chains, the Poisson process, and an introduction to information and coding theory, and Chapter 10 considers simulation.
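
To give a flavor of the simulation chapter, here is a minimal sketch (ours, not the book's code; the function name is illustrative) of the inverse transformation method of Section 10.2.1, applied to the exponential distribution, whose CDF inverts in closed form:

```python
import math
import random

# Inverse transformation method: if U ~ Uniform(0,1), then F^{-1}(U) has CDF F.
# For Exponential(lam), F(x) = 1 - exp(-lam*x), so F^{-1}(u) = -ln(1 - u)/lam.
def exponential_sample(lam: float) -> float:
    u = random.random()
    return -math.log(1.0 - u) / lam

samples = [exponential_sample(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))  # sample mean ≈ 1/lam = 0.5
```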

The sixth edition continues the evolution and fine-tuning of the text. There are many new exercises and examples. Among the latter are examples on utility (Example 4c of Chapter 4), on normal approximations (Example 4i of Chapter 5), on applying the lognormal distribution to finance (Example 3d of Chapter 6), and on coupon collecting with general collection probabilities (Example 2v of Chapter 7). There are also new optional subsections in Chapter 7 dealing with the probabilistic method (Subsection 7.2.1), and with the maximum-minimums identity (Subsection 7.2.2).
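
For readers meeting it here first, the maximum-minimums identity, in its standard form for numbers (or random variables) X_1, ..., X_n, expresses a maximum through minima of subsets:

```latex
\max_{1 \le i \le n} X_i
= \sum_{i} X_i
- \sum_{i<j} \min(X_i, X_j)
+ \sum_{i<j<k} \min(X_i, X_j, X_k)
- \cdots + (-1)^{n+1} \min(X_1, \ldots, X_n)
```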

As in the previous edition, three sets of exercises are given at the end of each chapter. They are designated as Problems, Theoretical Exercises, and Self-Test Problems and Exercises. This last set of exercises, for which complete solutions appear in Appendix B, is designed to help students test their comprehension and study for exams.

Using the website, students will be able to perform calculations and simulations quickly and easily in six key areas:

  • Three of the modules derive probabilities for, respectively, binomial, Poisson, and normal random variables.
  • Another module illustrates the central limit theorem. It considers random variables that take on one of the values 0, 1, 2, 3, 4 and allows the user to enter the probabilities for these values along with a number n. The module then plots the probability mass function of the sum of n independent random variables of this type. By increasing n one can "see" the mass function converge to the shape of a normal density function (a sketch of the underlying computation appears after this list).
  • The other two modules illustrate the strong law of large numbers. Again the user enters probabilities for the five possible values of the random variable along with an integer n. The program then uses random numbers to simulate n random variables having the prescribed distribution. The modules graph the number of times each outcome occurs along with the average of all outcomes. The modules differ in how they graph the results of the trials.
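
A minimal sketch of the computation behind the central limit theorem module (a hypothetical re-implementation, not the code actually shipped with the book): the pmf of a sum of independent random variables is the convolution of their pmfs, so repeated convolution yields the plotted distribution.

```python
import numpy as np

# pmf of the sum of n i.i.d. copies of a random variable on {0, 1, 2, 3, 4}:
# repeated convolution of the single-variable pmf with itself.
def pmf_of_sum(pmf: np.ndarray, n: int) -> np.ndarray:
    result = np.array([1.0])  # pmf of the empty sum (point mass at 0)
    for _ in range(n):
        result = np.convolve(result, pmf)
    return result

pmf = np.array([0.1, 0.2, 0.3, 0.25, 0.15])  # user-entered probabilities (sum to 1)
dist = pmf_of_sum(pmf, 30)
print(len(dist), dist.sum())  # support size 121, total probability ≈ 1.0
# Plotting `dist` for increasing n shows the bell shape the module demonstrates.
```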
From the B&N Reads Blog

Customer Reviews