
ISBN-10: 013603313X

Overview

A First Course in Probability, Eighth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

A book/disk introduction to probability for students in mathematics, engineering, and the sciences (including the social sciences and management science) who understand elementary calculus. Presents the mathematics of probability theory as well as many examples of applications, covering combinatorial analysis, axioms of probability theory, conditional probability, random variables, expected value, and major theoretical results of probability. Other subjects include Markov chains, information and coding theory, and simulation. Includes chapter summaries, exercises, and answers. This fifth edition notes optional material, and updates examples to be more accessible to students. Chapter exercises are reorganized to present mechanical problems before theoretical exercises. The disk, new to this edition, allows students to perform calculations and simulations. Annotation c. by Book News, Inc., Portland, Or.

Product dimensions: 7.90 (w) x 10.00 (h) x 1.00 (d)

Meet the Author

Sheldon M. Ross is a professor in the Department of Industrial Engineering and Operations Research at the University of Southern California. He received his Ph.D. in statistics at Stanford University in 1968. He has published many technical articles and textbooks in the areas of statistics and applied probability. Among his texts are A First Course in Probability, Introduction to Probability Models, Stochastic Processes, and Introductory Statistics. Professor Ross is the founding and continuing editor of the journal Probability in the Engineering and Informational Sciences, the Advisory Editor for International Journal of Quality Technology and Quantitative Management, and an Editorial Board Member of the Journal of Bond Trading and Management. He is a Fellow of the Institute of Mathematical Statistics and a recipient of the Humboldt US Senior Scientist Award.

Preface

"We see that the theory of probability is at bottom only common sense reduced to calculation; it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct, often without being able to account for it .... It is remarkable that this science, which originated in the consideration of games of chance, should have become the most important object of human knowledge .... The most important questions of life are, for the most part, really only problems of probability." So said the famous French mathematician and astronomer (the "Newton of France") Pierre Simon, Marquis de Laplace. Although many people might feel that the famous marquis, who was also one of the great contributors to the development of probability, might have exaggerated somewhat, it is nevertheless true that probability theory has become a tool of fundamental importance to nearly all scientists, engineers, medical practitioners, jurists, and industrialists. In fact, the enlightened individual has learned to ask not "Is it so?" but rather "What is the probability that it is so?"

This book is intended as an elementary introduction to the theory of probability for students in mathematics, statistics, engineering, and the sciences (including computer science, the social sciences and management science) who possess the prerequisite knowledge of elementary calculus. It attempts to present not only the mathematics of probability theory, but also, through numerous examples, the many diverse possible applications of this subject.

In Chapter 1 we present the basic principles of combinatorial analysis, which are most useful in computing probabilities.
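As a small illustration of this kind of counting argument (a sketch, not code from the book), the classic birthday problem can be solved in a few lines of Python:

```python
import math

def birthday_match_probability(n: int, days: int = 365) -> float:
    """P(at least two of n people share a birthday), by counting.

    There are days * (days - 1) * ... * (days - n + 1) ways to assign
    n distinct birthdays; dividing by days**n gives P(no match), and
    the complement is the probability of at least one shared birthday.
    """
    if n > days:
        return 1.0
    no_match = math.perm(days, n) / days**n
    return 1.0 - no_match

# With only 23 people, the probability of a shared birthday already exceeds 1/2.
print(birthday_match_probability(23))
```
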

In Chapter 2 we consider the axioms of probability theory and show how they can be applied to compute various probabilities of interest.

Chapter 3 deals with the extremely important subjects of conditional probability and independence of events. By a series of examples we illustrate how conditional probabilities come into play not only when some partial information is available, but also as a tool to enable us to compute probabilities more easily, even when no partial information is present. This extremely important technique of obtaining probabilities by "conditioning" reappears in Chapter 7, where we use it to obtain expectations.
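Conditioning can be illustrated numerically with Bayes' rule; the following short Python sketch (an illustration with hypothetical test parameters, not an example from the book) computes a posterior probability via the law of total probability:

```python
def posterior_positive(prior: float, sensitivity: float, specificity: float) -> float:
    """P(disease | positive test) via Bayes' rule.

    Conditioning on disease status gives the total probability of a
    positive result: P(+) = P(+|D)P(D) + P(+|not D)P(not D).
    """
    p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_pos

# For a rare condition (1% prevalence) and a fairly accurate test
# (95% sensitivity and specificity), the posterior is far below 95%.
print(posterior_positive(0.01, 0.95, 0.95))
```
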

In Chapters 4, 5, and 6 we introduce the concept of random variables. Discrete random variables are dealt with in Chapter 4, continuous random variables in Chapter 5, and jointly distributed random variables in Chapter 6. The important concepts of the expected value and the variance of a random variable are introduced in Chapters 4 and 5: These quantities are then determined for many of the common types of random variables.

Additional properties of the expected value are considered in Chapter 7. Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented. Sections on conditional expectation, including its use in prediction, and moment generating functions are contained in this chapter. In addition, the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution.
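One classic consequence of this linearity (sketched below in Python; the function is illustrative, not taken from the text) is the coupon collector's expected waiting time:

```python
from fractions import Fraction

def expected_coupons(n: int) -> Fraction:
    """Expected number of draws needed to collect all n coupon types.

    The total time is a sum of n geometric waiting times with success
    probabilities n/n, (n-1)/n, ..., 1/n; since the expected value of a
    sum is the sum of the expected values,
    E[T] = n * (1 + 1/2 + ... + 1/n).
    """
    return n * sum(Fraction(1, k) for k in range(1, n + 1))
```

For four coupon types, `expected_coupons(4)` gives exactly 25/3, about 8.33 draws.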

In Chapter 8 we present the major theoretical results of probability theory. In particular, we prove the strong law of large numbers and the central limit theorem. Our proof of the strong law is a relatively simple one which assumes that the random variables have a finite fourth moment, and our proof of the central limit theorem assumes Lévy's continuity theorem. Also in this chapter we present such probability inequalities as Markov's inequality, Chebyshev's inequality, and Chernoff bounds. The final section of Chapter 8 gives a bound on the error involved when a probability concerning a sum of independent Bernoulli random variables is approximated by the corresponding probability for a Poisson random variable having the same expected value.
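Chebyshev's inequality, for instance, can be checked exactly for a single roll of a fair die (an illustrative sketch, not from the text):

```python
from fractions import Fraction

# Chebyshev: P(|X - mu| >= k) <= var / k^2, checked for one fair die roll.
values = range(1, 7)
mu = Fraction(sum(values), 6)                            # 7/2
var = sum((Fraction(v) - mu) ** 2 for v in values) / 6   # 35/12

k = Fraction(5, 2)
# Exact probability that the roll deviates from 7/2 by at least 5/2,
# i.e. that it lands on 1 or 6.
exact = Fraction(sum(1 for v in values if abs(Fraction(v) - mu) >= k), 6)
bound = var / k**2

print(exact, "<=", bound)   # 1/3 <= 7/15
```
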

Chapter 9 presents some additional topics, such as Markov chains, the Poisson process, and an introduction to information and coding theory, and Chapter 10 considers simulation.

The sixth edition continues the evolution and fine-tuning of the text. There are many new exercises and examples. Among the latter are examples on utility (Example 4c of Chapter 4), on normal approximations (Example 4i of Chapter 5), on applying the lognormal distribution to finance (Example 3d of Chapter 6), and on coupon collecting with general collection probabilities (Example 2v of Chapter 7). There are also new optional subsections in Chapter 7 dealing with the probabilistic method (Subsection 7.2.1), and with the maximum-minimums identity (Subsection 7.2.2).

As in the previous edition, three sets of exercises are given at the end of each chapter. They are designated as Problems, Theoretical Exercises, and Self-Test Problems and Exercises. This last set of exercises, for which complete solutions appear in Appendix B, is designed to help students test their comprehension and study for exams.

Using the website, students will be able to perform calculations and simulations quickly and easily in six key areas:

Three of the modules derive probabilities for, respectively, binomial, Poisson, and normal random variables.
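These three calculations can be sketched with the Python standard library alone (an illustration, not the site's actual code):

```python
import math

def binomial_pmf(n: int, p: float, k: int) -> float:
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(lam: float, k: int) -> float:
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def normal_cdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """P(X <= x) for X ~ Normal(mu, sigma^2), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))
```

For example, `binomial_pmf(10, 0.5, 5)` gives the chance of exactly five heads in ten fair coin flips.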

Another module illustrates the central limit theorem. It considers random variables that take on one of the values 0, 1, 2, 3, and 4 and allows the user to enter the probabilities for these values along with a number n. The module then plots the probability mass function of the sum of n independent random variables of this type. By increasing n, one can "see" the mass function converge to the shape of a normal density function.
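The computation behind such a module can be sketched by repeated convolution (illustrative Python with made-up probabilities, not the actual module):

```python
def convolve(p, q):
    """Pmf of the sum of two independent nonnegative-integer random variables."""
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

# Hypothetical probabilities for the values 0, 1, 2, 3, 4.
base = [0.1, 0.2, 0.4, 0.2, 0.1]

pmf = base
for _ in range(29):   # pmf of the sum of n = 30 independent copies
    pmf = convolve(pmf, base)

# The mass function is now bell-shaped, concentrated around n * E[X] = 60.
peak = max(range(len(pmf)), key=lambda k: pmf[k])
```

Plotting `pmf` for increasing n shows the convergence toward a normal density that the module demonstrates.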

The other two modules illustrate the strong law of large numbers. Again the user enters probabilities for the five possible values of the random variable along with an integer n. The program then uses random numbers to simulate n random variables having the prescribed distribution. The modules graph the number of times each outcome occurs along with the average of all outcomes. The modules differ in how they graph the results of the trials.
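A minimal version of such a simulation (a sketch with assumed probabilities, not the modules themselves) looks like this:

```python
import random

def running_averages(probs, n, seed=0):
    """Simulate n draws from {0, 1, 2, 3, 4} with the given probabilities
    and return the running sample averages, which the strong law of large
    numbers says converge to the true mean."""
    rng = random.Random(seed)
    total, averages = 0, []
    for i in range(1, n + 1):
        total += rng.choices(range(5), weights=probs)[0]
        averages.append(total / i)
    return averages

avgs = running_averages([0.1, 0.2, 0.4, 0.2, 0.1], 10_000)
print(avgs[-1])   # close to the true mean, 2.0
```

Graphing `avgs` against the trial index shows the sample average settling down, exactly as the modules display.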


Table of Contents

1. Combinatorial Analysis

2. Axioms of Probability

3. Conditional Probability and Independence

4. Random Variables

5. Continuous Random Variables

6. Jointly Distributed Random Variables

7. Properties of Expectation

8. Limit Theorems

9. Additional Topics in Probability

10. Simulation

Appendix A. Answers to Selected Problems

Appendix B. Solutions to Self-Test Problems and Exercises

Index
