Fundamentals of Probability / Edition 3 available in Hardcover, Paperback
Product Details
ISBN-13:  2900131453400 

Publisher:  Pearson 
Publication date:  07/22/2004 
Edition description:  3RD 
Pages:  608 
Product dimensions:  6.00(w) x 1.25(h) x 9.00(d) 
Read an Excerpt
This one- or two-term basic probability text is written for majors in mathematics, physical sciences, engineering, statistics, actuarial science, business and finance, operations research, and computer science. It can also be used by students who have completed a basic calculus course. Our aim is to present probability in a natural way: through interesting and instructive examples and exercises that motivate the theory, definitions, theorems, and methodology. Examples and exercises have been carefully designed to arouse curiosity and hence encourage students to delve into the theory with enthusiasm.
Authors are usually faced with two opposing impulses. One is a tendency to put too much into the book, because everything is important and everything has to be said the author's way! On the other hand, authors must also keep in mind a clear definition of the focus, the level, and the audience for the book, thereby choosing carefully what should be "in" and what "out." Hopefully, this book is an acceptable resolution of the tension generated by these opposing forces.
Instructors should enjoy the versatility of this text. They can choose their favorite problems and exercises from a collection of 1558 and, if necessary, omit some sections and/or theorems to teach at an appropriate level.
Exercises for most sections are divided into two categories: A and B. Those in category A are routine, and those in category B are challenging. However, not all exercises in category B are uniformly challenging. Some of those exercises are included because students find them somewhat difficult.
I have tried to maintain an approach that is mathematically rigorous and, at the same time, closely matches the historical development of probability. Whenever appropriate, I include historical remarks, and also include discussions of a number of probability problems published in recent years in journals such as Mathematics Magazine and American Mathematical Monthly. These are interesting and instructive problems that deserve discussion in classrooms.
Chapter 13 concerns computer simulation. That chapter is divided into several sections, presenting algorithms that are used to find approximate solutions to complicated probabilistic problems. These sections can be discussed independently when relevant materials from earlier chapters are being taught, or they can be discussed concurrently, toward the end of the semester. Although I believe that the emphasis should remain on concepts, methodology, and the mathematics of the subject, I also think that students should be asked to read the material on simulation and perhaps do some projects. Computer simulation is an excellent means to acquire insight into the nature of a problem, its functions, its magnitude, and the characteristics of the solution.
Other Continuing Features
 The historical roots and applications of many of the theorems and definitions are presented in detail, accompanied by suitable examples or counterexamples.
 As much as possible, examples and exercises for each section do not refer to exercises in other chapters or sections—a style that often frustrates students and instructors.
 Whenever a new concept is introduced, its relationship to preceding concepts and theorems is explained.
 Although the usual analytic proofs are given, simple probabilistic arguments are presented to promote deeper understanding of the subject.
 The book begins with discussions on probability and its definition, rather than with combinatorics. I believe that combinatorics should be taught after students have learned the preliminary concepts of probability. The advantage of this approach is that the need for methods of counting will occur naturally to students, and the connection between the two areas becomes clear from the beginning. Moreover, combinatorics becomes more interesting and enjoyable.
 Students beginning their study of probability have a tendency to think that sample spaces always have a finite number of sample points. To minimize this proclivity, the concept of random selection of a point from an interval is introduced in Chapter 1 and applied where appropriate throughout the book. Moreover, since the basis of simulating indeterministic problems is selection of random points from (0, 1), students need to be thoroughly familiar with that concept in order to understand simulations.
 Often, when we think of a collection of events, we have a tendency to think of them in either temporal or logical sequence. If, for example, a sequence of events A_{1}, A_{2}, . . . , A_{n} occurs in time or in some logical order, we can usually write down the probabilities P(A_{1}), P(A_{2} | A_{1}), ..., P(A_{n} | A_{1}A_{2} . . . A_{n-1}) immediately, without much computation. However, we may be interested in the probability of the intersection of the events, in probabilities of events unconditional on the rest, or in probabilities of earlier events given later ones. These three questions motivate the law of multiplication, the law of total probability, and Bayes' theorem, respectively. I have given the law of multiplication a section of its own so that each of these fundamental uses of conditional probability receives its full share of attention and coverage.
 The concepts of expectation and variance are introduced early, because important concepts should be defined and used as soon as possible. One benefit of this practice is that, when random variables such as Poisson and normal are studied, the associated parameters will be understood immediately rather than remaining ambiguous until expectation and variance are introduced. Therefore, from the beginning, students will develop a natural feeling about such parameters.
 Special attention is paid to the Poisson distribution; it is made clear that this distribution is frequently applicable for two reasons: first, it approximates the binomial distribution and, second, it is the mathematical model for an enormous class of phenomena. The comprehensive presentation of the Poisson process and its applications can be understood by junior- and senior-level students.
 Students often have difficulties understanding functions or quantities such as the density function of a continuous random variable and the formula for mathematical expectation. For example, they may wonder why ∫ x f(x) dx is the appropriate definition for E(X) and why correction for continuity is necessary. I have explained the reasons behind such definitions, theorems, and concepts, and have demonstrated why they are the natural extensions of the discrete cases.
 The first six chapters include many examples and exercises concerning selection of random points from intervals. Consequently, in Chapter 7, when discussing uniform random variables, I have been able to calculate the distribution and (by differentiation) the density function of X, a random point from an interval (a, b). In this way the concept of a uniform random variable and the definition of its density function are readily motivated.
 In Chapters 7 and 8 the usefulness of uniform densities is shown by using many examples. In particular, applications of uniform density in geometric probability theory are emphasized.
 Normal density, arguably the most important density function, is readily motivated by De Moivre's theorem. In Section 7.2, I introduce the standard normal density, the elementary version of the central limit theorem, and the normal density just as they were developed historically. Experience shows this to be a good pedagogical approach. Taught this way, the normal density becomes natural and does not look like a strange function appearing out of the blue.
 Exponential random variables naturally occur as times between consecutive events of Poisson processes. The time of occurrence of the nth event of a Poisson process has a gamma distribution. For these reasons I have motivated exponential and gamma distributions by Poisson processes. In this way we can obtain many examples of exponential and gamma random variables from the abundant examples of Poisson processes already known. Another advantage is that it helps us visualize memoryless random variables by looking at the interevent times of Poisson processes.
 Joint distributions and conditioning are often trouble areas for students. A detailed explanation and many applications concerning these concepts and techniques make these materials somewhat easier for students to understand.
 The concepts of covariance and correlation are motivated thoroughly.
 A subsection on pattern appearance is presented in Section 10.1. Even though the method discussed in this subsection is intuitive and probabilistic, it should help students understand such paradoxical-looking results as the following: on average, it takes almost twice as many flips of a fair coin to obtain a sequence of five successive heads as it does to obtain a tail followed by four heads.
 The answers to the odd-numbered exercises are included at the end of the book.
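The uniform-density motivation described above amounts to a single differentiation. Assuming, as in Chapter 1, that a point selected at random from (a, b) lands in a subinterval with probability proportional to its length:

```latex
% X = a point selected at random from (a, b); for a <= x <= b the event
% {X <= x} is the subinterval (a, x], whose probability is proportional
% to its length:
F(x) = P(X \le x) = \frac{x - a}{b - a}, \qquad a \le x \le b,
\qquad\text{hence}\qquad
f(x) = F'(x) = \frac{1}{b - a} \quad \text{on } (a, b).
```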
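The "random points from (0, 1)" idea in the bullets above is the engine of simulation. As a sketch of my own (not the book's code), inverse-transform sampling converts a random point U from (0, 1) into a sample from another distribution by applying the inverse of its CDF:

```python
import math
import random

def exponential_via_uniform(rate, rng):
    """Map a random point U from (0, 1) to an Exponential(rate) sample
    by inverting the CDF F(x) = 1 - exp(-rate * x)."""
    u = rng.random()                      # random point from (0, 1)
    return -math.log(1.0 - u) / rate

rng = random.Random(0)
samples = [exponential_via_uniform(2.0, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)        # should be near 1/rate = 0.5
```

The sample mean settles near 1/rate, as the exponential's expectation requires.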
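The three conditional-probability questions raised above can be made concrete with a standard diagnostic-test calculation (the numbers here are illustrative, not taken from the book):

```python
# Illustrative diagnostic-test numbers (my choice, not the book's).
p_disease = 0.01                  # P(D)
p_pos_given_disease = 0.95        # P(+ | D)
p_pos_given_healthy = 0.05        # P(+ | D^c)

# Law of multiplication: P(D and +) = P(D) * P(+ | D)
p_disease_and_pos = p_disease * p_pos_given_disease

# Law of total probability: P(+) = P(D)P(+ | D) + P(D^c)P(+ | D^c)
p_pos = p_disease_and_pos + (1 - p_disease) * p_pos_given_healthy

# Bayes' theorem: P(D | +) = P(D and +) / P(+)
p_disease_given_pos = p_disease_and_pos / p_pos   # about 0.161
```

Each law answers one of the three questions: the probability of an intersection, an unconditional probability, and the probability of an earlier event given a later one.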
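The claim that ∫ x f(x) dx is the natural extension of the discrete sum Σ x p(x) can be checked numerically. This sketch (mine, not the book's) approximates the integral by a discrete-style sum for the density f(x) = 2x on (0, 1):

```python
# Approximate E(X) = ∫ x f(x) dx by the discrete-style sum Σ x_i f(x_i) Δx
# for the density f(x) = 2x on (0, 1); the exact value is 2/3.
n = 100_000
dx = 1.0 / n

def f(x):
    return 2.0 * x        # a valid density on (0, 1)

expectation = sum((i + 0.5) * dx * f((i + 0.5) * dx) * dx for i in range(n))
```

Each term x_i f(x_i) Δx is exactly the discrete recipe "value times probability", with f(x_i) Δx playing the role of p(x_i).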
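De Moivre's theorem, cited above as the motivation for the normal density, can be checked directly: the standardized normal curve tracks the binomial probabilities. A small sketch (my code, not the book's):

```python
import math

def binomial_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def local_normal_approx(n, p, k):
    """De Moivre-Laplace: P(X = k) is approximately (1/sigma) * phi((k - mu)/sigma)."""
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    z = (k - mu) / sigma
    return math.exp(-z * z / 2) / (sigma * math.sqrt(2 * math.pi))

n, p = 100, 0.5
worst = max(abs(binomial_pmf(n, p, k) - local_normal_approx(n, p, k))
            for k in range(40, 61))       # near the center of the distribution
```

Already at n = 100 the worst discrepancy near the center is below one part in a thousand.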
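The Poisson-process motivation of the exponential and gamma distributions sketched above is easy to see in simulation: summing n exponential interevent times gives the nth arrival time, a gamma random variable. A seeded sketch of mine (not the book's code):

```python
import random

rng = random.Random(1)
rate = 1.0          # Poisson process rate (events per unit time)

def nth_arrival_time(n):
    """Sum of n independent Exponential(rate) interevent times: the time
    of the nth event of a Poisson process, i.e., a Gamma(n, rate) sample."""
    return sum(rng.expovariate(rate) for _ in range(n))

trials = [nth_arrival_time(5) for _ in range(20_000)]
mean_t5 = sum(trials) / len(trials)   # Gamma(5, 1) has mean n / rate = 5
```

The empirical mean of the fifth arrival time sits near n/rate, the gamma mean.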
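The pattern-appearance result quoted above (five successive heads versus a tail followed by four heads) is easy to check by seeded simulation; this is my sketch, not the book's Section 10.1 method:

```python
import random

def flips_until(pattern, rng):
    """Flip a fair coin until the last len(pattern) flips equal `pattern`;
    return the number of flips used."""
    window, count = "", 0
    while window != pattern:
        window = (window + rng.choice("HT"))[-len(pattern):]
        count += 1
    return count

rng = random.Random(2)
trials = 20_000
avg_hhhhh = sum(flips_until("HHHHH", rng) for _ in range(trials)) / trials
avg_thhhh = sum(flips_until("THHHH", rng) for _ in range(trials)) / trials
# Theory: 62 flips on average for HHHHH, but only 32 for THHHH.
```

The gap comes from self-overlap: a miss while waiting for HHHHH destroys all progress, while a miss while waiting for THHHH is itself the needed leading tail.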
New To This Edition
Since 2000, when the second edition of this book was published, I have received much additional correspondence and feedback from faculty and students in this country and abroad. The comments, discussions, recommendations, and reviews helped me improve the book in many ways. All detected errors were corrected, and the text has been fine-tuned for accuracy. More explanations and clarifying comments have been added to almost every section. In this edition, 278 new exercises and examples, mostly of an applied nature, have been added. More insightful and better solutions are given for a number of problems and exercises. For example, I have discussed Borel's normal number theorem, and I have presented a version of a famous set that is not an event. If a fair coin is tossed a very large number of times, the general perception is that heads occurs as often as tails. In a new subsection in Section 11.4, I have explained what is meant by "heads occurs as often as tails."
Some of the other features of the present revision are the following:
 An introductory chapter on stochastic processes has been added. That chapter covers more in-depth material on Poisson processes. It also presents the basics of Markov chains, continuous-time Markov chains, and Brownian motion, each covered in some depth; the current edition therefore has enough material for a second course in probability as well. The level of difficulty of the chapter on stochastic processes is consistent with the rest of the book, and I believe its explanations make some challenging material more easily accessible to undergraduate and beginning graduate students. Only calculus is assumed as a prerequisite. Throughout the chapter, certain important results from areas such as queuing theory, random walks, branching processes, superposition of Poisson processes, and compound Poisson processes are discussed as examples. I have also explained what the famous PASTA theorem, Poisson Arrivals See Time Averages, states. In short, the chapter on stochastic processes lays a foundation on which students' further pure and applied probability studies and work can build.
 Some practical, meaningful, nontrivial, and relevant applications of probability and stochastic processes in finance, economics, and actuarial sciences are presented.
 Ever since 1853, when Gregor Johann Mendel (1822-1884) began his breeding experiments with the garden pea Pisum sativum, probability has played an important role in the understanding of the principles of heredity. In this edition, I have included more genetics examples to demonstrate the extent of that role.
 To study the risk, or rate, of "failure" per unit of time for "lifetimes" that have already survived a certain length of time, I have added a new section, Survival Analysis and Hazard Functions, to Chapter 7.
 For random sums of random variables, I have discussed Wald's equation and its analogue for variance. Certain applications of Wald's equation are discussed in the exercises, as well as in Chapter 12, Stochastic Processes.
 To make the order of topics more natural, the previous editions' Chapter 8 has been broken into two separate chapters, Bivariate Distributions and Multivariate Distributions. As a result, the section Transformations of Two Random Variables is now covered earlier, along with the material on bivariate distributions, and the convolution theorem has found a better home as an example of transformation methods. That theorem is now presented as motivation for introducing moment-generating functions, since it cannot be extended easily to many random variables.
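The hazard function of the new survival-analysis section is h(t) = f(t)/(1 - F(t)), the instantaneous failure rate among lifetimes still surviving at t. A quick sketch (my example, not the book's) confirms that an exponential lifetime has constant hazard, which is its memoryless property in survival-analysis form:

```python
import math

lam = 0.5        # failure rate of an Exponential(lam) lifetime

def f(t):                    # density
    return lam * math.exp(-lam * t)

def survival(t):             # 1 - F(t)
    return math.exp(-lam * t)

def hazard(t):               # h(t) = f(t) / (1 - F(t))
    return f(t) / survival(t)

rates = [hazard(t) for t in (0.1, 1.0, 5.0, 20.0)]   # constant, = lam
```

An aging component would instead show a hazard that grows with t.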
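Wald's equation, mentioned above for random sums, states that E(X_1 + ... + X_N) = E(N) E(X_1) when N is independent of the i.i.d. terms (more generally, a stopping time). A seeded Monte Carlo sketch, with distributions and numbers of my own choosing:

```python
import random

rng = random.Random(3)
p = 0.25        # N ~ Geometric(p), so E(N) = 1/p = 4
                # X_i ~ Uniform(0, 1), so E(X_1) = 1/2

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

def random_sum():
    return sum(rng.random() for _ in range(geometric(p)))

trials = 50_000
avg = sum(random_sum() for _ in range(trials)) / trials
# Wald's equation predicts E(S_N) = E(N) * E(X_1) = 4 * 0.5 = 2.
```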
Sample Syllabi
For a one-term course on probability, instructors have been able to omit many sections without difficulty. The book is designed for students with different levels of ability, and a variety of probability courses, applied and/or pure, can be taught using it. A typical one-semester course on probability would cover Chapters 1 and 2; Sections 3.1-3.5; Chapters 4, 5, and 6; Sections 7.1-7.4; Sections 8.1-8.3; Section 9.1; Sections 10.1-10.3; and Chapter 11.
A follow-up course on introductory stochastic processes, or on more advanced probability, would cover the remaining material in the book, with an emphasis on Sections 8.4, 9.2-9.3, and 10.4 and, especially, the entire Chapter 12.
A course on discrete probability would cover Sections 1.1-1.5; Chapters 2, 3, 4, and 5; the subsections Joint Probability Mass Functions, Independence of Discrete Random Variables, and Conditional Distributions: Discrete Case, from Chapter 8; the subsection Joint Probability Mass Functions, from Chapter 9; Section 9.3; selected discrete topics from Chapters 10 and 11; and Section 12.3.
Web Site
Reviews, errata, and other issues concerning this book are posted at
http://mars.wnec.edu/~sghahram/probabilitybooks.html
where I may also post new examples, exercises, and topics written for future editions.
Solutions Manual
I have written an Instructor's Solutions Manual that gives detailed solutions to virtually all of the 1224 exercises of the book. This manual is available, directly from Prentice Hall, only for those instructors who teach their courses from this book.
Table of Contents
Preface  xi  
1  Axioms of Probability  1 
1.1  Introduction  1 
1.2  Sample Space and Events  4 
1.3  Axioms of Probability  11 
1.4  Basic Theorems  18 
1.5  Continuity of Probability Function  27 
1.6  Probabilities 0 and 1  29 
1.7  Random Selection of Points from Intervals  31 
Review Problems  35  
2  Combinatorial Methods  38 
2.1  Introduction  38 
2.2  Counting Principle  38 
Number of Subsets of a Set  42  
Tree Diagrams  43  
2.3  Permutations  47 
2.4  Combinations  54 
2.5  Stirling's Formula  70 
Review Problems  72  
3  Conditional Probability and Independence  75 
3.1  Conditional Probability  75 
Reduction of Sample Space  79  
3.2  Law of Multiplication  85 
3.3  Law of Total Probability  89 
3.4  Bayes' Formula  99 
3.5  Independence  106 
Review Problems  126  
4  Distribution Functions and Discrete Random Variables  129 
4.1  Random Variables  129 
4.2  Distribution Functions  134 
4.3  Discrete Random Variables  143 
4.4  Expectations of Discrete Random Variables  150 
4.5  Variances and Moments of Discrete Random Variables  161 
Moments  167  
4.6  Standardized Random Variables  170 
Review Problems  171  
5  Special Discrete Distributions  174 
5.1  Bernoulli and Binomial Random Variables  174 
Expectations and Variances of Binomial Random Variables  180  
5.2  Poisson Random Variable  188 
Poisson as an Approximation to Binomial  188  
Poisson Process  194  
5.3  Other Discrete Random Variables  203 
Geometric Random Variable  203  
Negative Binomial Random Variable  205  
Hypergeometric Random Variable  207  
Review Problems  215  
6  Continuous Random Variables  218 
6.1  Probability Density Functions  218 
6.2  Density Function of a Function of a Random Variable  227 
6.3  Expectations and Variances  233 
Expectations of Continuous Random Variables  233  
Variances of Continuous Random Variables  239  
Review Problems  245  
7  Special Continuous Distributions  247 
7.1  Uniform Random Variable  247 
7.2  Normal Random Variable  254 
Correction for Continuity  257  
7.3  Exponential Random Variable  269 
7.4  Gamma Distribution  276 
7.5  Beta Distribution  281 
Review Problems  285  
8  Joint Distributions  287 
8.1  Bivariate Distributions  287 
Joint Probability Functions  287  
Joint Probability Density Functions  290  
8.2  Independent Random Variables  305 
8.3  Conditional Distributions  316 
8.4  Multivariate Distributions  329 
8.5  Order Statistics  345 
8.6  Multinomial Distributions  352 
8.7  Transformations of Two Random Variables  356 
Review Problems  362  
9  More Expectations and Variances  367 
9.1  Expected Values of Sums of Random Variables  367 
Pattern Appearance  376  
9.2  Covariance  383 
9.3  Correlation  392 
9.4  Conditioning on Random Variables  397 
9.5  Bivariate Normal Distribution  408 
Review Problems  414  
10  Sums of Independent Random Variables and Limit Theorems  416 
10.1  MomentGenerating Functions  416 
10.2  Sums of Independent Random Variables  427 
10.3  Markov and Chebyshev Inequalities  437 
10.4  Laws of Large Numbers  443 
10.5  Central Limit Theorem  452 
Review Problems  458  
11  Simulation  461 
11.1  Introduction  461 
11.2  Simulation of Combinatorial Problems  466 
11.3  Simulation of Conditional Probabilities  469 
11.4  Simulation of Random Variables  473 
11.5  Monte Carlo Method  481 
Appendix Tables  487  
Answers to Odd-Numbered Exercises  491  
Index  501 