
Probability, Statistics, and Random Processes For Electrical Engineering / Edition 3
Product Details
| ISBN-13: | 9780131471221 |
| --- | --- |
| Publisher: | Pearson Education |
| Publication date: | 12/28/2007 |
| Series: | Alternative eText Formats Series |
| Format: | Paperback |
| Edition description: | New Edition |
| Pages: | 832 |
| Product dimensions: | 6.90(w) x 9.10(h) x 1.70(d) |
Preface
Probability and Random Processes for Electrical Engineering presents a carefully motivated, accessible, and interesting introduction to probability and random processes. It is designed to allow the instructor maximum flexibility in the selection of topics. In addition to the standard topics taught in introductory courses on probability, random variables, and random processes, the book includes sections on modeling, basic statistical techniques, computer simulation, reliability, and entropy, as well as concise but relatively complete introductions to Markov chains and queueing theory.
The complexity of the systems encountered in electrical and computer engineering calls for an understanding of probability concepts and a facility with probability tools on the part of an increasing number of B.S. degree graduates. The introductory course should therefore teach the student not only the basic theoretical concepts but also how to solve problems that arise in engineering practice. This course requires that the student develop problem-solving skills and understand how to make the transition from a real problem to a probability model for that problem.
Relevance to Engineering Practice
Motivating students is a major challenge in introductory probability courses. Instructors need to respond by showing students the relevance of probability theory to engineering practice. Chapter 1 addresses this challenge by discussing the role of probability models in engineering design. Practical applications from various areas of electrical and computer engineering are used to show how averages and relative frequencies provide the proper tools for handling the design of systems that involve randomness. These application areas are used in examples and problems throughout the text.
From Problems to Probability Models
The transition from real problems to probability models is shown in several ways. First, important concepts are usually developed by presenting real data or computer-simulated data. Second, sections on basic statistical techniques are integrated throughout the text. These sections demonstrate how statistical methods provide the link between theory and the real world. Finally, the significant random variables and random processes are developed using model-building arguments that range from simple to complex. For example, in Chapters 2 and 3, text discussion proceeds from coin tossing to Bernoulli trials. It then continues to the binomial and geometric distributions, and finally proceeds via limiting arguments to the Poisson, exponential, and Gaussian distributions.
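The limiting argument from the binomial to the Poisson distribution can be made concrete with a short numerical check. This sketch (not from the text itself) compares the binomial pmf for large n and small p against the Poisson pmf with the same mean λ = np:

```python
import math

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Many trials, each with small success probability; lambda = n*p = 5.
n, p = 1000, 0.005
for k in range(5):
    b = binomial_pmf(k, n, p)
    q = poisson_pmf(k, n * p)
    print(f"k={k}: binomial={b:.5f}  poisson={q:.5f}")
```

The two columns agree to three decimal places, which is the essence of the limiting argument: as n grows with np held fixed, the binomial pmf converges to the Poisson pmf.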
Examples and Problems
Numerous examples in every section are used to demonstrate analytical and problem-solving techniques, develop concepts using simplified cases, and illustrate applications. The text includes over 700 problems, identified by section to help the instructor select homework problems. Additional sets of problems requiring cumulative knowledge are provided at the end of each chapter. Answers to selected problems are included at the end of the text. A Student Solutions Manual accompanies this text to help develop problem-solving skills: a sampling of 25% of the problems, carefully worked out, has been selected to help students understand the concepts presented in the text. An Instructor's Solutions Manual with complete solutions is also available.
Computer Methods
The development of an intuition for randomness can be aided by the use of computer exercises. Appendix C contains computer programs for generating several well-known random variables. The resulting data from computer-generated random numbers and variables can be analyzed using the statistical methods introduced in the text.
Sections on computer methods have been integrated into the text rather than isolated in a separate chapter because performing the computer exercises during lessons helps students to learn basic probability concepts. It should be noted that the computer methods introduced in Sections 2.7, 3.11, and 4.10 do not necessarily require entirely new lectures. The transformation method in Section 3.11 can be incorporated into the discussion on functions of a random variable. Similarly, the material in Section 4.10 can be incorporated into the discussion on transformations of random vectors.
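As an illustration of the transformation method mentioned above (the programs in Appendix C are not reproduced here, so this is a hedged sketch): if U is uniform on (0,1) and F is a cdf with inverse F⁻¹, then X = F⁻¹(U) has cdf F. For the exponential distribution, F(x) = 1 − e^(−λx), so F⁻¹(u) = −ln(1 − u)/λ:

```python
import math
import random

def exponential_sample(lam, rng=random.random):
    """Generate one Exponential(lam) variate by inverting the cdf."""
    u = rng()                       # U uniform on (0, 1)
    return -math.log(1.0 - u) / lam # X = F^{-1}(U)

# The sample mean of many variates should approach 1/lam.
random.seed(1)
samples = [exponential_sample(2.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(f"sample mean ~ {mean:.3f} (theoretical 1/lam = 0.5)")
```

Output of this kind can then be analyzed with the statistical methods introduced in the text, e.g., by testing the fit of the empirical distribution to the exponential model.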
Random Variables and Continuous-Time Random Processes
Discrete-time random processes provide a crucial "bridge" in going from random variables to continuous-time random processes. Care is taken in the first five chapters to lay the proper groundwork for this transition. Thus sequences of dependent experiments are discussed in Chapter 2 as a preview of Markov chains. In Chapter 4, emphasis is placed on how a joint distribution generates a consistent family of marginal distributions. Chapter 5 introduces sequences of independent identically distributed (iid) random variables. Chapter 6 considers the sum of an iid sequence to produce important examples of random processes. Throughout Chapters 6 and 7, a concise development of the concepts is achieved by developing discrete-time and continuous-time results in parallel.
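The "sum process" idea described above can be sketched in a few lines (a hypothetical illustration, not code from the text): summing an iid ±1 sequence yields the random-walk process, a discrete-time bridge toward continuous-time processes such as the Wiener process.

```python
import random

def random_walk(n_steps, p=0.5, seed=0):
    """Sample path S_0, ..., S_n of a +/-1 random walk with P(+1) = p."""
    rng = random.Random(seed)
    path = [0]
    for _ in range(n_steps):
        step = 1 if rng.random() < p else -1
        path.append(path[-1] + step)  # S_n = S_{n-1} + X_n, X_n iid
    return path

path = random_walk(1000)
print("final position:", path[-1])
```

Scaling the step size and time increment of this walk appropriately and letting the increment shrink is the standard route to the Wiener process developed in Chapter 6.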
Markov Chains and Queueing Theory
Markov chains and queueing theory have become essential tools in communication network and computer system modeling. In the introductory course on probability only a few changes need to be made to accommodate these new requirements. The treatment of conditional probability and conditional expectation needs to be modified, and the Poisson and gamma random variables need to be given greater prominence. In an introductory course on random processes a new balance needs to be struck between the traditional discussion of wide-sense stationary processes and linear systems and the discussion of Markov chains and queueing theory. The "optimum" balance between these two needs will surely vary from instructor to instructor, so the text includes more material than can be covered in one semester in order to give the instructor leeway to strike a balance.
Suggested Syllabi
The first five chapters form the basis of a one-semester introduction to probability. In addition to the optional sections on computer methods, these chapters also include optional sections on combinatorics, reliability, confidence intervals, and basic results from renewal theory. In a one-semester course, it is possible to provide an introduction to random processes by omitting all the starred sections in the first five chapters and covering instead the first part of Chapter 6. The material in the first five chapters has been used at the University of Toronto in an introductory junior-level required course for electrical engineers.
A one-semester course on random processes with Markov chains can be taught using Chapters 6 through 8. A quick introduction to Markov chains and queueing theory is possible by covering only the first three sections of Chapter 8 and then proceeding to the first few sections in Chapter 9. A one-semester introduction to queueing theory can be taught from Chapters 6, 8, and 9.
Changes in the Second Edition
The only changes in the second edition that affect the first half of the book, and hence introductory courses on probability, involve the addition of more examples and problems. In keeping with our goal of giving the instructor flexibility in the selection of topics, we have expanded the optional section on reliability (Section 3.10) and introduced a new optional section on entropy (Section 3.12). Care has been taken not just to define the various quantities associated with entropy but also to develop an understanding of the interpretation of entropy as a measure of uncertainty and information.
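The interpretation of entropy as a measure of uncertainty can be sketched numerically (a minimal illustration, not material from Section 3.12 itself), using H(X) = −Σ pₖ log₂ pₖ:

```python
import math

def entropy_bits(pmf):
    """Shannon entropy in bits of a discrete pmf (zero-probability terms skipped)."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

print(entropy_bits([0.5, 0.5]))   # fair coin: maximal uncertainty, 1 bit
print(entropy_bits([0.9, 0.1]))   # biased coin: less uncertain, under 1 bit
print(entropy_bits([0.25] * 4))   # uniform over 4 outcomes: 2 bits
```

The fair coin attains the maximum entropy over two outcomes, and the biased coin falls below it, which is the sense in which entropy quantifies uncertainty and information.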
The most significant change to the second edition is the addition of material to make the text more suitable for a course that provides a more substantial introduction to random processes:
- In Chapter 4, a section on the joint characteristic function has been added and the discussion of jointly Gaussian random variables has been expanded.
- Section 5.5 discusses the various types of convergence of sequences of random variables. A carefully selected set of examples is presented to demonstrate the differences in the various types of convergence.
- Section 6.6 uses these results to develop the notions of mean square continuity, derivatives, and integrals of random processes. This section presents the relations between the Wiener process and white Gaussian noise. It also develops the Ornstein-Uhlenbeck process as the transient solution to a first-order linear system driven by noise.
- Section 6.8 uses Fourier series to introduce the notion of representing a random process by a linear combination of deterministic functions weighted by random variables. It then proceeds to develop the Karhunen-Loeve expansion for vector random variables and then random processes.
- Section 7.4 now contains a separate section on prediction and the Levinson algorithm.
- Finally, Section 7.5 presents a discussion of the Kalman filter to complement the Wiener filter introduced in Section 7.4.
Acknowledgments
I would like to acknowledge the help of several individuals in the preparation of the second edition. First and foremost, I must thank the users of the first edition, both professors and students, who provided many of the suggestions incorporated into this edition. I would also like to thank my graduate students for providing feedback on parts of the manuscript, especially Masoud Khansari and Sameh Sowelam, who took a special interest. I also thank Indra Widjaja for preparing the programs to generate random variables. My colleagues, Professors Frank Kschischang, Pas Pasupathy, and Dimitrios Hatzinakos, provided useful comments and suggestions.
The following reviewers aided me with their suggestions and comments in this second edition: Steven A. Tretter, University of Maryland-College Park; James Bucklew, University of Wisconsin-Madison; Subhash Kak, Louisiana State University-Baton Rouge; Deniz Sandhu; Ronald Iltis, University of California-Santa Barbara; Dr. Benrat Devarajan, University of Texas-Arlington; Shih-Chun Chang, George Mason University.
Most of all I would like to thank my wife, Karen Carlyle, for her love and support.
Table of Contents
1. Probability Models in Electrical and Computer Engineering.
Mathematical models as tools in analysis and design. Deterministic models. Probability models.
Statistical regularity. Properties of relative frequency. The axiomatic approach to a theory of probability. Building a probability model.
A detailed example: a packet voice transmission system. Other examples.
Communication over unreliable channels. Processing of random signals. Resource sharing systems. Reliability of systems.
Overview of book. Summary. Problems.
2. Basic Concepts of Probability Theory.
Specifying random experiments.
The sample space. Events. Set operations.
The axioms of probability.
Discrete sample spaces. Continuous sample spaces.
Computing probabilities using counting methods.
Sampling with replacement and with ordering. Sampling without replacement and with ordering. Permutations of n distinct objects. Sampling without replacement and without ordering. Sampling with replacement and without ordering.
Conditional probability.
Bayes' Rule.
Independence of events. Sequential experiments.
Sequences of independent experiments. The binomial probability law. The multinomial probability law. The geometric probability law. Sequences of dependent experiments.
A computer method for synthesizing randomness: random number generators. Summary. Problems.
3. Random Variables.
The notion of a random variable. The cumulative distribution function.
The three types of random variables.
The probability density function.
Conditional cdf's and pdf's.
Some important random variables.
Discrete random variables. Continuous random variables.
Functions of a random variable. The expected value of random variables.
The expected value of X. The expected value of Y = g(X). Variance of X.
The Markov and Chebyshev inequalities. Testing the fit of a distribution to data. Transform methods.
The characteristic function. The probability generating function. The Laplace transform of the pdf.
Basic reliability calculations.
The failure rate function. Reliability of systems.
Computer methods for generating random variables.
The transformation method. The rejection method. Generation of functions of a random variable. Generating mixtures of random variables.
Entropy.
The entropy of a random variable. Entropy as a measure of information. The method of maximum entropy.
Summary. Problems.
4. Multiple Random Variables.
Vector random variables.
Events and probabilities. Independence.
Pairs of random variables.
Pairs of discrete random variables. The joint cdf of X and Y. The joint pdf of two jointly continuous random variables. Random variables that differ in type.
Independence of two random variables. Conditional probability and conditional expectation.
Conditional probability. Conditional expectation.
Multiple random variables.
Joint distributions. Independence.
Functions of several random variables.
One function of several random variables. Transformation of random vectors. pdf of linear transformations. pdf of general transformations.
Expected value of functions of random variables.
The correlation and covariance of two random variables. Joint characteristic function.
Jointly Gaussian random variables.
n jointly Gaussian random variables. Linear transformation of Gaussian random variables. Joint characteristic function of Gaussian random variables.
Mean square estimation.
Linear prediction.
Generating correlated vector random variables.
Generating vectors of random variables with specified covariances. Generating vectors of jointly Gaussian random variables.
Summary. Problems.
5. Sums of Random Variables and Long-Term Averages.
Sums of random variables.
Mean and variance of sums of random variables. pdf of sums of independent random variables. Sum of a random number of random variables.
The sample mean and the laws of large numbers. The central limit theorem.
Gaussian approximation for binomial probabilities. Proof of the central limit theorem.
Confidence intervals.
Case 1: Xj's Gaussian; unknown mean and known variance. Case 2: Xj's Gaussian; mean and variance unknown. Case 3: Xj's Non-Gaussian; mean and variance unknown.
Convergence of sequences of random variables. Long-term arrival rates and associated averages. Long-term time averages. A computer method for evaluating the distribution of a random variable using the discrete Fourier transform. Discrete random variables. Continuous random variables. Summary. Problems. Appendix: subroutine FFT(A,M,N).
6. Random Processes.
Definition of a random process. Specifying a random process.
Joint distributions of time samples. The mean, autocorrelation, and autocovariance functions. Gaussian random processes. Multiple random processes.
Examples of discrete-time random processes.
iid random processes. Sum processes; the binomial counting and random walk processes.
Examples of continuous-time random processes.
Poisson process. Random telegraph signal and other processes derived from the Poisson Process. Wiener process and Brownian motion.
Stationary random processes.
Wide-sense stationary random processes. Wide-sense stationary Gaussian random processes. Cyclostationary random processes.
Continuity, derivative, and integrals of random processes.
Mean square continuity. Mean square derivatives. Mean square integrals. Response of a linear system to random input.
Time averages of random processes and ergodic theorems. Fourier series and Karhunen-Loeve expansion.
Karhunen-Loeve expansion.
Summary. Problems.
7. Analysis and Processing of Random Signals.
Power spectral density.
Continuous-time random processes. Discrete-time random processes. Power spectral density as a time average.
Response of linear systems to random signals.
Continuous-time systems. Discrete-time systems.
Amplitude modulation by random signals. Optimum linear systems.
The orthogonality condition. Prediction. Estimation using the entire realization of the observed process. Estimation using causal filters.
The Kalman filter. Estimating the power spectral density.
Variance of periodogram estimate. Smoothing of periodogram estimate.
Summary. Problems.
8. Markov Chains.
Markov processes. Discrete-time Markov chains.
The n-step transition probabilities. The state probabilities. Steady state probabilities.
Continuous-time Markov chains.
State occupancy times. Transition rates and time-dependent state probabilities. Steady state probabilities and global balance equations.
Classes of states, recurrence properties, and limiting probabilities.
Classes of states. Recurrence properties. Limiting probabilities. Limiting probabilities for continuous-time Markov chains.
Time-reversed Markov chains.
Time-reversible Markov chains. Time-reversible continuous-time Markov chains.
Summary. Problems.
9. Introduction to Queueing Theory.
The elements of a queueing system. Little's formula. The M/M/1 queue.
Distribution of number in the system. Delay distribution in the M/M/1 system and arriving customer's distribution. The M/M/1 system with finite capacity.
Multi-server systems: M/M/c, M/M/c/c, and M/M/∞.
Distribution of number in the M/M/c system. Waiting time distribution for M/M/c. The M/M/c/c queueing system. The M/M/∞ queueing system.
Finite-source queueing systems.
Arriving customer's distribution.
M/G/1 queueing systems.
The residual service time. Mean delay in M/G/1 systems. Mean delay in M/G/1 systems with priority service discipline.
M/G/1 analysis using embedded Markov chains.
The embedded Markov chain. The number of customers in an M/G/1 system. Delay and waiting time distribution in an M/G/1 system.
Burke's theorem: Departures from M/M/c systems. Proof of Burke's theorem using time reversibility. Networks of queues: Jackson's theorem.
Open networks of queues. Proof of Jackson's theorem. Closed networks of queues. Mean value analysis. Proof of the arrival theorem.
Summary. Problems.
Appendix A. Mathematical Tables.
Appendix B. Tables of Fourier Transforms.
Appendix C. Computer Programs for Generating Random Variables.
Answers to Selected Problems.
Index.