Introduction to Probability and Statistics: Principles and Applications for Engineering and the Computing Sciences / Edition 4

ISBN-10: 007246836X
ISBN-13: 9780072468366
Pub. Date: 09/30/2002
Publisher: McGraw Hill LLC
Price (new): $352.75
Price (used): $158.42
    • Condition: Good
    Note: Access code and/or supplemental material are not guaranteed to be included with used textbook.

Overview

This well-respected text is designed for the first course in probability and statistics taken by students majoring in Engineering and the Computing Sciences. The prerequisite is one year of calculus. The text offers a balanced presentation of applications and theory. The authors take care to develop the theoretical foundations for the statistical methods presented at a level that is accessible to students with only a calculus background. They explore the practical implications of the formal results for problem solving, so students gain an understanding of the logic behind the techniques as well as practice in using them. The examples, exercises, and applications were chosen specifically for students in engineering and computer science and include opportunities for real data analysis.

Product Details

ISBN-13: 9780072468366
Publisher: McGraw Hill LLC
Publication date: 09/30/2002
Edition description: List
Pages: 816
Product dimensions: 6.60(w) x 9.50(h) x 1.40(d)

About the Author

J. Susan Milton is Professor Emeritus of Statistics at Radford University. Dr. Milton received the B.S. degree from Western Carolina University, the M.A. degree from the University of North Carolina at Chapel Hill, and the Ph.D. degree in Statistics from Virginia Polytechnic Institute and State University. She is a Danforth Associate and a recipient of the Radford University Foundation Award for Excellence in Teaching. Dr. Milton is the author of Statistical Methods in the Biological and Health Sciences, as well as Introduction to Statistics, Probability with the Essential Analysis, and A First Course in the Theory of Linear Statistical Models.

Jesse C. Arnold is a Professor of Statistics at Virginia Polytechnic Institute and State University. Dr. Arnold received the B.S. degree from Southeastern State University and the M.A. and Ph.D. degrees in Statistics from Florida State University. He served as head of the Statistics Department for ten years, is a Fellow of the American Statistical Association, and is an elected member of the International Statistical Institute. He has served as President of the International Biometric Society (Eastern North American Region) and Chairman of the Statistical Education Section of the American Statistical Association.

Table of Contents

1 Introduction to Probability and Counting
1.1 Interpreting Probabilities
1.2 Sample Spaces and Events
1.3 Permutations and Combinations

2 Some Probability Laws
2.1 Axioms of Probability
2.2 Conditional Probability
2.3 Independence and the Multiplication Rule
2.4 Bayes' Theorem

3 Discrete Distributions
3.1 Random Variables
3.2 Discrete Probability Densities
3.3 Expectation and Distribution Parameters
3.4 Geometric Distribution and the Moment Generating Function
3.5 Binomial Distribution
3.6 Negative Binomial Distribution
3.7 Hypergeometric Distribution
3.8 Poisson Distribution

4 Continuous Distributions
4.1 Continuous Densities
4.2 Expectation and Distribution Parameters
4.3 Gamma Distribution
4.4 Normal Distribution
4.5 Normal Probability Rule and Chebyshev's Inequality
4.6 Normal Approximation to the Binomial Distribution
4.7 Weibull Distribution and Reliability
4.8 Transformation of Variables
4.9 Simulating a Continuous Distribution

5 Joint Distributions
5.1 Joint Densities and Independence
5.2 Expectation and Covariance
5.3 Correlation
5.4 Conditional Densities and Regression
5.5 Transformation of Variables

6 Descriptive Statistics
6.1 Random Sampling
6.2 Picturing the Distribution
6.3 Sample Statistics
6.4 Boxplots

7 Estimation
7.1 Point Estimation
7.2 The Method of Moments and Maximum Likelihood
7.3 Functions of Random Variables--Distribution of X
7.4 Interval Estimation and the Central Limit Theorem

8 Inferences on the Mean and Variance of a Distribution
8.1 Interval Estimation of Variability
8.2 Estimating the Mean and the Student-t Distribution
8.3 Hypothesis Testing
8.4 Significance Testing
8.5 Hypothesis and Significance Tests on the Mean
8.6 Hypothesis Tests
8.7 Alternative Nonparametric Methods

9 Inferences on Proportions
9.1 Estimating Proportions
9.2 Testing Hypotheses on a Proportion
9.3 Comparing Two Proportions: Estimation
9.4 Comparing Two Proportions: Hypothesis Testing

10 Comparing Two Means and Two Variances
10.1 Point Estimation
10.2 Comparing Variances: The F Distribution
10.3 Comparing Means: Variances Equal (Pooled Test)
10.4 Comparing Means: Variances Unequal
10.5 Comparing Means: Paired Data
10.6 Alternative Nonparametric Methods
10.7 A Note on Technology

11 Simple Linear Regression and Correlation
11.1 Model and Parameter Estimation
11.2 Properties of Least-Squares Estimators
11.3 Confidence Interval Estimation and Hypothesis Testing
11.4 Repeated Measurements and Lack of Fit
11.5 Residual Analysis
11.6 Correlation

12 Multiple Linear Regression Models
12.1 Least-Squares Procedures for Model Fitting
12.2 A Matrix Approach to Least Squares
12.3 Properties of the Least-Squares Estimators
12.4 Interval Estimation
12.5 Testing Hypotheses about Model Parameters
12.6 Use of Indicator or "Dummy" Variables
12.7 Criteria for Variable Selection
12.8 Model Transformation and Concluding Remarks

13 Analysis of Variance
13.1 One-Way Classification Fixed-Effects Model
13.2 Comparing Variances
13.3 Pairwise Comparison
13.4 Testing Contrasts
13.5 Randomized Complete Block Design
13.6 Latin Squares
13.7 Random-Effects Models
13.8 Design Models in Matrix Form
13.9 Alternative Nonparametric Methods

14 Factorial Experiments
14.1 Two-Factor Analysis of Variance
14.2 Extension to Three Factors
14.3 Random and Mixed Model Factorial Experiments
14.4 2^k Factorial Experiments
14.5 2^k Factorial Experiments in an Incomplete Block Design
14.6 Fractional Factorial Experiments

15 Categorical Data
15.1 Multinomial Distribution
15.2 Chi-Squared Goodness of Fit Tests
15.3 Testing for Independence
15.4 Comparing Proportions

16 Statistical Quality Control
16.1 Properties of Control Charts
16.2 Shewhart Control Charts for Measurements
16.3 Shewhart Control Charts for Attributes
16.4 Tolerance Limits
16.5 Acceptance Sampling
16.6 Two-Stage Acceptance Sampling
16.7 Extensions in Quality Control
Appendix A Statistical Tables
Appendix B Answers to Selected Problems
Appendix C Selected Derivations