Modern Real Estate Practice In Illinois 6th Edition / Edition 6

by Fillmore W. Galaty, Robert C. Kyle, Laurie Macdougal, Wellington J. Allaway

ISBN-10: 1427768331

ISBN-13: 9781427768339

Pub. Date: 06/11/2008

Publisher: Kaplan Publishing


Clearly balancing theory with applications, this book describes both the conventional and less common uses of linear regression in the practical context of today's mathematical and scientific research. Beginning with a general introduction to regression modeling, including typical applications, the book then outlines a host of technical tools that form the linear regression analytical arsenal, including: basic inference procedures and introductory aspects of model adequacy checking; how transformations and weighted least squares can be used to resolve problems of model inadequacy; how to deal with influential observations; and polynomial regression models and their variations. The book also includes material on regression models with autocorrelated errors, bootstrapping regression estimates, classification and regression trees, and regression model validation.
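The least-squares machinery this description refers to is compact enough to sketch directly. The following minimal example is not from the book — the data are invented for illustration — but it fits a simple linear regression and estimates the error variance the way an introductory regression chapter typically develops it (slope from Sxy/Sxx, intercept from the means, σ² from the residual sum of squares on n − 2 degrees of freedom):

```python
import numpy as np

# Hypothetical data, invented for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least-squares estimates: beta1 = Sxy / Sxx, beta0 = ybar - beta1 * xbar
x_bar, y_bar = x.mean(), y.mean()
Sxx = np.sum((x - x_bar) ** 2)
Sxy = np.sum((x - x_bar) * (y - y_bar))
beta1 = Sxy / Sxx
beta0 = y_bar - beta1 * x_bar

# Residuals and the unbiased estimate of sigma^2 (n - 2 degrees of freedom)
resid = y - (beta0 + beta1 * x)
sigma2_hat = np.sum(resid ** 2) / (len(x) - 2)

print(f"intercept={beta0:.3f}, slope={beta1:.3f}, sigma2={sigma2_hat:.4f}")
```

Topics later in the outline (weighted least squares, influence diagnostics, bootstrapping) build on exactly these fitted quantities.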

Product Details

Publisher: Kaplan Publishing
Publication date: 06/11/2008
Edition description: Older Edition
Product dimensions: 8.50(w) x 10.90(h) x 1.40(d)

Table of Contents

1.1 Regression and Model Building
1.2 Data Collection
1.3 Uses of Regression
1.4 Role of the Computer
2. Simple Linear Regression
2.1 Simple Linear Regression Model
2.2 Least-Squares Estimation of the Parameters
2.2.1 Estimation of β₀ and β₁
2.2.2 Properties of the Least-Squares Estimators and the Fitted Regression Model
2.2.3 Estimation of σ²
2.2.4 An Alternate Form of the Model
2.3 Hypothesis Testing on the Slope and Intercept
2.3.1 Use of t-Tests
2.3.2 Testing Significance of Regression
2.3.3 The Analysis of Variance
2.4 Interval Estimation in Simple Linear Regression
2.4.1 Confidence Intervals on β₀, β₁, and σ²
2.4.2 Interval Estimation of the Mean Response
2.5 Prediction of New Observations
2.6 Coefficient of Determination
2.7 Some Considerations in the Use of Regression
2.8 Regression Through the Origin
2.9 Estimation by Maximum Likelihood
2.10 Case Where the Regressor x is Random
2.10.1 x and y Jointly Distributed
2.10.2 x and y Jointly Normally Distributed: The Correlation Model
3. Multiple Linear Regression
3.1 Multiple Regression Models
3.2 Estimation of the Model Parameters
3.2.1 Least-Squares Estimation of the Regression Coefficients
3.2.2 A Geometrical Interpretation of Least Squares
3.2.3 Properties of the Least-Squares Estimators
3.2.4 Estimation of σ²
3.2.5 Inadequacy of Scatter Diagrams in Multiple Regression
3.2.6 Maximum-Likelihood Estimation
3.3 Hypothesis Testing in Multiple Linear Regression
3.3.1 Test for Significance of Regression
3.3.2 Tests on Individual Regression Coefficients
3.3.3 Special Case of Orthogonal Columns in X
3.3.4 Testing the General Linear Hypothesis
3.4 Confidence Intervals in Multiple Regression
3.4.1 Confidence Intervals on the Regression Coefficients
3.4.2 Confidence Interval Estimation of the Mean Response
3.4.3 Simultaneous Confidence Intervals on Regression Coefficients
3.5 Prediction of New Observations
3.6 Hidden Extrapolation in Multiple Regression
3.7 Standardized Regression Coefficients
3.9 Why Do Regression Coefficients Have the Wrong Sign?
4. Model Adequacy Checking
4.2 Residual Analysis
4.2.1 Definition of Residuals
4.2.2 Methods for Scaling Residuals
4.2.3 Residual Plots
4.2.4 Partial Regression and Partial Residual Plots
4.2.5 Other Residual Plotting and Analysis Methods
4.3 The PRESS Statistic
4.4 Detection and Treatment of Outliers
4.5 Lack of Fit of the Regression Model
4.5.1 A Formal Test for Lack of Fit
4.5.2 Estimation of Pure Error from Near-Neighbors
5. Transformations and Weighting to Correct Model Inadequacies
5.2 Variance-Stabilizing Transformations
5.3 Transformations to Linearize the Model
5.4 Analytical Methods for Selecting a Transformation
5.4.1 Transformations on y: The Box-Cox Method
5.4.2 Transformations on the Regressor Variables
5.5 Generalized and Weighted Least Squares
5.5.1 Generalized Least Squares
5.5.2 Weighted Least Squares
5.5.3 Some Practical Issues
6. Diagnostics for Leverage and Influence
6.1 Importance of Detecting Influential Observations
6.3 Measures of Influence: Cook's D
6.4 Measures of Influence: DFFITS and DFBETAS
6.5 A Measure of Model Performance
6.6 Detecting Groups of Influential Observations
6.7 Treatment of Influential Observations
7. Polynomial Regression Models
7.2 Polynomial Models in One Variable
7.2.1 Basic Principles
7.2.2 Piecewise Polynomial Fitting (Splines)
7.2.3 Polynomial and Trigonometric Terms
7.3 Nonparametric Regression
7.3.1 Kernel Regression
7.3.2 Locally Weighted Regression (Loess)
7.3.3 Final Cautions
7.4 Polynomial Models in Two or More Variables
7.5 Orthogonal Polynomials
8. Indicator Variables
8.1 The General Concept of Indicator Variables
8.2 Comments on the Use of Indicator Variables
8.2.1 Indicator Variables versus Regression on Allocated Codes
8.2.2 Indicator Variables as a Substitute for a Quantitative Regressor
8.3 Regression Approach to Analysis of Variance
9. Variable Selection and Model Building
9.1.1 The Model-Building Problem
9.1.2 Consequences of Model Misspecification
9.1.3 Criteria for Evaluating Subset Regression Models
9.2 Computational Techniques for Variable Selection
9.2.1 All Possible Regressions
9.2.2 Stepwise Regression Methods
9.3 Some Final Recommendations for Practice
10.2 Sources of Multicollinearity
10.3 Effects of Multicollinearity
10.4 Multicollinearity Diagnostics
10.4.1 Examination of the Correlation Matrix
10.4.2 Variance Inflation Factors
10.4.3 Eigensystem Analysis of X'X
10.4.4 Other Diagnostics
10.5 Methods for Dealing with Multicollinearity
10.5.1 Collecting Additional Data
10.5.2 Model Respecification
10.5.3 Ridge Regression
10.5.4 Other Methods
10.5.5 Comparison and Evaluation of Biased Estimators
11. Robust Regression
11.1 The Need for Robust Regression
11.3 Properties of Robust Estimators
11.3.1 Breakdown Point
11.4 Survey of Other Robust Regression Estimators
11.4.1 High-Breakdown-Point Estimators
11.4.2 Bounded Influence Estimators
11.4.3 Other Procedures
11.4.4 Computing Robust Regression Estimators
12. Introduction to Nonlinear Regression
12.1 Linear and Nonlinear Regression Models
12.1.1 Linear Regression Models
12.1.2 Nonlinear Regression Models
12.2 Nonlinear Least Squares
12.3 Transformation to a Linear Model
12.4 Parameter Estimation in a Nonlinear System
12.4.2 Other Parameter Estimation Methods
12.4.3 Starting Values
12.4.4 Computer Programs
12.5 Statistical Inference in Nonlinear Regression
12.6 Examples of Nonlinear Regression Models
13. Generalized Linear Models
13.2 Logistic Regression Models
13.2.1 Models with a Binary Response Variable
13.2.2 Estimating the Parameters in a Logistic Regression Model
13.2.3 Interpretation of the Parameters in a Logistic Regression Model
13.2.4 Hypothesis Tests on Model Parameters
13.3 Poisson Regression
13.4 The Generalized Linear Model
13.4.1 Link Functions and Linear Predictors
13.4.2 Parameter Estimation and Inference in the GLM
13.4.3 Prediction and Estimation with the GLM
13.4.4 Residual Analysis in the GLM
14. Other Topics in the Use of Regression Analysis
14.1 Regression Models with Autocorrelated Errors
14.1.1 Source and Effects of Autocorrelation
14.1.2 Detecting the Presence of Autocorrelation
14.1.3 Parameter Estimation Methods
14.2 Effect of Measurement Errors in the Regressors
14.2.1 Simple Linear Regression
14.2.2 The Berkson Model
14.3 Inverse Estimation: The Calibration Problem
14.4 Bootstrapping in Regression
14.4.1 Bootstrap Sampling in Regression
14.4.2 Bootstrap Confidence Intervals
14.5 Classification and Regression Trees (CART)
14.6 Neural Networks
14.7 Designed Experiments for Regression
15. Validation of Regression Models
15.2 Validation Techniques
15.2.1 Analysis of Model Coefficients and Predicted Values
15.2.2 Collecting Fresh Data: Confirmation Runs
15.2.3 Data Splitting
15.3 Data from Planned Experiments
Appendix A. Statistical Tables
Appendix B. Data Sets for Exercises
Appendix C. Supplemental Technical Material
C.1 Background on Basic Test Statistics
C.2 Background from the Theory of Linear Models
C.3 Important Results on SS_R and SS_Res
C.4 The Gauss-Markov Theorem, Var(ε) = σ²I
C.5 Computational Aspects of Multiple Regression
C.6 A Result on the Inverse of a Matrix
C.7 Development of the PRESS Statistic
C.8 Development of S²₍ᵢ₎
C.9 An Outlier Test Based on R-Student
C.10 The Gauss-Markov Theorem, Var(ε) = V
C.11 The Bias in MS_Res When the Model Is Underspecified
C.12 Computation of Influence Diagnostics
C.13 Generalized Linear Models
