|
|
|
xiii | (4) |
|
|
|
xvii | |
|
0 Basic Prerequisite Knowledge |
|
|
1 | (14) |
|
0.1 Distributions: Normal, t, and F |
|
|
1 | (3) |
|
0.2 Confidence Intervals (or Bands) and t-Tests |
|
|
4 | (2) |
|
0.3 Elements of Matrix Algebra |
|
|
6 | (9) |
|
1 Fitting a Straight Line by Least Squares |
|
|
15 | (32) |
|
1.0 Introduction: The Need for Statistical Analysis |
|
|
15 | (3) |
|
1.1 Straight Line Relationship Between Two Variables |
|
|
18 | (2) |
|
1.2 Linear Regression: Fitting a Straight Line by Least Squares |
|
|
20 | (8) |
|
1.3 The Analysis of Variance |
|
|
28 | (6) |
|
1.4 Confidence Intervals and Tests for β₀ and β₁ |
|
|
34 | (4) |
|
1.5 F-Test for Significance of Regression |
|
|
38 | (2) |
|
1.6 The Correlation Between X and Y |
|
|
40 | (4) |
|
1.7 Summary of the Straight Line Fit Computations |
|
|
44 | (1) |
|
|
|
45 | (1) |
|
Appendix 1A Steam Plant Data |
|
|
46 | (1) |
|
Exercises are in "Exercises for Chapters 1-3" |
|
|
96 | |
|
2 Checking the Straight Line Fit |
|
|
47 | (32) |
|
2.1 Lack of Fit and Pure Error |
|
|
47 | (9) |
|
2.2 Testing Homogeneity of Pure Error |
|
|
56 | (3) |
|
2.3 Examining Residuals: The Basic Plots |
|
|
59 | (2) |
|
2.4 Non-normality Checks on Residuals |
|
|
61 | (1) |
|
2.5 Checks for Time Effects, Nonconstant Variance, Need for Transformation, and Curvature |
|
|
62 | (5) |
|
2.6 Other Residuals Plots |
|
|
67 | (2) |
|
|
|
69 | (1) |
|
2.8 Reference Books for Analysis of Residuals |
|
|
70 | (1) |
|
|
|
70 | (6) |
|
Appendix 2B MINITAB Instructions |
|
|
76 | (3) |
|
Exercises are in "Exercises for Chapters 1-3" |
|
|
96 | |
|
3 Fitting Straight Lines: Special Topics |
|
|
79 | (36) |
|
3.0 Summary and Preliminaries |
|
|
79 | (1) |
|
|
|
80 | (3) |
|
3.2 Inverse Regression (Straight Line Case) |
|
|
83 | (3) |
|
3.3 Some Practical Design of Experiment Implications of Regression |
|
|
86 | (3) |
|
3.4 Straight Line Regression When Both Variables Are Subject to Error |
|
|
89 | (7) |
|
Exercises for Chapters 1-3 |
|
|
96 | (19) |
|
4 Regression in Matrix Terms: Straight Line Case |
|
|
115 | (20) |
|
4.1 Fitting a Straight Line in Matrix Terms |
|
|
115 | (10) |
|
4.2 Singularity: What Happens in Regression to Make X'X Singular? An Example |
|
|
125 | (2) |
|
4.3 The Analysis of Variance in Matrix Terms |
|
|
127 | (1) |
|
4.4 The Variances and Covariance of b₀ and b₁ from the Matrix Calculation |
|
|
128 | (2) |
|
4.5 Variance of Ŷ Using the Matrix Development |
|
|
130 | (1) |
|
4.6 Summary of Matrix Approach to Fitting a Straight Line (Nonsingular Case) |
|
|
130 | (1) |
|
4.7 The General Regression Situation |
|
|
131 | (1) |
|
|
|
132 | (3) |
|
5 The General Regression Situation |
|
|
135 | (14) |
|
5.1 General Linear Regression |
|
|
135 | (2) |
|
5.2 Least Squares Properties |
|
|
137 | (3) |
|
5.3 Least Squares Properties When ε ~ N(0, Iσ²) |
|
|
140 | (2) |
|
5.4 Confidence Intervals Versus Regions |
|
|
142 | (1) |
|
5.5 More on Confidence Intervals Versus Regions |
|
|
143 | (4) |
|
Appendix 5A Selected Useful Matrix Results |
|
|
147 | (2) |
|
Exercises are in "Exercises for Chapters 5 and 6" |
|
|
169 | |
|
6 Extra Sums of Squares and Tests for Several Parameters Being Zero |
|
|
149 | (30) |
|
6.1 The "Extra Sum of Squares" Principle |
|
|
149 | (5) |
|
6.2 Two Predictor Variables: Example |
|
|
154 | (8) |
|
6.3 Sum of Squares of a Set of Linear Functions of Y's |
|
|
162 | (3) |
|
Appendix 6A Orthogonal Columns in the X Matrix |
|
|
165 | (2) |
|
Appendix 6B Two Predictors: Sequential Sums of Squares |
|
|
167 | (2) |
|
Exercises for Chapters 5 and 6 |
|
|
169 | (10) |
|
7 Serial Correlation in the Residuals and the Durbin-Watson Test |
|
|
179 | (26) |
|
7.1 Serial Correlation in Residuals |
|
|
179 | (2) |
|
7.2 The Durbin-Watson Test for a Certain Type of Serial Correlation |
|
|
181 | (11) |
|
7.3 Examining Runs in the Time Sequence Plot of Residuals: Runs Test |
|
|
192 | (6) |
|
|
|
198 | (7) |
|
8 More on Checking Fitted Models |
|
|
205 | (12) |
|
8.1 The Hat Matrix H and the Various Types of Residuals |
|
|
205 | (4) |
|
8.2 Added Variable Plot and Partial Residuals |
|
|
209 | (1) |
|
8.3 Detection of Influential Observations: Cook's Statistics |
|
|
210 | (4) |
|
8.4 Other Statistics Measuring Influence |
|
|
214 | (1) |
|
8.5 Reference Books for Analysis of Residuals |
|
|
214 | (1) |
|
|
|
215 | (2) |
|
9 Multiple Regression: Special Topics |
|
|
217 | (18) |
|
9.1 Testing a General Linear Hypothesis |
|
|
217 | (4) |
|
9.2 Generalized Least Squares and Weighted Least Squares |
|
|
221 | (3) |
|
9.3 An Example of Weighted Least Squares |
|
|
224 | (2) |
|
9.4 A Numerical Example of Weighted Least Squares |
|
|
226 | (3) |
|
9.5 Restricted Least Squares |
|
|
229 | (1) |
|
9.6 Inverse Regression (Multiple Predictor Case) |
|
|
229 | (2) |
|
9.7 Planar Regression When All the Variables Are Subject to Error |
|
|
231 | (1) |
|
Appendix 9A Lagrange's Undetermined Multipliers |
|
|
231 | (2) |
|
|
|
233 | (2) |
|
10 Bias in Regression Estimates, and Expected Values of Mean Squares and Sums of Squares |
|
|
235 | (8) |
|
10.1 Bias in Regression Estimates |
|
|
235 | (3) |
|
10.2 The Effect of Bias on the Least Squares Analysis of Variance |
|
|
238 | (1) |
|
10.3 Finding the Expected Values of Mean Squares |
|
|
239 | (1) |
|
10.4 Expected Value of Extra Sum of Squares |
|
|
240 | (1) |
|
|
|
241 | (2) |
|
11 On Worthwhile Regressions, Big F's, and R² |
|
|
243 | (8) |
|
11.1 Is My Regression a Useful One? |
|
|
243 | (2) |
|
11.2 A Conversation About R² |
|
|
245 | (2) |
|
Appendix 11A How Significant Should My Regression Be? |
|
|
247 | (3) |
|
|
|
250 | (1) |
|
12 Models Containing Functions of the Predictors, Including Polynomial Models |
|
|
251 | (26) |
|
12.1 More Complicated Model Functions |
|
|
251 | (3) |
|
12.2 Worked Examples of Second-Order Surface Fitting for k = 3 and k = 2 Predictor Variables |
|
|
254 | (12) |
|
12.3 Retaining Terms in Polynomial Models |
|
|
266 | (6) |
|
|
|
272 | (5) |
|
13 Transformation of the Response Variable |
|
|
277 | (22) |
|
13.1 Introduction and Preliminary Remarks |
|
|
277 | (3) |
|
13.2 Power Family of Transformations on the Response: Box-Cox Method |
|
|
280 | (6) |
|
13.3 A Second Method for Estimating λ |
|
|
286 | (3) |
|
13.4 Response Transformations: Other Interesting and Sometimes Useful Plots |
|
|
289 | (1) |
|
13.5 Other Types of Response Transformations |
|
|
290 | (1) |
|
13.6 Response Transformations Chosen to Stabilize Variance |
|
|
291 | (3) |
|
|
|
294 | (5) |

14 Dummy Variables


299 | (28) |
|
14.1 Dummy Variables to Separate Blocks of Data with Different Intercepts, Same Model |
|
|
299 | (8) |
|
14.2 Interaction Terms Involving Dummy Variables |
|
|
307 | (4) |
|
14.3 Dummy Variables for Segmented Models |
|
|
311 | (6) |
|
|
|
317 | (10) |
|
15 Selecting the "Best" Regression Equation |
|
|
327 | (42) |
|
|
|
327 | (2) |
|
15.1 All Possible Regressions and "Best Subset" Regression |
|
|
329 | (6) |
|
|
|
335 | (4) |
|
15.3 Backward Elimination |
|
|
339 | (3) |
|
15.4 Significance Levels for Selection Procedures |
|
|
342 | (1) |
|
15.5 Variations and Summary |
|
|
343 | (2) |
|
15.6 Selection Procedures Applied to the Steam Data |
|
|
345 | (3) |
|
Appendix 15A Hald Data, Correlation Matrix, and All 15 Possible Regressions |
|
|
348 | (7) |
|
|
|
355 | (14) |
|
16 Ill-Conditioning in Regression Data |
|
|
369 | (18) |
|
|
|
369 | (2) |
|
16.2 Centering Regression Data |
|
|
371 | (2) |
|
16.3 Centering and Scaling Regression Data |
|
|
373 | (2) |
|
16.4 Measuring Multicollinearity |
|
|
375 | (1) |
|
16.5 Belsley's Suggestion for Detecting Multicollinearity |
|
|
376 | (6) |
|
Appendix 16A Transforming X Matrices to Obtain Orthogonal Columns |
|
|
382 | (3) |
|
|
|
385 | (2) |

17 Ridge Regression


387 | (14) |
|
|
|
387 | (1) |
|
17.2 Basic Form of Ridge Regression |
|
|
387 | (2) |
|
17.3 Ridge Regression of the Hald Data |
|
|
389 | (2) |
|
17.4 In What Circumstances Is Ridge Regression Absolutely the Correct Way to Proceed? |
|
|
391 | (3) |
|
17.5 The Phoney Data Viewpoint |
|
|
394 | (1) |
|
|
|
395 | (1) |
|
Appendix 17A Ridge Estimates in Terms of Least Squares Estimates |
|
|
396 | (1) |
|
Appendix 17B Mean Square Error Argument |
|
|
396 | (1) |
|
Appendix 17C Canonical Form of Ridge Regression |
|
|
397 | (3) |
|
|
|
400 | (1) |
|
18 Generalized Linear Models (GLIM) |
|
|
401 | (8) |
|
|
|
401 | (1) |
|
18.2 The Exponential Family of Distributions |
|
|
402 | (2) |
|
18.3 Fitting Generalized Linear Models (GLIM) |
|
|
404 | (2) |
|
18.4 Performing the Calculations: An Example |
|
|
406 | (2) |
|
|
|
408 | (1) |
|
|
|
408 | (1) |
|
19 Mixture Ingredients as Predictor Variables |
|
|
409 | (18) |
|
19.1 Mixture Experiments: Experimental Spaces |
|
|
409 | (3) |
|
19.2 Models for Mixture Experiments |
|
|
412 | (4) |
|
19.3 Mixture Experiments in Restricted Regions |
|
|
416 | (2) |
|
|
|
418 | (1) |
|
|
|
419 | (3) |
|
Appendix 19A Transforming k Mixture Variables to k - 1 Working Variables |
|
|
422 | (3) |
|
|
|
425 | (2) |
|
20 The Geometry of Least Squares |
|
|
427 | (20) |
|
|
|
427 | (2) |
|
20.2 Pythagoras and Analysis of Variance |
|
|
429 | (3) |
|
20.3 Analysis of Variance and F-Test for Overall Regression |
|
|
432 | (1) |
|
20.4 The Singular X'X Case: An Example |
|
|
433 | (2) |
|
20.5 Orthogonalizing in the General Regression Case |
|
|
435 | (2) |
|
20.6 Range Space and Null Space of a Matrix M |
|
|
437 | (2) |
|
20.7 The Algebra and Geometry of Pure Error |
|
|
439 | (2) |
|
Appendix 20A Generalized Inverses M⁻ |
|
|
441 | (3) |
|
|
|
444 | (3) |
|
21 More Geometry of Least Squares |
|
|
447 | (14) |
|
21.1 The Geometry of a Null Hypothesis: A Simple Example |
|
|
447 | (1) |
|
21.2 General Case H₀: Aβ = c: The Projection Algebra |
|
|
448 | (1) |
|
21.3 General Illustrations |
|
|
449 | (1) |
|
21.4 The F-Test for H₀, Geometrically |
|
|
450 | (2) |
|
21.5 The Geometry of R² |
|
|
452 | (1) |
|
21.6 Change in R² for Models Nested Via Aβ = 0 Not Involving β₀ |
|
|
452 | (2) |
|
21.7 Multiple Regression with Two Predictor Variables as a Sequence of Straight Line Regressions |
|
|
454 | (5) |
|
|
|
459 | (2) |
|
22 Orthogonal Polynomials and Summary Data |
|
|
461 | (12) |
|
|
|
461 | (1) |
|
22.2 Orthogonal Polynomials |
|
|
461 | (6) |
|
22.3 Regression Analysis of Summary Data |
|
|
467 | (2) |
|
|
|
469 | (4) |
|
23 Multiple Regression Applied to Analysis of Variance Problems |
|
|
473 | (32) |
|
|
|
473 | (1) |
|
23.2 The One-Way Classification: Standard Analysis and an Example |
|
|
474 | (3) |
|
23.3 Regression Treatment of the One-Way Classification Example |
|
|
477 | (4) |
|
23.4 Regression Treatment of the One-Way Classification Using the Original Model |
|
|
481 | (5) |
|
23.5 Regression Treatment of the One-Way Classification: Independent Normal Equations |
|
|
486 | (2) |
|
23.6 The Two-Way Classification with Equal Numbers of Observations in the Cells: An Example |
|
|
488 | (1) |
|
23.7 Regression Treatment of the Two-Way Classification Example |
|
|
489 | (4) |
|
23.8 The Two-Way Classification with Equal Numbers of Observations in the Cells |
|
|
493 | (1) |
|
23.9 Regression Treatment of the Two-Way Classification with Equal Numbers of Observations in the Cells |
|
|
494 | (4) |
|
23.10 Example: The Two-Way Classification |
|
|
498 | (1) |
|
23.11 Recapitulation and Comments |
|
|
499 | (1) |
|
|
|
500 | (5) |
|
24 An Introduction to Nonlinear Estimation |
|
|
505 | (62) |
|
24.1 Least Squares for Nonlinear Models |
|
|
505 | (3) |
|
24.2 Estimating the Parameters of a Nonlinear System |
|
|
508 | (10) |
|
|
|
518 | (11) |
|
24.4 A Note on Reparameterization of the Model |
|
|
529 | (1) |
|
24.5 The Geometry of Linear Least Squares |
|
|
530 | (9) |
|
24.6 The Geometry of Nonlinear Least Squares |
|
|
539 | (4) |
|
24.7 Nonlinear Growth Models |
|
|
543 | (7) |
|
24.8 Nonlinear Models: Other Work |
|
|
550 | (3) |
|
|
|
553 | (1) |
|
|
|
553 | (14) |

25 Robust Regression


567 | (18) |
|
25.1 Least Absolute Deviations Regression (L₁ Regression) |
|
|
567 | (1) |
|
|
|
567 | (6) |
|
25.3 Steel Employment Example |
|
|
573 | (2) |
|
|
|
575 | (2) |
|
25.5 Least Median of Squares (LMS) Regression |
|
|
577 | (1) |
|
25.6 Robust Regression with Ranked Residuals (rreg) |
|
|
577 | (3) |
|
|
|
580 | (1) |
|
25.8 Comments and Opinions |
|
|
580 | (1) |
|
|
|
581 | (3) |
|
|
|
584 | (1) |
|
26 Resampling Procedures (Bootstrapping) |
|
|
585 | (8) |
|
26.1 Resampling Procedures for Regression Models |
|
|
585 | (1) |
|
26.2 Example: Straight Line Fit |
|
|
586 | (2) |
|
26.3 Example: Planar Fit, Three Predictors |
|
|
588 | (1) |
|
|
|
588 | (1) |
|
Appendix 26A Sample MINITAB Programs to Bootstrap Residuals for a Specific Example |
|
|
589 | (1) |
|
Appendix 26B Sample MINITAB Programs to Bootstrap Pairs for a Specific Example |
|
|
590 | (1) |
|
|
|
591 | (1) |
|
|
|
591 | (2) |
| Bibliography |
|
593 | (12) |
| True/False Questions |
|
605 | (4) |
| Answers to Exercises |
|
609 | (75) |
|
|
|
684 | (11) |
|
|
|
684 | (2) |
|
Percentage Points of the t-Distribution |
|
|
686 | (1) |
|
Percentage Points of the χ²-Distribution |
|
|
687 | (1) |
|
Percentage Points of the F-Distribution |
|
|
688 | (7) |
| Index of Authors Associated with Exercises |
|
695 | (2) |
| Index |
|
697 | |