Applied Regression Analysis

by Norman R. Draper and Harry Smith
Edition: 3rd
Format: Hardcover
Pub. Date: 1998-04-23
Publisher(s): Wiley-Interscience
List Price: $242.25

Buy New

Usually Ships in 8 - 10 Business Days.
$230.71

Rent Textbook


Rent Digital

Rent Digital Options
Online: 1825 Days access
Downloadable: Lifetime Access
$211.20*
*To support the delivery of the digital material to you, a non-refundable digital delivery fee of $3.99 will be charged on each digital item.

Used Textbook

We're Sorry
Sold Out

How Marketplace Works:

  • This item is offered by an independent seller and not shipped from our warehouse.
  • Item details like edition and cover design may differ from our description; see seller's comments before ordering.
  • Sellers must confirm and ship within two business days; otherwise, the order will be cancelled and refunded.
  • Marketplace purchases cannot be returned to eCampus.com. Contact the seller directly for inquiries; if no response within two days, contact customer service.
  • Additional shipping costs apply to Marketplace purchases. Review shipping costs at checkout.

Summary

An outstanding introduction to the fundamentals of regression analysis, updated and expanded.

The methods of regression analysis are the most widely used statistical tools for discovering the relationships among variables. This classic text, with its emphasis on clear, thorough presentation of concepts and applications, offers a complete, easily accessible introduction to the fundamentals of regression analysis. Assuming only a basic knowledge of elementary statistics, Applied Regression Analysis, Third Edition focuses on the fitting and checking of both linear and nonlinear regression models, using small and large data sets, with pocket calculators or computers.

This Third Edition features separate chapters on multicollinearity, generalized linear models, mixture ingredients, geometry of regression, robust regression, and resampling procedures. Extensive support materials include sets of carefully designed exercises with full or partial solutions and a series of true/false questions with answers. All data sets used in both the text and the exercises can be found on the companion disk at the back of the book.

For analysts, researchers, and students in university, industrial, and government courses on regression, this text is an excellent introduction to the subject and an efficient means of learning how to use a valuable analytical tool. It will also prove an invaluable reference resource for applied scientists and statisticians.

Author Biography

NORMAN R. DRAPER teaches in the Department of Statistics at the University of Wisconsin. HARRY SMITH is a former faculty member of the Mt. Sinai School of Medicine.

Table of Contents

Preface
xiii(4)
About the Software
xvii
0 Basic Prerequisite Knowledge
1(14)
0.1 Distributions: Normal, t, and F
1(3)
0.2 Confidence Intervals (or Bands) and t-Tests
4(2)
0.3 Elements of Matrix Algebra
6(9)
1 Fitting a Straight Line by Least Squares
15(32)
1.0 Introduction: The Need for Statistical Analysis
15(3)
1.1 Straight Line Relationship Between Two Variables
18(2)
1.2 Linear Regression: Fitting a Straight Line by Least Squares
20(8)
1.3 The Analysis of Variance
28(6)
1.4 Confidence Intervals and Tests for beta(0) and beta(1)
34(4)
1.5 F-Test for Significance of Regression
38(2)
1.6 The Correlation Between X and Y
40(4)
1.7 Summary of the Straight Line Fit Computations
44(1)
1.8 Historical Remarks
45(1)
Appendix 1A Steam Plant Data
46(1)
Exercises are in "Exercises for Chapters 1-3"
96
2 Checking the Straight Line Fit
47(32)
2.1 Lack of Fit and Pure Error
47(9)
2.2 Testing Homogeneity of Pure Error
56(3)
2.3 Examining Residuals: The Basic Plots
59(2)
2.4 Non-normality Checks on Residuals
61(1)
2.5 Checks for Time Effects, Nonconstant Variance, Need for Transformation, and Curvature
62(5)
2.6 Other Residuals Plots
67(2)
2.7 Durbin-Watson Test
69(1)
2.8 Reference Books for Analysis of Residuals
70(1)
Appendix 2A Normal Plots
70(6)
Appendix 2B MINITAB Instructions
76(3)
Exercises are in "Exercises for Chapters 1-3"
96
3 Fitting Straight Lines: Special Topics
79(36)
3.0 Summary and Preliminaries
79(1)
3.1 Standard Error of Y
80(3)
3.2 Inverse Regression (Straight Line Case)
83(3)
3.3 Some Practical Design of Experiment Implications of Regression
86(3)
3.4 Straight Line Regression When Both Variables Are Subject to Error
89(7)
Exercises for Chapters 1-3
96(19)
4 Regression in Matrix Terms: Straight Line Case
115(20)
4.1 Fitting a Straight Line in Matrix Terms
115(10)
4.2 Singularity: What Happens in Regression to Make X'X Singular? An Example
125(2)
4.3 The Analysis of Variance in Matrix Terms
127(1)
4.4 The Variances and Covariance of b(0) and b(1) from the Matrix Calculation
128(2)
4.5 Variance of Y Using the Matrix Development
130(1)
4.6 Summary of Matrix Approach to Fitting a Straight Line (Nonsingular Case)
130(1)
4.7 The General Regression Situation
131(1)
Exercises for Chapter 4
132(3)
5 The General Regression Situation
135(14)
5.1 General Linear Regression
135(2)
5.2 Least Squares Properties
137(3)
5.3 Least Squares Properties When epsilon ~ N(0, I sigma(2))
140(2)
5.4 Confidence Intervals Versus Regions
142(1)
5.5 More on Confidence Intervals Versus Regions
143(4)
Appendix 5A Selected Useful Matrix Results
147(2)
Exercises are in "Exercises for Chapters 5 and 6"
169
6 Extra Sums of Squares and Tests for Several Parameters Being Zero
149(30)
6.1 The "Extra Sum of Squares" Principle
149(5)
6.2 Two Predictor Variables: Example
154(8)
6.3 Sum of Squares of a Set of Linear Functions of Y's
162(3)
Appendix 6A Orthogonal Columns in the X Matrix
165(2)
Appendix 6B Two Predictors: Sequential Sums of Squares
167(2)
Exercises for Chapters 5 and 6
169(10)
7 Serial Correlation in the Residuals and the Durbin-Watson Test
179(26)
7.1 Serial Correlation in Residuals
179(2)
7.2 The Durbin-Watson Test for a Certain Type of Serial Correlation
181(11)
7.3 Examining Runs in the Time Sequence Plot of Residuals: Runs Test
192(6)
Exercises for Chapter 7
198(7)
8 More on Checking Fitted Models
205(12)
8.1 The Hat Matrix H and the Various Types of Residuals
205(4)
8.2 Added Variable Plot and Partial Residuals
209(1)
8.3 Detection of Influential Observations: Cook's Statistics
210(4)
8.4 Other Statistics Measuring Influence
214(1)
8.5 Reference Books for Analysis of Residuals
214(1)
Exercises for Chapter 8
215(2)
9 Multiple Regression: Special Topics
217(18)
9.1 Testing a General Linear Hypothesis
217(4)
9.2 Generalized Least Squares and Weighted Least Squares
221(3)
9.3 An Example of Weighted Least Squares
224(2)
9.4 A Numerical Example of Weighted Least Squares
226(3)
9.5 Restricted Least Squares
229(1)
9.6 Inverse Regression (Multiple Predictor Case)
229(2)
9.7 Planar Regression When All the Variables Are Subject to Error
231(1)
Appendix 9A Lagrange's Undetermined Multipliers
231(2)
Exercises for Chapter 9
233(2)
10 Bias in Regression Estimates, and Expected Values of Mean Squares and Sums of Squares
235(8)
10.1 Bias in Regression Estimates
235(3)
10.2 The Effect of Bias on the Least Squares Analysis of Variance
238(1)
10.3 Finding the Expected Values of Mean Squares
239(1)
10.4 Expected Value of Extra Sum of Squares
240(1)
Exercises for Chapter 10
241(2)
11 On Worthwhile Regressions, Big F's, and R(2)
243(8)
11.1 Is My Regression a Useful One?
243(2)
11.2 A Conversation About R(2)
245(2)
Appendix 11A How Significant Should My Regression Be?
247(3)
Exercises for Chapter 11
250(1)
12 Models Containing Functions of the Predictors, Including Polynomial Models
251(26)
12.1 More Complicated Model Functions
251(3)
12.2 Worked Examples of Second-Order Surface Fitting for k = 3 and k = 2 Predictor Variables
254(12)
12.3 Retaining Terms in Polynomial Models
266(6)
Exercises for Chapter 12
272(5)
13 Transformation of the Response Variable
277(22)
13.1 Introduction and Preliminary Remarks
277(3)
13.2 Power Family of Transformations on the Response: Box-Cox Method
280(6)
13.3 A Second Method for Estimating Lambda
286(3)
13.4 Response Transformations: Other Interesting and Sometimes Useful Plots
289(1)
13.5 Other Types of Response Transformations
290(1)
13.6 Response Transformations Chosen to Stabilize Variance
291(3)
Exercises for Chapter 13
294(5)
14 "Dummy" Variables
299(28)
14.1 Dummy Variables to Separate Blocks of Data with Different Intercepts, Same Model
299(8)
14.2 Interaction Terms Involving Dummy Variables
307(4)
14.3 Dummy Variables for Segmented Models
311(6)
Exercises for Chapter 14
317(10)
15 Selecting the "Best" Regression Equation
327(42)
15.0 Introduction
327(2)
15.1 All Possible Regressions and "Best Subset" Regression
329(6)
15.2 Stepwise Regression
335(4)
15.3 Backward Elimination
339(3)
15.4 Significance Levels for Selection Procedures
342(1)
15.5 Variations and Summary
343(2)
15.6 Selection Procedures Applied to the Steam Data
345(3)
Appendix 15A Hald Data, Correlation Matrix, and All 15 Possible Regressions
348(7)
Exercises for Chapter 15
355(14)
16 Ill-Conditioning in Regression Data
369(18)
16.1 Introduction
369(2)
16.2 Centering Regression Data
371(2)
16.3 Centering and Scaling Regression Data
373(2)
16.4 Measuring Multicollinearity
375(1)
16.5 Belsley's Suggestion for Detecting Multicollinearity
376(6)
Appendix 16A Transforming X Matrices to Obtain Orthogonal Columns
382(3)
Exercises for Chapter 16
385(2)
17 Ridge Regression
387(14)
17.1 Introduction
387(1)
17.2 Basic Form of Ridge Regression
387(2)
17.3 Ridge Regression of the Hald Data
389(2)
17.4 In What Circumstances Is Ridge Regression Absolutely the Correct Way to Proceed?
391(3)
17.5 The Phoney Data Viewpoint
394(1)
17.6 Concluding Remarks
395(1)
Appendix 17A Ridge Estimates in Terms of Least Squares Estimates
396(1)
Appendix 17B Mean Square Error Argument
396(1)
Appendix 17C Canonical Form of Ridge Regression
397(3)
Exercises for Chapter 17
400(1)
18 Generalized Linear Models (GLIM)
401(8)
18.1 Introduction
401(1)
18.2 The Exponential Family of Distributions
402(2)
18.3 Fitting Generalized Linear Models (GLIM)
404(2)
18.4 Performing the Calculations: An Example
406(2)
18.5 Further Reading
408(1)
Exercises for Chapter 18
408(1)
19 Mixture Ingredients as Predictor Variables
409(18)
19.1 Mixture Experiments: Experimental Spaces
409(3)
19.2 Models for Mixture Experiments
412(4)
19.3 Mixture Experiments in Restricted Regions
416(2)
19.4 Example 1
418(1)
19.5 Example 2
419(3)
Appendix 19A Transforming k Mixture Variables to k - 1 Working Variables
422(3)
Exercises for Chapter 19
425(2)
20 The Geometry of Least Squares
427(20)
20.1 The Basic Geometry
427(2)
20.2 Pythagoras and Analysis of Variance
429(3)
20.3 Analysis of Variance and F-Test for Overall Regression
432(1)
20.4 The Singular X'X Case: An Example
433(2)
20.5 Orthogonalizing in the General Regression Case
435(2)
20.6 Range Space and Null Space of a Matrix M
437(2)
20.7 The Algebra and Geometry of Pure Error
439(2)
Appendix 20A Generalized Inverses M-
441(3)
Exercises for Chapter 20
444(3)
21 More Geometry of Least Squares
447(14)
21.1 The Geometry of a Null Hypothesis: A Simple Example
447(1)
21.2 General Case H(0): A beta = c: The Projection Algebra
448(1)
21.3 General Illustrations
449(1)
21.4 The F-Test for H(0), Geometrically
450(2)
21.5 The Geometry of R(2)
452(1)
21.6 Change in R(2) for Models Nested Via A beta = 0 Not Involving beta(0)
452(2)
21.7 Multiple Regression with Two Predictor Variables as a Sequence of Straight Line Regressions
454(5)
Exercises for Chapter 21
459(2)
22 Orthogonal Polynomials and Summary Data
461(12)
22.1 Introduction
461(1)
22.2 Orthogonal Polynomials
461(6)
22.3 Regression Analysis of Summary Data
467(2)
Exercises for Chapter 22
469(4)
23 Multiple Regression Applied to Analysis of Variance Problems
473(32)
23.1 Introduction
473(1)
23.2 The One-Way Classification: Standard Analysis and an Example
474(3)
23.3 Regression Treatment of the One-Way Classification Example
477(4)
23.4 Regression Treatment of the One-Way Classification Using the Original Model
481(5)
23.5 Regression Treatment of the One-Way Classification: Independent Normal Equations
486(2)
23.6 The Two-Way Classification with Equal Numbers of Observations in the Cells: An Example
488(1)
23.7 Regression Treatment of the Two-Way Classification Example
489(4)
23.8 The Two-Way Classification with Equal Numbers of Observations in the Cells
493(1)
23.9 Regression Treatment of the Two-Way Classification with Equal Numbers of Observations in the Cells
494(4)
23.10 Example: The Two-Way Classification
498(1)
23.11 Recapitulation and Comments
499(1)
Exercises for Chapter 23
500(5)
24 An Introduction to Nonlinear Estimation
505(62)
24.1 Least Squares for Nonlinear Models
505(3)
24.2 Estimating the Parameters of a Nonlinear System
508(10)
24.3 An Example
518(11)
24.4 A Note on Reparameterization of the Model
529(1)
24.5 The Geometry of Linear Least Squares
530(9)
24.6 The Geometry of Nonlinear Least Squares
539(4)
24.7 Nonlinear Growth Models
543(7)
24.8 Nonlinear Models: Other Work
550(3)
24.9 References
553(1)
Exercises for Chapter 24
553(14)
25 Robust Regression
567(18)
25.1 Least Absolute Deviations Regression (L(1) Regression)
567(1)
25.2 M-Estimators
567(6)
25.3 Steel Employment Example
573(2)
25.4 Trees Example
575(2)
25.5 Least Median of Squares (LMS) Regression
577(1)
25.6 Robust Regression with Ranked Residuals (rreg)
577(3)
25.7 Other Methods
580(1)
25.8 Comments and Opinions
580(1)
25.9 References
581(3)
Exercises for Chapter 25
584(1)
26 Resampling Procedures (Bootstrapping)
585(8)
26.1 Resampling Procedures for Regression Models
585(1)
26.2 Example: Straight Line Fit
586(2)
26.3 Example: Planar Fit, Three Predictors
588(1)
26.4 Reference Books
588(1)
Appendix 26A Sample MINITAB Programs to Bootstrap Residuals for a Specific Example
589(1)
Appendix 26B Sample MINITAB Programs to Bootstrap Pairs for a Specific Example
590(1)
Additional Comments
591(1)
Exercises for Chapter 26
591(2)
Bibliography 593(12)
True/False Questions 605(4)
Answers to Exercises 609(75)
Tables
684(11)
Normal Distribution
684(2)
Percentage Points of the t-Distribution
686(1)
Percentage Points of the X(2)-Distribution
687(1)
Percentage Points of the F-Distribution
688(7)
Index of Authors Associated with Exercises 695(2)
Index 697

An electronic version of this book is available through VitalSource.

This book is viewable on PC, Mac, iPhone, iPad, iPod Touch, and most smartphones.

By purchasing, you will be able to view this book online, as well as download it, for the chosen number of days.

Digital License

You are licensing a digital product for a set duration. Durations are set forth in the product description, with "Lifetime" typically meaning five (5) years of online access and permanent download to a supported device. All licenses are non-transferable.


A downloadable version of this book is available through the eCampus Reader or compatible Adobe readers.

Applications are available on iOS, Android, PC, Mac, and Windows Mobile platforms.

Please view the compatibility matrix prior to purchase.