Bowerman Regression CHPT 1
Regression Analysis
Unified Concepts, Practical
Applications, and Computer
Implementation
Bruce L. Bowerman, Richard T. O'Connell, and
Emily S. Murphree
Abstract
Regression Analysis: Unified Concepts, Practical Applications, and Computer
Implementation is a concise and innovative book that gives a complete
presentation of applied regression analysis in approximately one-half the
space of competing books. With only the modest prerequisite of a basic
(non-calculus) statistics course, this text is appropriate for the widest
possible audience.
Keywords
logistic regression, model building, model diagnostics, multiple regression, regression model, simple linear regression, statistical inference, time
series regression
Contents
Preface ix
Chapter 1
Chapter 4
Appendix A: Statistical Tables 253
References 261
Index 263
Preface
Regression Analysis: Unified Concepts, Practical Applications, and Computer
Implementation is a concise and innovative book that gives a complete
presentation of applied regression analysis in approximately one-half the
space of competing books. With only the modest prerequisite of a basic
(non-calculus) statistics course, this text is appropriate for the widest possible audience, including college juniors, seniors, and first-year graduate
students in business, the social sciences, the sciences, and statistics, as
well as professionals in business and industry. The reason that this text
is appropriate for such a wide audience is that it takes a unique and
integrative approach to teaching regression analysis. Most books, after a
short chapter introducing regression, cover simple linear regression and
multiple regression in roughly four chapters by beginning with a chapter
reviewing basic statistical concepts and then having chapters on simple
linear regression, matrix algebra, and multiple regression. In contrast, this
book, after a short chapter introducing regression, covers simple linear
regression and multiple regression in a single cohesive chapter, Chapter2,
by efficiently integrating the discussion of the two techniques. In addition, the same Chapter 2 teaches both the necessary basic statistical concepts (for example, hypothesis testing) and the necessary matrix algebra
concepts as they are needed in teaching regression. We believe that this
approach avoids the needless repetition of traditional approaches and
does the best job of getting a wide variety of readers (who might be students with different backgrounds in the same class) to the same level of
understanding.
Chapter 3 continues the integrative approach of the book by discussing more advanced regression models, including models using squared
and interaction terms, models using dummy variables, and logistic regression models. The book concludes with Chapter 4, which organizes the
techniques of model building, model diagnosis, and model improvement
into a cohesive six-step procedure. Whereas many competing texts spread
such modeling techniques over a fairly large number of chapters that can
seem unrelated to the novice, the six-step procedure organizes both standard and more advanced modeling techniques into a unified presentation. In addition, each chapter features motivating examples (many real
world, all realistic) and concludes with a section showing how to use SAS
followed by a set of exercises. Excel, MINITAB, and SAS outputs are
used throughout the text, and the book's website contains more exercises
for each chapter. The book's website also houses Appendices B, C, and
D. Appendix B gives careful derivations of most of the applied results in
the text. These derivations are referenced in the main text as the applied
results are discussed. Appendix C includes an applied discussion extending the basic treatment of logistic regression given in the main text. This
extended discussion covers binomial logistic regression, generalized (multiple category) logistic regression, and Poisson regression. Appendix D
extends the basic treatment of modeling time series data given in the main
text. The Box-Jenkins methodology and its use in regression analysis are
discussed.
Author Bruce Bowerman would like to thank Professor David
Nickerson of the University of Central Florida for motivating the writing
of this book. All three authors would like to thank editor Scott Isenberg,
production manager Destiny Hadley, and permissions
editor Marcy
Schneidewind, as well as the fine people at Exeter, for their hard work.
Most of all we are indebted to our families for their love and encouragement over the years.
Bruce L. Bowerman
Richard T. O'Connell
Emily S. Murphree
CHAPTER 1
An Introduction to
Regression Analysis
1.1 Observational Data and Experimental Data
In many statistical studies a variable of interest, called the response variable
(or dependent variable), is identified. Data are then collected that tell us
about how one or more factors might influence the variable of interest.
If we cannot control the factor(s) being studied, we say that the data are
observational. For example, suppose that a natural gas company serving
a city collects data to study the relationship between the city's weekly
natural gas consumption (the response variable) and two factors: the
average hourly atmospheric temperature and the average hourly wind
velocity in the city during the week. Because the natural gas company
cannot control the atmospheric temperatures or wind velocities in the
city, the data collected are observational.
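The natural gas example can be sketched as a multiple regression of consumption on temperature and wind velocity. The data below are hypothetical, invented for illustration (they are not from the book), and the model is fit by ordinary least squares:

```python
# Hypothetical weekly data (not from the book): natural gas consumption,
# average hourly temperature, and average hourly wind velocity.
import numpy as np

temp = np.array([28.0, 32.5, 39.0, 45.9, 57.8, 58.1, 62.5, 71.3])
wind = np.array([18.0, 14.0, 15.0, 10.0, 12.0, 8.0, 11.0, 5.0])
gas = np.array([13.08, 12.22, 11.52, 10.292, 9.024, 8.668, 8.38, 6.844])

# Design matrix with an intercept column: gas ~ b0 + b1*temp + b2*wind
X = np.column_stack([np.ones_like(temp), temp, wind])
coef, *_ = np.linalg.lstsq(X, gas, rcond=None)
b0, b1, b2 = coef
print(f"intercept={b0:.3f}, temp slope={b1:.3f}, wind slope={b2:.3f}")
```

With observational data like these, temperature and wind velocity are not controlled by the analyst, so the fitted slopes describe association rather than a controlled causal effect.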
If we can control the factors being studied, we say that the data
are experimental. For example, suppose that an oil company wishes
to study how three different gasoline types (A, B, and C) affect the
mileage obtained by a popular midsized automobile model. Here the
response variable is gasoline mileage, and the company will study a
single factor: gasoline type. Since the oil company can control which
gasoline type is used in the midsized automobile, the data that the oil
company will collect are experimental.
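A single qualitative factor such as gasoline type can be handled in a regression framework with dummy (indicator) variables. The mileage figures below are hypothetical, made up for illustration; with this design, the intercept estimates type A's mean mileage and the two dummy coefficients estimate how types B and C differ from it:

```python
# Hypothetical mileage data (not from the book) for gasoline types A, B, C.
import numpy as np

gas_type = ["A", "A", "A", "B", "B", "B", "C", "C", "C"]
mpg = np.array([24.0, 25.0, 24.5, 26.1, 26.9, 26.5, 23.3, 23.9, 23.6])

# Dummy-variable design: the intercept is type A's mean mileage,
# and dB, dC shift that baseline for types B and C.
dB = np.array([1.0 if t == "B" else 0.0 for t in gas_type])
dC = np.array([1.0 if t == "C" else 0.0 for t in gas_type])
X = np.column_stack([np.ones(len(mpg)), dB, dC])
coef, *_ = np.linalg.lstsq(X, mpg, rcond=None)
print(coef)  # [mean_A, mean_B - mean_A, mean_C - mean_A]
```

Because gasoline type is assigned by the experimenter, differences in the dummy coefficients can be interpreted as effects of the factor itself.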
Index
Adjusted coefficient of determination, 56–57
Autocorrelated errors, 208–216
Autoregressive model, 211
Backward elimination, 172–174, 211
Biasing constant, 231
Bonferroni procedure, 134
Box-Jenkins methodology, 216
Error term, 11
Experimental region, 18, 24
Explained deviation, 53