Multiple Regression

• Kunio Takezawa
Chapter

Abstract

When data $$\{(x_{i1},x_{i2},\ldots,x_{iq},y_{i})\}$$ (1 ≤ i ≤ n) are given, multiple regression derives the values of $$\{a_{j}\}$$ (0 ≤ j ≤ q) by minimizing
$$\displaystyle{ \mathit{RSS} =\sum_{i=1}^{n}\Bigl(y_{i} - a_{0} -\sum_{j=1}^{q}a_{j}x_{ij}\Bigr)^{2} =\sum_{i=1}^{n}e_{i}^{2}. }$$
Here $$e_{i} = y_{i} - a_{0} -\sum_{j=1}^{q}a_{j}x_{ij}$$ are called residuals. The acronym RSS stands for the residual sum of squares. Minimizing Eq. (4.1) yields the regression equation:
$$\displaystyle{ \hat{y} = \hat{a}_{0} +\sum_{j=1}^{q}\hat{a}_{j}x_{j}, }$$
where $$\{\hat{a}_{j}\}$$ are estimates of the regression coefficients, $$\{x_{j}\}$$ the predictor variables, and $$\hat{y}$$ the estimate of the target variable. Substituting the estimates of Eq. (4.2) into Eq. (4.1) transforms it into
$$\displaystyle{ \mathit{RSS} =\sum_{i=1}^{n}\Bigl(y_{i} -\hat{a}_{0} -\sum_{j=1}^{q}\hat{a}_{j}x_{ij}\Bigr)^{2}. }$$
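The minimization above has a closed-form solution: with a design matrix X whose first column is all ones (for the intercept a₀), the least-squares coefficients satisfy the normal equations (XᵀX)â = Xᵀy. A minimal sketch in pure Python follows; the book's own examples are in R, so the function names here are illustrative only:

```python
def fit_multiple_regression(x_rows, y):
    """Return [a0, a1, ..., aq] minimizing RSS via the normal equations."""
    n = len(y)
    X = [[1.0] + list(row) for row in x_rows]   # prepend intercept column
    p = len(X[0])
    # Build (X^T X) and (X^T y)
    XtX = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(p)]
           for r in range(p)]
    Xty = [sum(X[i][r] * y[i] for i in range(n)) for r in range(p)]
    # Gaussian elimination with partial pivoting
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(XtX[r][col]))
        XtX[col], XtX[piv] = XtX[piv], XtX[col]
        Xty[col], Xty[piv] = Xty[piv], Xty[col]
        for r in range(col + 1, p):
            f = XtX[r][col] / XtX[col][col]
            for c in range(col, p):
                XtX[r][c] -= f * XtX[col][c]
            Xty[r] -= f * Xty[col]
    # Back substitution
    a = [0.0] * p
    for r in range(p - 1, -1, -1):
        a[r] = (Xty[r] - sum(XtX[r][c] * a[c] for c in range(r + 1, p))) / XtX[r][r]
    return a

def rss(x_rows, y, a):
    """Residual sum of squares for coefficients a = [a0, a1, ..., aq]."""
    return sum((yi - a[0] - sum(aj * xj for aj, xj in zip(a[1:], row))) ** 2
               for row, yi in zip(x_rows, y))
```

For data generated exactly by ŷ = 1 + 2x₁ + 3x₂, the fitted coefficients recover (1, 2, 3) and RSS is zero up to rounding error.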

Keywords

Null Hypothesis · Regression Coefficient · Simulation Data · Probability Density Function · Alternative Hypothesis
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.
