Abstract
In Chap. 11, we used multiple regression to analyze data with a single categorical predictor. Yet multiple regression, like analysis of variance, can also be used with designs that combine two or more categorical predictors. In most cases, the categorical variables are crossed to form a factorial design. In this chapter, you will learn how to use multiple regression to analyze and interpret factorial designs.
Notes
1. To be precise, the preceding definition describes a complete factorial design. Partial (or fractional) factorial designs, in which only some combinations of factor levels are represented, will not be considered here.
2. With categorical predictors, the slope of the regression line equals the difference between two group means.
3. If we had used dummy coding to create our grouping vectors, the b1 coefficient would represent a simple effect rather than a main effect. For example, if we had assigned 0's to those who did not lift weights and 1's to those who did, the b1 coefficient would represent the simple effect of cycling when no weights were lifted rather than the main effect of cycling across weightlifting conditions. Unfortunately, many researchers are unaware of this fact, leading them to erroneously interpret their dummy-coded coefficients as main effects rather than simple effects. To avoid confusion, you should not use dummy coding with a factorial design.
4. With categorical predictors, it is customary to refer to the simple slopes as simple effects. The terms are equivalent because, as noted in footnote 2, a simple slope represents the difference between two group means.
5. The values reported in Table 12.14 are the ones you get by default from the ANOVA routines in most statistical packages. R is an exception, and the code you need to reproduce the table in R is provided at the end of this section.
6. You might notice that V3 starts with a positive value (.5), not a negative one as in our earlier examples. When a factor has only two levels, the choice of sign is largely arbitrary.
7. The highlighted portion of the covariance matrix will be used in a subsequent section to construct an augmented covariance matrix.
8. To control the Type I error rate, it is customary to adjust your alpha levels when you report multiple comparisons. A discussion of this issue can be found in most introductory statistics texts.
9. We will discuss the other two comparisons after we complete our discussion here.
10. The boxed numbers reference the relevant comparison in Table 12.25.
11. We would follow the same procedure if we had an A × B design of any size (e.g., 3 × 4). Use dummy coding for Factor B, assigning one group 0's on all vectors, and rerun your regression analysis. The regression coefficient for A is then the simple effect of A at the level of B that received 0's on all vectors.
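The contrast between dummy and effect coding in notes 3 and 11 can be verified numerically. The sketch below uses Python/NumPy rather than the book's R code (which is not reproduced here), and the 2 × 2 cell means are invented for illustration: they are not data from the text. With ±.5 effect codes, the coefficient on the cycling vector recovers the main effect (the difference between marginal means), whereas with 0/1 dummy codes it recovers the simple effect of cycling in the no-weights condition.

```python
import numpy as np

# Hypothetical cell means for a 2 x 2 (cycling x weightlifting) design.
# Keys are (cycling code, weightlifting code) using +/- .5 effect coding.
# These numbers are assumptions chosen for illustration only.
means = {(-0.5, -0.5): 10.0,   # no cycling, no weights
         ( 0.5, -0.5): 14.0,   # cycling,    no weights
         (-0.5,  0.5): 12.0,   # no cycling, weights
         ( 0.5,  0.5): 20.0}   # cycling,    weights

y = np.array(list(means.values()))

# Effect coding (+/- .5) with a product term for the interaction:
# the coefficient on the cycling vector is the MAIN effect of cycling.
X_eff = np.array([[1.0, a, b, a * b] for (a, b) in means])
b_eff = np.linalg.lstsq(X_eff, y, rcond=None)[0]

# Dummy coding (0/1), obtained here by shifting the effect codes:
# the coefficient on the cycling vector is the SIMPLE effect of
# cycling when no weights were lifted (the group coded 0).
X_dum = np.array([[1.0, a + 0.5, b + 0.5, (a + 0.5) * (b + 0.5)]
                  for (a, b) in means])
b_dum = np.linalg.lstsq(X_dum, y, rcond=None)[0]

print(b_eff[1])  # 6.0 = mean(cycling) - mean(no cycling), the main effect
print(b_dum[1])  # 4.0 = 14 - 10, the simple effect when no weights lifted
```

With these (made-up) cell means, the two simple effects of cycling are 4 and 8, so the dummy-coded coefficient reports only the first of them, while the effect-coded coefficient reports their average, 6, which is why the notes warn against interpreting dummy-coded coefficients as main effects.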
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this chapter
Brown, J.D. (2014). Factorial Designs. In: Linear Models in Matrix Form. Springer, Cham. https://doi.org/10.1007/978-3-319-11734-8_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-11733-1
Online ISBN: 978-3-319-11734-8
eBook Packages: Mathematics and Statistics (R0)