Abstract
In Chap. 2, we learned how to decompose a rectangular matrix into an orthonormal basis Q and an upper triangular matrix R, and in Chap. 3 we applied the decomposition to a linear regression model. In this chapter you will learn a related decomposition that can create an orthonormal basis from a square, symmetric matrix. The decomposition is known as the eigen decomposition, and it has applications across a range of problems in math, science, and engineering.
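The central idea — that a square, symmetric matrix yields orthonormal eigenvectors and real eigenvalues — can be sketched with a short power iteration. This is an illustrative example only (the book's accompanying code is in R, and the matrix below is invented for demonstration); it recovers the dominant eigen pair of a small symmetric matrix.

```python
# Power iteration: find the dominant eigen pair of a symmetric matrix.
# Illustrative sketch in Python; the chapter's accompanying code is in R.
import math

A = [[2.0, 1.0],
     [1.0, 2.0]]  # symmetric toy matrix; its eigenvalues are 3 and 1

def mat_vec(M, v):
    # Multiply matrix M by vector v.
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def normalize(v):
    # Rescale v to unit length.
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

v = normalize([1.0, 0.0])        # arbitrary starting vector
for _ in range(50):              # repeated multiplication aligns v with the
    v = normalize(mat_vec(A, v)) # eigenvector of the largest eigenvalue

# Rayleigh quotient v'Av gives the eigenvalue; because v has unit length,
# the denominator v'v equals 1 and drops out.
Av = mat_vec(A, v)
lam = sum(v[i] * Av[i] for i in range(len(v)))
print(round(lam, 6))  # ≈ 3, the largest eigenvalue of A
```

For a full symmetric eigen decomposition A = PΛP′, each successive eigen pair can be found the same way after deflating the matrix by the pairs already recovered.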
Notes
- 1. The eigen decomposition is sometimes called the spectral decomposition.
- 2. All of the eigenvalues of a projection matrix are 1 or 0.
- 3. By convention, the eigenvalues are ordered by decreasing magnitude, with the largest designated first.
- 4. We can eliminate the denominator in Eq. (4.7) if our eigenvector has unit length.
- 5. The \( \mathcal{R} \) code that accompanies this section outputs an orthonormal matrix, P, such that P′P = I and A = PHP′.
- 6. The version we will learn, known as the single-shift Francis algorithm, is appropriate for matrices with real eigenvalues. A double-shift version is used for matrices with complex eigenvalues. Details can be found in Golub and Van Loan (2013).
- 7. Using the LU decomposition from Chap. 1, the \( \mathcal{R} \) code that accompanies this section can also be set to find the smallest eigen pair.
- 8. This example is commonly used to illustrate the predator-prey model.
- 9. The actual algorithm that Google uses is a bit more complicated than the one presented here. For example, it includes a damping parameter to model the likelihood that the surfer will simply stay on the current page or exit her browser. The true value is proprietary, but the suspected value is .85.
- 10. Because more than one eigenvector can be chosen to begin the decomposition, the Schur decomposition does not produce a unique solution.
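The damping idea in note 9 can be sketched as a small power iteration. This is a toy illustration, not Google's actual algorithm: the three-page link graph is invented, and .85 is only the suspected damping value mentioned in the note.

```python
# Toy PageRank with damping d = .85 (the suspected value; the true value
# is proprietary). The three-page link graph is invented for illustration.
d = 0.85
links = {0: [1, 2], 1: [2], 2: [0]}  # page -> pages it links to
n = len(links)

rank = [1.0 / n] * n
for _ in range(100):                 # power iteration on the "Google matrix"
    new = [(1 - d) / n] * n          # teleportation: surfer jumps to a random page
    for page, outs in links.items():
        share = rank[page] / len(outs)   # each page splits its rank among its links
        for q in outs:
            new[q] += d * share
    rank = new

print([round(r, 3) for r in rank])   # ranks sum to 1; page 2 ranks highest
```

The damping parameter keeps the iteration well behaved even when some pages have no outgoing links, which is why it appears in practical implementations.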
Reference
Golub, G. H., & Van Loan, C. F. (2013). Matrix computations (4th ed.). Baltimore: Johns Hopkins University Press.
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this chapter
Brown, J.D. (2018). Eigen Decomposition. In: Advanced Statistics for the Behavioral Sciences. Springer, Cham. https://doi.org/10.1007/978-3-319-93549-2_4
Print ISBN: 978-3-319-93547-8
Online ISBN: 978-3-319-93549-2
eBook Packages: Mathematics and Statistics (R0)