Linear Regression and Matrix Inversion

Chapter in Numerical Analysis for Statisticians, part of the book series Statistics and Computing (SCO).

Abstract

Linear regression is the most commonly applied procedure in statistics. This fact alone underscores the importance of solving linear least squares problems quickly and reliably. In addition, iteratively reweighted least squares lies at the heart of a host of other optimization algorithms in statistics. The current chapter features four different methods for solving linear least squares problems: sweeping, Cholesky decomposition, the modified Gram-Schmidt procedure, and orthogonalization by Householder reflections. Later we take up solution by the singular value decomposition.
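
As a concrete illustration of two of these approaches, the following sketch (not drawn from the chapter itself; NumPy and simulated data are assumed) solves the same least squares problem by the Cholesky route through the normal equations and by a QR factorization, which NumPy's LAPACK backend computes with Householder reflections.

    import numpy as np

    # Simulated regression problem: y = X beta + noise (illustrative data only).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 3))
    beta_true = np.array([1.0, -2.0, 0.5])
    y = X @ beta_true + 0.1 * rng.standard_normal(100)

    # Route 1: normal equations via Cholesky. Factor X'X = L L' and solve
    # the two triangular systems L z = X'y and L' beta = z. (np.linalg.solve
    # does not exploit the triangular structure; a dedicated triangular
    # solver would, but this keeps the sketch NumPy-only.)
    L = np.linalg.cholesky(X.T @ X)
    z = np.linalg.solve(L, X.T @ y)
    beta_chol = np.linalg.solve(L.T, z)

    # Route 2: thin QR factorization X = QR, so beta solves R beta = Q'y.
    # np.linalg.qr is built on Householder reflections.
    Q, R = np.linalg.qr(X)
    beta_qr = np.linalg.solve(R, Q.T @ y)

    print(np.allclose(beta_chol, beta_qr))  # the two solutions agree

Both routes recover the same coefficients on well-conditioned data, but the QR route avoids forming X'X, whose condition number is the square of that of X; this is the usual argument for preferring orthogonalization methods.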

Author information

Correspondence to Kenneth Lange.

Copyright information

© 2010 Springer New York

Cite this chapter

Lange, K. (2010). Linear Regression and Matrix Inversion. In: Numerical Analysis for Statisticians. Statistics and Computing. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-5945-4_7
