Abstract
Generalized linear models provide a unified framework containing many extensions of the linear model. Important examples include logistic regression for binary responses, Poisson regression for count data, and log-linear models for contingency tables. Penalizing the negative log-likelihood with the ℓ1-norm, still called the Lasso, is in many examples conceptually similar to the case with squared error loss in linear regression, owing to the convexity of the negative log-likelihood. This implies that the statistical properties as well as the computational complexity of algorithms are attractive. A noticeable difference, however, occurs with log-linear models for large contingency tables, where the computation is in general much more demanding. We present in this chapter the models and estimators, while computational algorithms and theory are described in more detail in Chapters 4 and 6, respectively.
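The estimator described in the abstract minimizes the negative log-likelihood plus an ℓ1-penalty. As a hedged illustration (not the chapter's own code), the following minimal sketch fits ℓ1-penalized logistic regression by proximal gradient descent (ISTA) on toy data; the data, step size, and penalty level `lam` are illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_logistic(X, y, lam, step=0.1, n_iter=2000):
    """Minimize (1/n) * sum_i log(1 + exp(-y_i * x_i' beta)) + lam * ||beta||_1
    for responses y_i in {-1, +1}, via proximal gradient descent."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        margins = y * (X @ beta)
        # gradient of the average logistic loss at the current beta
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n
        # gradient step on the smooth part, then soft-thresholding
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Toy data: only the first two of five covariates are informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = np.sign(X[:, 0] - X[:, 1] + 0.3 * rng.standard_normal(200))
beta_hat = lasso_logistic(X, y, lam=0.1)
```

Because the negative log-likelihood is convex, this simple scheme converges to a global minimizer; the soft-thresholding step is what produces exact zeros in the estimate, i.e. variable selection.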
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this chapter
Bühlmann, P., van de Geer, S. (2011). Generalized linear models and the Lasso. In: Statistics for High-Dimensional Data. Springer Series in Statistics. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-20192-9_3
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-20191-2
Online ISBN: 978-3-642-20192-9
eBook Packages: Mathematics and Statistics (R0)