In a linear model \( y_i = x_i'\beta + u_i \) with \( E(ux) = 0 \), where \( \beta \) is a \( k \times 1 \) parameter vector of interest, \( u \) is the error term, \( x \) is a \( k \times 1 \) regressor vector, and \( (x_i', y_i) \) are iid, the least squares estimator (LSE) for \( \beta \) is obtained by minimizing
$$ (1/N) \sum\limits_i \left( y_i - x_i' b \right)^2 $$
with respect to (wrt) b. The LSE can also be viewed as the solution to the first-order (moment) condition of the minimization:
$$ (1/N) \sum\limits_i x_i \left( y_i - x_i' b \right) = 0. $$
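As a minimal numerical sketch (not from the text), the moment condition above rearranges to the normal equations \( (X'X)b = X'y \), so the LSE can be computed by solving a linear system; the data-generating process below is purely illustrative.

```python
import numpy as np

# Illustrative setup: iid regressors x_i and errors u_i with E(xu) = 0.
rng = np.random.default_rng(0)
N, k = 500, 3
X = rng.normal(size=(N, k))          # rows are x_i' (x_i is k x 1)
beta = np.array([1.0, -2.0, 0.5])    # true parameter vector (assumed)
u = rng.normal(size=N)               # error term, independent of X
y = X @ beta + u

# LSE via the first-order (moment) condition: solve (X'X) b = X'y.
b_hat = np.linalg.solve(X.T @ X, X.T @ y)

# The sample moment (1/N) sum_i x_i (y_i - x_i' b) vanishes at b = b_hat.
moment = (X.T @ (y - X @ b_hat)) / N
print(b_hat)
print(np.abs(moment).max())  # numerically ~0
```

The same `b_hat` is returned by any standard least-squares routine; solving the moment condition directly just makes the estimator's method-of-moments interpretation explicit.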





Copyright information

© Springer Science+Business Media New York 1996

Authors and Affiliations

  • Myoung-jae Lee, Department of Econometrics, Tilburg University, Tilburg, The Netherlands
