Public Choice, Volume 150, Issue 1–2, pp. 77–95

Fiscal effects of budget referendums: evidence from New York school districts

  • Phuong Nguyen-Hoang


This paper provides empirical evidence on how budget referendums affect school inputs by taking advantage of an exogenous enactment of budget referendums for small city school districts (SCSDs) in New York State in 1998. The paper shows that SCSDs reduce instructional spending and increase student-teacher ratios while preserving administrative spending in response to budget referendums. These empirical findings are obtained by difference-in-differences estimations on data processed with propensity score matching, and the results are robust to sensitivity analysis.


Keywords: Budget referendums · School spending · Propensity score matching · Difference-in-differences · Small city school districts
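The paper's empirical strategy combines propensity score matching with difference-in-differences estimation. As a rough illustration of that logic only (all data here are hypothetical, and nearest-neighbor matching on a single scalar stands in for a full propensity score model), the estimator can be sketched as:

```python
# Hypothetical districts: (matching score, spending before, spending after).
# In the paper the score would come from a propensity score model over
# district characteristics; here a single covariate is used for brevity.
treated = [(0.20, 100.0, 96.0), (0.50, 110.0, 105.0), (0.80, 120.0, 113.0)]
controls = [(0.25, 101.0, 103.0), (0.45, 111.0, 114.0),
            (0.90, 119.0, 121.0), (0.10, 95.0, 97.0)]

def nearest_control(score, pool):
    """Nearest-neighbor match on the scalar score, with replacement."""
    return min(pool, key=lambda c: abs(c[0] - score))

# Match each treated district to its closest control.
matched = [nearest_control(score, controls) for score, _, _ in treated]

# Difference-in-differences: the treated group's average spending change
# minus the matched control group's average change.
treated_change = sum(post - pre for _, pre, post in treated) / len(treated)
control_change = sum(post - pre for _, pre, post in matched) / len(matched)
did = treated_change - control_change
print(round(did, 2))  # -7.67 on this toy data
```

Netting out the matched controls' trend in this way is what lets the design attribute the remaining spending change to the 1998 referendum requirement rather than to trends common to all districts.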





Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Maxwell School of Citizenship and Public Affairs, Syracuse University, Syracuse, USA
  2. Graduate Program in Urban and Regional Planning, University of Iowa, Iowa City, USA
  3. Public Policy Center, University of Iowa, Iowa City, USA
