Tracing the Spark that Lights a Flame: A Review of Methodologies to Measure the Outcomes of International Scholarships

  • Mirka Martel


This chapter reviews current methodologies used to measure and evaluate the effects of international scholarship programs for higher education. Because these programs are diverse in design, there is no "one size fits all" approach to evaluation. The overview of methodologies applies several analytic lenses and identifies the challenges associated with each: understanding the theory of change; choosing the unit of analysis; setting the timeline for evaluation; and selecting approaches to quantitative and qualitative data collection. The chapter concludes by emphasizing the importance of transparency in communicating evaluation outcomes to key audiences, both to improve programs and to inform research in the field.


References

  1. Amos, L.B., Windham, A., de los Reyes, I.B., Jones, W. and Baran, V. (2009). Delivering on the promise: An impact evaluation of the Gates Millennium Scholars Program. Washington, DC: American Institutes for Research.
  2. Banerjee, A., Cole, S., Duflo, E. and Linden, L. (2005). Remedying education: Evidence from two randomized experiments in India, No. w11904. National Bureau of Economic Research. Available at: (Accessed 23 Jan 2017).
  3. Bamberger, M. (1999). Ethical Issues in Conducting Evaluation in International Settings. New Directions for Evaluation, 82, pp. 89–97.
  4. Bamberger, M., Rugh, J. and Mabry, L. (2011). RealWorld evaluation: Working under budget, time, data, and political constraints. Thousand Oaks, CA: SAGE Publications, Inc.
  5. Bhandari, R. and Belyavina, R. (2011). Evaluating and measuring the impact of citizen diplomacy: Current status and future directions. New York: Institute of International Education.
  6. Boeren, A., Bakhuisen, K., Christian-Mak, A.M., Musch, V. and Pettersen, K. (2008). Donor policies and implementation modalities with regard to international postgraduate programmes targeting scholars from developing countries. Brussels: Vlaamse Interuniversitaire Raad.
  7. CARE. (2009). Girls' leadership development in action: CARE's experience from the field. Available from: (Accessed 1 August 2016).
  8. Center for Theory of Change. (2016). Center for Theory of Change home page. Available from: (Accessed 1 September 2016).
  9. Chen, P., Weiss, F.L., Nicholson, H.J. and Girls Incorporated (2010). Girls Study Girls Inc.: Engaging Girls in Evaluation through Participatory Action Research. American Journal of Community Psychology, 46(1–2), pp. 228–237.
  10. Chesterfield, R. and Dant, W. (2013). Evaluation of LAC higher education scholarships program: Final report. Washington, DC: U.S. Agency for International Development.
  11. Chouinard, J.A. and Cousins, J.B. (2009). A Review and Synthesis of Current Research on Cross-Cultural Evaluation. American Journal of Evaluation, 30(4), pp. 457–494.
  12. CIDA. (2005). Evaluation of the Canadian Francophone Scholarship Program (CFSP), 1987–2005. Quebec: CIDA.
  13. Cook, T.D., Scriven, M., Coryn, C.L. and Evergreen, S.D. (2009). Contemporary thinking about causation in evaluation: A dialogue with Tom Cook and Michael Scriven. American Journal of Evaluation, 31(1), pp. 105–117.
  14. Cosentino, C., Rangarajan, A., Sloan, M., Fortson, J., Moorthy, A., Humpage-Liuzzi, S. and Thomas, C. (2015). Monitoring, Evaluation, and Learning Design for The MasterCard Foundation Scholars Program. Washington, DC: Mathematica Policy Research.
  15. Creed, C., Perraton, H. and Waage, J. (2012). Examining development evaluation in higher education interventions: A preliminary study. London: London International Development Centre.
  16. Gertler, P.J., Martinez, S., Premand, P., Rawlings, L.B. and Vermeersch, C.M. (2011). Impact evaluation in practice. Washington, DC: The World Bank.
  17. Hansel, B. and Chen, Z. (2008). AFS long term impact study, 20 to 25 years after the exchange experience, AFS alumni are compared with their peers, Report 1. New York: AFS International.
  18. Hesse-Biber, S.N. (2010). Mixed methods research: Merging theory with practice. New York: Guilford Press.
  19. Hofmann-Pinilla, A. and Kallick Russell, J. (2009). Evaluation of the Leadership Development for Mobilizing Reproductive Health Program, Final Report Executive Summary. New York: Research Center for Leadership Action.
  20. Institute of International Education. (2015). Schlumberger Faculty for the Future: Program impact study. New York: Institute of International Education.
  21. Institute of International Education. (2016). LOTUS Annual Progress Report FY2016. Cairo, Egypt: IIE Cairo.
  22. Jadad, A.R. and Enkin, M. (2007). Randomized controlled trials: Questions, answers, and musings (2nd ed.). Malden, MA: Blackwell Publishing.
  23. Kirkpatrick, D.L. (1979). Techniques for evaluating training programs. Training and Development Journal, 33(6), pp. 78–92.
  24. Kirkpatrick, D.L. (1994). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler.
  25. Kusek, J.Z. and Rist, R.C. (2004). Ten steps to a results-based monitoring and evaluation system: A handbook for development practitioners. Washington, DC: The World Bank.
  26. Marsh, R., Baxter, A., Cliff, R., Di Genova, L., Jamison, A. and Madden, M. (2016). Career choices, return paths and social contributions: The African alumni project, Abridged report. Toronto: The MasterCard Foundation.
  27. Martel, M. and Bhandari, R. (2016). Social Justice and Sustainable Change: The Impacts of Higher Education, Ford Foundation International Fellowships Program Alumni Tracking Study Report No. 1. New York: Institute of International Education.
  28. Martinez, A., Epstein, C. and Parsad, A. (2015). Evaluation of the National Science Foundation's Partnerships for International Research and Education (PIRE) Program, Volume 1: Final report. Cambridge, MA: Abt Associates.
  29. Mason, L., Powers, C. and Donnelly, S. (2015). The Boren Awards: A Report of Oral Language Proficiency Gains during Academic Study Abroad. New York: Institute of International Education.
  30. Mawer, M. (2014). A study of research methodology used in evaluations of international scholarship schemes for higher education. London: Commonwealth Scholarship Commission in the United Kingdom.
  31. Mayne, J. (2011). Addressing cause and effect in simple and complex settings through contribution analysis, in Forss, K., Mara, M. and Schwartz, R. (eds), Evaluating the complex: Attribution, contribution and beyond. New York: Transaction Publishers.
  32. Perna, L., Orosz, K., Gopaul, B., Jumakulov, Z., Ashirbekov, A. and Kishkentayaeva, M. (2014). Promoting human capital development: A typology of international scholarship programs in higher education. Educational Researcher, 43(2), pp. 66–73.
  33. PPMI. (2012). Interim evaluation of Erasmus Mundus II (2009–2013). Brussels: European Commission Directorate-General of Education and Culture.
  34. Ramboll Management Consulting. (2012). Final report: Evaluation of NPT and NICHE. Berlin: Ramboll Management Consulting.
  35. Research Solutions International. (2016). The Benjamin A. Gilman International Scholarship Program Evaluation Report. Washington, DC: U.S. Department of State, Bureau of Educational and Cultural Affairs.
  36. Rotem, A., Zinovieff, M. and Goubarev, A. (2010). A framework for evaluating the impact of the United Nations fellowship programmes. Human Resources for Health, 8(7). Available at: (Accessed 23 Jan 2017).
  37. Scott, J. and Carrington, P.J. (2011). The SAGE handbook of social network analysis. Thousand Oaks, CA: SAGE Publications.
  38. SRI International. (2005). Outcome assessment of the visiting Fulbright Student Program. Washington, DC: U.S. Department of State, Bureau of Educational and Cultural Affairs.
  39. The MasterCard Foundation. (2016). Education in Support of Social Transformation: Learning from the First Five Years of The MasterCard Foundation Scholars Program. Toronto: The MasterCard Foundation.
  40. Tvaruzkova, M. (2012). Evaluation: Advances in measuring the effectiveness of networks. Paper presented at the Global Leadership Consortium, December 11, 2012, Washington, DC.
  41. USAID. (2004). Generations of quiet progress: The development impact of US long-term university training on Africa from 1963–2003, Report prepared by Aguirre International. Washington, DC: USAID.
  42. Uyeki, E. (1993). As Others See Us: A Comparison of Japanese and American Fulbrighters. New York: Institute of International Education.
  43. Valuy, E. and Martel, M. (2016). HER initiative to lead change: The power of education. New York: Institute of International Education.
  44. Valuy, E. (2016). Centroamerica Adelante Final Report. New York: Institute of International Education.

Copyright information

© The Author(s) 2018

Authors and Affiliations

  • Mirka Martel
  1. Institute of International Education (IIE), New York, USA
