Abstract
Program evaluation draws on a set of social science procedures to systematically collect, analyze, interpret, and communicate descriptive and explanatory information about social programs. Until recently, evaluation texts tended to detail a tool chest of methods available to the evaluator without concomitant attention to how and when various methods should be used in practice. Initially, evaluators drew on existing methods and theories from the academic disciplines, typically in the social sciences, in which they were trained. However, because evaluation is a transdiscipline, evaluators not only adapted concepts and methods from their disciplines of origin but also invented or combined methods in new ways to achieve evaluation purposes (Scriven 1991; Shadish/Cook/Leviton 1991). Evaluation work is practiced within substantive disciplinary domains such as health, education, criminal justice, employment, and international aid, among others. Strong linkages between social science theory and theories about the program to be evaluated are necessary (Riggin 1990). Domain-specific evaluator practices are responsive to their contextually unique circumstances.
References
Bentz, V. M./ Shapiro, J. J. (1998): Mindful Inquiry in Social Research. Thousand Oaks, CA: Sage.
Berg, B. L. (2000): Qualitative Research Methods for the Social Sciences. (4th Ed.). Needham Heights, MA: Allyn & Bacon.
Bickman, L. (Ed.) (2000): Validity & Social Experimentation: Don Campbell’s Legacy. (Vol. 1). Thousand Oaks, CA: Sage.
Bickman, L. (Ed.) (2000): Research Design: Don Campbell’s Legacy. (Vol. 2). Thousand Oaks, CA: Sage.
Bickman, L./ Rog, D. J. (Eds.) (1998): Handbook of Applied Social Research Methods. Thousand Oaks, CA: Sage.
Blalock, H. M. Jr. (1982): Conceptualization and Measurement in the Social Sciences. Newbury Park, CA: Sage.
Brewer, J./ Hunter, A. (1989): Multimethod Research: A Synthesis of Styles. Thousand Oaks, CA: Sage.
Campbell, D. T./ Fiske, D. W. (1959): Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56, 1959, pp. 81–105.
Caracelli, V. (1999): Strengthening Quality at GAO: The Interface of Contemporary Auditing and Evaluation Professions. Paper presented at the annual meeting of the American Evaluation Association, Orlando, Florida. November 4, 1999.
Caracelli, V. J./ Greene, J. C. (1993): Data Analysis Strategies for Mixed-Method Evaluation Designs. Educational Evaluation and Policy Analysis, 15 (no. 2), Washington, D.C.: American Educational Research Association, 1993, pp. 195–207.
Caracelli, V. J./ Greene, J. C. (1997): Crafting Mixed-Method Evaluation Designs. In: Greene, J. C./ Caracelli, V. J. (Eds.): Advances in Mixed-Method Evaluation: The Challenges and Benefits of Integrating Diverse Paradigms. New Directions for Evaluation, No. 74, San Francisco: Jossey-Bass. pp. 19–32.
Chelimsky, E. (1996): From Incrementalism to Ideology and Back: Can Producers of Policy Information Adjust to the Full Spectrum of Political Climates. Distinguished Public Policy Lecture Series, 1996. Center for Urban Affairs and Research, Evanston, Illinois (February 29, 1996).
Chelimsky, E. (1997): The Coming Transformations in Evaluation. In: Chelimsky, E./ Shadish, W. (Eds.): Evaluation for the 21st Century. Thousand Oaks, CA: Sage. pp. 1–26.
Chelimsky, E. (1997): The Political Environment of Evaluation and What it Means for the Development of the Field. In Chelimsky, E./ Shadish, W. R.: Evaluation for the 21st Century: A Handbook. Thousand Oaks, CA: Sage. pp. 53–68.
Chen, H.-T. (1990): Theory-driven Evaluations. Thousand Oaks, CA: Sage.
Chen, H.-T./ Rossi, P. H. (1987): The theory-driven approach to validity. Evaluation and Program Planning, 10, New York, NY: Pergamon Press, 1987, pp. 95–103.
Cook, T. D./ Campbell, D. T. (1979): Quasi-experimentation: Design and Analysis Issues for Field Settings. Chicago: Rand McNally College Publishing Co.
Cook, T. D. (1985): Postpositivist critical multiplism. In: Shotland, R. L./ Mark, M. M. (Eds.): Social Science and Social Policy. Beverly Hills, CA: Sage. pp. 21–62.
Cook, T. D./ Cooper, H./ Cordray, D. S./ Hartmann, H./ Hedges, L. V./ Light, R. J./ Louis, T. A./ Mosteller, F. (1992): Meta-analysis for Explanation: A casebook. New York, NY: Russell Sage Foundation.
Cooksy, L. J. (1999): The Meta-Evaluand: The Evaluation of Project TEAMS. American Journal of Evaluation. Vol. 20, No. 1, 1999, pp. 123–136.
Cooper, H./ Hedges, L. V. (Eds.) (1994): The Handbook of Research Synthesis. New York: Russell Sage Foundation.
Cousins, J. B./ Whitmore E. (1998): Framing Participatory Evaluation. In: Whitmore, E. (Ed.): Understanding and Practicing Participatory Evaluation. New Directions for Evaluation. No. 80, 1998, pp. 5–23.
Creswell, J. W. (1994): Research Designs: Qualitative and Quantitative Approaches. Thousand Oaks, CA: Sage.
Crotty, M. (1998): The Foundations of Social Research. Thousand Oaks, CA: Sage.
Cronbach, L. J./Associates. (1980): Toward Reform of Program Evaluation. San Francisco: Jossey-Bass.
Datta, L-e. (1997): A Pragmatic Basis for Mixed-Method Designs. In: Greene, J. C./ Caracelli, V. J. (Eds.): Advances in Mixed-Method Evaluation: The Challenges and Benefits of Integrating Diverse Paradigms. New Directions for Evaluation, No. 74. San Francisco: Jossey-Bass. pp. 33–45.
Datta, L-e. (2000): Seriously Seeking Fairness: Strategies for Crafting Non-partisan Evaluations in a Partisan World. American Journal of Evaluation, Vol. 21, No. 1, 2000, pp. 1–14.
Denzin, N. K. (1978): The Research Act: An Introduction to Sociological Methods (chap. 10). New York: McGraw-Hill.
Denzin, N. K. (1997): Interpretive Ethnography: Ethnographic practices for the 21st Century. Thousand Oaks, CA: Sage.
Denzin, N. K./ Lincoln, Y. S. (Eds.) (2000): Handbook of Qualitative Research (2nd. Ed.). Thousand Oaks, CA: Sage.
Dillman, D. (1999): Mail and Internet Surveys: The Tailored Design Method (2nd Ed.). John Wiley & Sons.
Dunn, W. N. (1994): Public Policy Analysis: An Introduction. (2nd Ed.). Englewood Cliffs, NJ: Prentice Hall.
Erlandson, D. A./ Harris, E. L./ Skipper, B. L./ Allen, S. D. (1993): Doing Naturalistic Inquiry: A Guide to Methods. Newbury Park: Sage.
Fetterman, D. M./ Kaftarian, S. J./ Wandersman, A. (Eds.) (1996): Empowerment Evaluation: Knowledge and Tools for Self-Assessment & Accountability. Thousand Oaks, CA: Sage.
Fowler, F. J. (1993): Survey Research Methods (2nd Ed). Newbury Park, CA: Sage.
Guba, E. G. (Ed.) (1990): The Paradigm Dialog. Newbury Park, CA: Sage.
Greene, J. C. (1993): The Role of Theory in Qualitative Program Evaluation. In: Flinders, D. J./ Mills, G. E. (Eds.): Theory and Concepts in Qualitative Research: Perspectives from the Field. New York, NY: Teachers College.
Greene, J. C. (1994): Qualitative Program Evaluation: Practice and Promise. In: Denzin, N. K./ Lincoln, Y. S. (Eds.): Handbook of Qualitative Research. Thousand Oaks, CA: Sage. pp. 530–544.
Greene, J. C. (1999): The inequality of performance measurements. Evaluation, 5(2), 1999, pp. 160–172.
Greene, J. C. (1999): Understanding Social Programs Through Evaluation. In: Denzin, N. K./ Lincoln, Y. S. (Eds.): Handbook of Qualitative Research (2nd. Ed.). Thousand Oaks, CA: Sage. pp. 981–999.
Greene, J. C./ Benjamin, L./ Goodyear, L./ Lowe, S. (1999): The Merits of Mixing Methods in Applied Social Research (Working Draft), APPAM Conference, Washington, D.C., Nov.6, 1999.
Greene, J. C./ Caracelli, V. J. (1997): Advances in Mixed-Method Evaluation: The Challenges and Benefits of Integrating Diverse Paradigms. New Directions for Evaluation, No. 74. San Francisco: Jossey-Bass.
Greene, J./ Caracelli, V. J. (1997): Defining and Describing the Paradigm Issue in Mixed-Method Evaluation. In: Greene, J. C./ Caracelli, V. J.: Advances in Mixed-Method Evaluation: The Challenges and Benefits of Integrating Diverse Paradigms. New Directions for Evaluation, No. 74. San Francisco, CA: Jossey-Bass.
Greene, J. C./ Caracelli, V. J./ Graham, W. F. (1989): Toward a Conceptual Framework for Mixed-method Evaluation Designs. Educational Evaluation and Policy Analysis, 11 (no. 3), 1989, pp. 255–274.
Groves, R. M. (1989): Survey Errors and Survey Costs. New York: John Wiley & Sons.
Hatry, H. (1999): Performance Measurement: Getting Results. Washington, D.C.: Urban Institute Press.
Hedrick, T. E./ Bickman, L./ Rog, D. J. (1993): Applied Research Design: A Practical Guide. Applied Social Research Methods Series, Vol. 32. Thousand Oaks, CA: Sage.
Henry, G. T./ Julnes, G./ Mark, M. M. (Eds.) (1998): Realist Evaluation: An Emerging Theory in Support of Practice. New Directions for Evaluation, No. 78. San Francisco: Jossey-Bass.
Krueger, R. A. (1994): Focus Groups: A Practical Guide for Applied Research (2nd ed.). Thousand Oaks, CA: Sage.
Lavrakas, P. J. (1993): Telephone Survey Methods: Sampling, Selection, and Supervision. (2nd ed.). Applied Social Research Methods Series, V. 7. Newbury Park, CA: Sage.
Lincoln, Y. S./ Guba, E. G. (1985): Naturalistic Inquiry. Beverly Hills, CA: Sage.
Lipsey, M. W./ Wilson, D. B. (in press, 2000): Practical Meta-Analysis. Sage.
Lipsey, M. W. (1993): Theory as Method: Small Theories of Treatments. In: Sechrest, L. B./ Scott, G. G. (Eds.): Understanding Causes and Generalizing About Them. New Directions for Program Evaluation, No. 57, pp. 5–38.
Lipsey, M. W. (1997): What Can You Build With Thousands of Bricks? Musings on the cumulation of knowledge in Program Evaluation. In: Rog, D. J./ Fournier D. (Eds.) Progress and Future Directions in Evaluation: Perspectives on Theory, Practice, and Methods, New Directions for Evaluation, No. 76, pp. 7–23.
Mark, M. M. (1990): From Program Theory to Tests of Program Theory. In: Bickman, L. (Ed.): Advances in Program Theory. New Directions for Program Evaluation, No. 47. San Francisco, CA: Jossey-Bass. pp. 37–51.
Mark, M. M./ Henry, G. T./ Julnes, G. (in press, 2000): Evaluation: An Integrated Framework for Understanding, Guiding, and Improving Policies and Programs. San Francisco, CA: Jossey-Bass.
Mark, M. M./ Shotland, R. L. (Eds.) (1987): Multiple Methods in Program Evaluation. New Directions for Program Evaluation (No. 35). San Francisco, CA: Jossey-Bass.
Marquart, J. M. (1990): A Pattern-Matching Approach to Link Program Theory and Evaluation Data. In: Bickman, L. (Ed.): Advances in Program Theory. New Directions for Program Evaluation, No. 47. San Francisco, CA: Jossey-Bass. pp. 93–107.
Mathison, S. (1988): Why Triangulate? Educational Researcher, 17(2), 13–17, 1988.
Maxwell, J. A. (1996): Qualitative Research Design: An Interactive Approach. Applied Social Research Methods Series, Vol. 41. Newbury Park, CA: Sage.
McKay, R. B. (1996): Cognitive Research in Reducing Nonsampling Errors in the Current Population Survey supplement on Race and Ethnicity. Proceedings of Statistics Canada Symposium 96: Nonsampling Errors. Ottawa, Ontario. pp. 107–117.
Mertens, D. M. (1997): Research Methods in Education and Psychology: Integrating Diversity with Quantitative and Qualitative Approaches. Thousand Oaks, CA: Sage.
Miles, M. B./ Huberman, A. M. (1994): Qualitative Data Analysis: An Expanded Sourcebook (2nd Ed.). Thousand Oaks, CA: Sage.
Mixed-Method Collaboration (1994): Mixed-Method Evaluation: Developing Quality Criteria through Concept Mapping. Evaluation Practice, 15 (no. 2), 1994, pp. 139–152.
Newcomer, K. E. (Ed.) (1997): Using Performance Measurement to Improve Public and Nonprofit Programs. New Directions for Evaluation, no. 75. San Francisco, CA: Jossey-Bass.
Noblit, G. W./ Hare, R. D. (1998): Meta-Ethnography: Synthesizing Qualitative Studies, Qualitative Research Methods, No. 11. Thousand Oaks, CA: Sage.
Owen, J. M./ Rogers, P. J. (1999): Program Evaluation: Forms and Approaches. Thousand Oaks, CA: Sage.
Patton, M. Q. (1987): Evaluation’s Political Inherency: Practical implications for design and use. In: Palumbo, D. J. (Ed.): The Politics of Program Evaluation. Newbury Park, CA: Sage. pp. 100–145.
Patton, M. Q. (1990): Qualitative Evaluation and Research Methods (2nd Ed.). Thousand Oaks, CA: Sage.
Patton, M. Q. (1997): Utilization-Focused Evaluation: The New Century Text (3rd Edition). Thousand Oaks: CA: Sage.
Pawson, R./ Tilley, N. (1997): Realistic Evaluation. Thousand Oaks, CA: Sage.
Perrin, B. (1998): Effective Use and Misuse of Performance Measurement. American Journal of Evaluation, 19, No. 3, Greenwich, CT: JAI Press, Inc., pp. 367–379.
Posavac, E. J./ Carey, R. G. (1997): Program Evaluation: Methods and Case Studies, Upper Saddle River, N.J.: Prentice-Hall.
Preskill, H./ Torres, R. T. (1999): Evaluative Inquiry for Learning in Organizations. Thousand Oaks, CA: Sage.
Ragin, C. C. (1989): The Comparative Method: Moving Beyond Qualitative and Quantitative Strategies. Berkeley, CA: University of California Press.
Reichardt, C. S./ Rallis, S. F. (Eds.) (1994): The Qualitative-Quantitative Debate: New Perspectives. New Directions for Program Evaluation, 61. San Francisco, CA: Jossey-Bass.
Riggin, L. J. C. (1990): Linking Program Theory and Social Science Theory. In: Bickman, L. (Ed.): Advances in Program Theory. New Directions for Program Evaluation No. 47. San Francisco, CA: Jossey-Bass.
Rossi, P. H./ Freeman, H. E./ Lipsey, M. W. (1999): Evaluation: A Systematic Approach (6th Ed.). Thousand Oaks, CA: Sage.
Schwarz, N./ Sudman, S. (Eds) (1995): Answering Questions: Methodology for Determining Cognitive and Communicative Processes in Survey Research. San Francisco, CA: Jossey-Bass.
Scriven, M. S. (1991): Evaluation Thesaurus (4th Ed.). Newbury Park, CA: Sage.
Sechrest, L./ Davis, M. F./ Stickle, T. R./ McKnight, P. E. (2000): Understanding "method" variance. In: Bickman, L. (Ed.): Research Design: Don Campbell's Legacy, Vol. 2. Thousand Oaks, CA: Sage. pp. 63–87.
Seidman, I. E. (1991): Interviewing as Qualitative Research. New York: Teachers College Press.
Shadish, W. R. Jr./ Cook, T. D./ Leviton, L. C. (1991): Foundations of Program Evaluation: Theories of Practice. Newbury Park, CA: Sage.
Stake, R. E. (1995): The Art of Case Study Research. Thousand Oaks, CA: Sage.
Strauss, A./ Corbin, J. (1990): Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Newbury Park, CA: Sage.
Sudman, S./ Bradburn, N./ Schwarz, N. (1995): Thinking about Answers: The Application of Cognitive Processes to Survey Methodology. San Francisco, CA: Jossey-Bass.
Tashakkori, A./ Teddlie, C. (1998): Mixed Methodology: Combining Qualitative and Quantitative Approaches. Applied Social Research Methods Series, Vol. 46. Thousand Oaks, CA: Sage.
The Joint Committee on Standards for Educational Evaluation (James R. Sanders, Chair) (1994). The Program Evaluation Standards: How to Assess Evaluations of Educational Programs (2nd edition). Thousand Oaks, CA: Sage.
Trochim, W. M. K. (1985): Pattern matching, validity, and conceptualization in program evaluation. Evaluation Review, 9 (5). Beverly Hills, CA: Sage. pp. 575–604.
Trochim, W. M. K. (1989): Outcome pattern matching and program theory. Evaluation and Program Planning, 12. New York, NY: Pergamon Press. pp. 355–366.
Trochim, W. M. K. (1999): The Research Methods Knowledge Base (2nd Ed.). Ithaca, NY: Cornell University.
U.S. General Accounting Office (1995). Program Evaluation: Improving the Flow of Information to Congress. PEMD-95-1. Washington, D.C.: GAO, January 30, 1995.
U.S. General Accounting Office (2000). Managing for Results: Views on Ensuring the Usefulness of Agency Performance Information to Congress. GAO/GGD-00-35: Washington, D.C.: GAO, January 26, 2000.
U.S. General Accounting Office (1996). Cholesterol Treatment: A Review of the Clinical Trials Evidence. GAO/PEMD-96-7. Washington, D.C.: GAO.
U.S. General Accounting Office (1993). Developing and Using Questionnaires. GAO/PEMD-10.1.7. Washington, D.C.: GAO, 1993.
U.S. General Accounting Office (1992). The Evaluation Synthesis (PEMD-10.1.2). Washington, D.C.: GAO.
U.S. General Accounting Office (1991). Designing Evaluations (PEMD-10.1.4). Washington, D.C.: GAO.
U.S. General Accounting Office (1991). Using Structured Interviewing Techniques (PEMD-10.1.5). Washington, D.C.: GAO.
U.S. General Accounting Office (1990). Prospective Evaluation Methods: The Prospective Evaluation Synthesis (Transfer Paper 10.1.10). Washington, D.C.: GAO.
Vedung, E. (1997): Public Policy and Program Evaluation. New Brunswick, NJ: Transaction Publishers.
Worthen, B. R./ Sanders, J. R./ Fitzpatrick, J. L. (1997): Program Evaluation: Alternative Approaches and Practical Guidelines (2nd Ed.). White Plains, NY: Longman.
Webb, E. J./ Campbell, D. T./ Schwartz, R. D./ Sechrest, L. (2000): Unobtrusive Measures (revised edition). Sage Classics 2. Thousand Oaks, CA: Sage.
Webb, E./ Campbell, D. T./ Schwartz, R. D./ Sechrest, L. (1966): Unobtrusive Measures: Nonreactive Research in the Social Sciences. Chicago: Rand McNally.
Weiss, C. H. (1987): Where politics and evaluation research meet. In: Palumbo, D.J. (Ed.): The Politics of Program Evaluation. Thousand Oaks, CA: Sage. pp. 47–70.
Weiss, C. H. (1998): Evaluation (2nd Ed.). Upper Saddle River, NJ: Prentice Hall.
Wholey, J. S./ Hatry, H. P./ Newcomer, K. E. (Eds.) (1994): Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass.
Wholey, J. S. (1994): Assessing the feasibility and likely usefulness of evaluation. In: Wholey J. S./ Hatry, H. P./ Newcomer, K. E. (Eds.): Handbook of Practical Program Evaluation. San Francisco: Jossey-Bass. pp. 15–39.
Yin, R. K. (1994): Case Study Research: Design and Methods. Applied Social Research Methods Series, Vol. 5. Thousand Oaks, CA: Sage.
© 2000 Leske + Budrich, Opladen
Caracelli, V.J. (2000). Methodology: Building Bridges to Knowledge. In: Stockmann, R. (eds) Evaluationsforschung. Sozialwissenschaftliche Evaluationsforschung, vol 1. VS Verlag für Sozialwissenschaften, Wiesbaden. https://doi.org/10.1007/978-3-322-92229-8_7
Print ISBN: 978-3-322-92230-4
Online ISBN: 978-3-322-92229-8