Where to Begin and How to Proceed

  • Robert L. Schalock

Overview

This chapter assumes that you are an OBE producer who wants to know where to begin evaluating your program and how to proceed in a logical way. Most of my program administrator friends either want or need to evaluate their programs, but are often unsure what this actually involves. In this regard, there are two questions that you will want to ask yourself. First, “For what purpose will I use the outcome evaluation data?” Answering this question requires that you be proactive and know where you are going with your evaluation. As we saw in Table 1.2, the three primary purposes of OBE are effectiveness analysis, impact analysis, and benefit-cost analysis.

Second, “What data will I need for the intended use?” Answering this question requires logical thinking. The tendency here is to ask questions that the education or social program and its data system are unable to answer. For example, many program administrators want to know whether their program is better than an alternative, or whether it is cost-beneficial, without being able to form comparison conditions or having the cost and outcome data required to answer the question. If, on the other hand, you want to use the data on an ongoing basis for reporting outcomes, longitudinal evaluations, program change, or policy evaluations, then you will need to be sure that your data management system has the capability for such efforts. And that is one of the primary purposes of this chapter: to familiarize you with the relationship between outcome-based evaluation data use (Question 1) and data requirements (Question 2).
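To make this relationship concrete, here is a minimal sketch in Python (the record fields, purpose names, and helper below are illustrative assumptions, not taken from the chapter) of how a data management system might check whether the data it captures can support a chosen evaluation purpose:

from dataclasses import dataclass, field
from typing import Optional

# One record per service recipient; the fields are hypothetical examples
# of the data elements an OBE data management system might capture.
@dataclass
class PersonRecord:
    person_id: str
    outcomes: dict                           # valued, person-referenced outcomes
    cost: Optional[float] = None             # per-person cost, needed for benefit-cost analysis
    comparison_group: Optional[str] = None   # needed for impact analysis
    measurement_dates: list = field(default_factory=list)  # needed for longitudinal evaluation

# Question 1 (intended use) mapped to Question 2 (required data elements).
REQUIREMENTS = {
    "effectiveness": ["outcomes"],
    "impact": ["outcomes", "comparison_group"],
    "benefit-cost": ["outcomes", "cost"],
    "longitudinal": ["outcomes", "measurement_dates"],
}

def missing_elements(records, purpose):
    """List the data elements the system still lacks for the stated purpose."""
    missing = set()
    for record in records:
        for element in REQUIREMENTS[purpose]:
            if not getattr(record, element):  # None, empty dict, or empty list all count as missing
                missing.add(element)
    return sorted(missing)

records = [PersonRecord("P-001", outcomes={"employment": True})]
print(missing_elements(records, "benefit-cost"))   # ['cost']: this question cannot be answered yet
print(missing_elements(records, "effectiveness"))  # []: effectiveness analysis is supported

The point of the sketch is simply that the intended use must be decided first, because it determines which data elements the system has to capture.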

The chapter begins with a brief overview of what I propose as “OBE guiding principles.” The chapter is then organized around the OBE model presented in Figure 1.2, with its major components of mission/goals, services, person-referenced outcomes, formative feedback, and uses of OBE data. These model components translate into four action steps that make up the major sections of the chapter: (1) develop the program’s mission and goals; (2) provide services that are consistent with the program’s mission and goals; (3) select valued, person-referenced outcomes; and (4) establish a data management system.

Keywords

Data Management System, Mission Statement, Vocational Rehabilitation, Internal Evaluation, Formative Feedback


Additional Readings

  1. Armstrong, M. I., Huz, S., & Evans, M. E. (1992). What works for whom: The design and evaluation of children’s mental health services. Social Work Research and Abstracts, 28(1), 35–41.
  2. Bolton, B. (1987). Outcome analysis in vocational rehabilitation. In M. J. Fuhrer (Ed.), Rehabilitation outcomes: Analysis and measurement (pp. 57–69). Baltimore: Brookes.
  3. Brown, A. C. (1993). Revitalizing “handicap” for disability research: Developing tools to assess progress in quality of life for persons with disabilities. Journal of Disability Policy Studies, 4(2), 57–76.
  4. DeStefano, L. (1986). Designing and implementing program evaluation. In F. R. Rusch (Ed.), Supported employment (pp. 229–247). Sycamore, IL: Sycamore.
  5. Eisen, S. V., Dill, D. L., & Grob, M. C. (1994). Reliability and validity of a brief patient-report instrument for psychiatric outcome evaluation. Hospital and Community Psychiatry, 45(3), 242–247.
  6. George, M. P., George, N. L., & Grosenick, J. K. (1990). Features of program evaluation in special education. Remedial and Special Education, 11(5), 23–30.

Copyright information

© Springer Science+Business Media New York 1995

Authors and Affiliations

  • Robert L. Schalock, Hastings College, Hastings, USA
