Jump-Start Your School's Program Evaluation: Part 1

Do the words "program evaluation" strike fear in the hearts of your school's staff? Many schools have so many programs, strategies and practices underway that they are overwhelmed by the prospect of assessing their effectiveness. Yet evaluating impact is the best way to avoid wasting time and money.

To support schools in this task, EducationWorld is pleased to present professional development resources including article content and a planning worksheet shared by The Governor's Prevention Partnership, a Connecticut-based nonprofit organization.

Want to dig deeper? See Part 2 of this article for The Partnership's training exercise, which guides staff in shaping a school's comprehensive program evaluation plan.

The Governor’s Prevention Partnership

A nonprofit with a mission to keep Connecticut’s youth safe, successful and drug-free, The Partnership helps schools, communities, colleges and businesses create and sustain quality prevention programs. The organization also provides resources for school bullying prevention.


Gone are the days when “feeling good” about your efforts is sufficient justification for continuing expensive and labor-intensive activities.

Following are five steps to help you ease into the process of program evaluation. Ideally, your school should have a plan for evaluating not only large, complex programs, but also smaller and simpler strategies and practices.

1. Define Terms

The first step is defining terms. In this article, "effort" is the generic term used to refer to any program, strategy or practice. Efforts are broken down into three types, as follows.

Programs are defined as high-effort undertakings based on a set of packaged resources that outline a series of prescribed activities. One example is the Olweus Bullying Prevention Program. Quality programs are evidence-based, meaning that prior research has indicated their effectiveness. In addition, programs often include evaluation tools such as surveys.

A strategy is less prescribed than a program and refers to any well-planned activity that aims to accomplish a goal or solve a problem. Differentiated instruction is an example of such a strategy. Strategies should also be evidence-based, although typically a looser set of best practices, rather than prescribed activities, guides day-to-day efforts.

Finally, a practice consists of an informal, lower-effort activity that is nonetheless aimed at a specific goal and consistent with a school's strategies and programs. An example is requiring teachers schoolwide to stand in the hallways during student passing time, in order to strengthen connections with students and encourage positive behavior.

This article will use "program evaluation" as a generic term referring to evaluation of any of the above types of efforts.

2. Get the Lay of the Land

Gather your school Data Team--the group that is responsible for managing data collection and analysis. You may call this a Response to Intervention (RTI) Team, Positive Behavioral Interventions and Supports (PBIS) Team, Student Intervention Team, or even School Climate Team. What's important is that team members have the necessary expertise to address both academic and social-emotional programs, practices and strategies in your school. Members might include:

  • the principal
  • special education director
  • literacy coach
  • one or more classroom teachers (perhaps one representing each grade level)
  • school counselor
  • school psychologist or social worker
  • speech pathologist, occupational therapist or other specialists
  • point person for bullying (if your state requires such a person)
  • data analyst or database manager

Some teams even include parents and students.

Many resources are available to help guide the work of the Data Team. Examples include:

Top Five Tips for Effective Data Teams
Best Practices for Response to Intervention (RTI) Teams
RTI Data Team Process
Student Intervention Team Manual

The next step is having the team turn a critical eye toward all programs, practices and strategies in your school that are aimed at addressing barriers to learning--in other words, efforts that attempt to enhance, improve, supplement, remediate or prevent something. Ultimately you will want to catalog these many efforts, making a concrete list. Schools generally do a better job tracking academic programs, practices and strategies than they do tracking social-emotional or school-climate efforts. It is important, however, to include social-emotional efforts in your list. Likewise, you should include school-wide efforts as well as those aimed at individual struggling students. Some questions to consider when reviewing existing programs, practices and strategies:

  • When did each effort start?
  • What level of time, manpower and dollars does each require?
  • What is the purpose/goal of each?
  • Where do efforts fall in terms of the three-tiered classification system (universal, secondary and tertiary) used in the Response to Intervention (RTI) and Positive Behavioral Interventions and Supports (PBIS) frameworks?
  • Are there combinations of larger and smaller efforts that can be grouped because they share a common goal?

3. Do a Little Pruning

Armed with an exhaustive list of every effort in your school, small or large, you'll be able to identify fragmented and poorly coordinated efforts and eliminate those that sap too many resources or are not aligned with any concrete goal. Remember that if you consider a program, strategy or practice not important enough to spend time evaluating, the effort may not be worth continuing to implement. Cross items off the list as you go, eliminating time-wasters, duplicative activities and efforts that cannot be evaluated.

NOTE: In addition to academic support, key dimensions of school climate identified by the National School Climate Center include (1) Physical and Emotional Safety, (2) Teaching/Learning of Social, Emotional and Civic Skills, (3) Strengthening Interpersonal Relationships and (4) Physical Surroundings and School Connectedness. If you find that any of your school's efforts do not relate either to academic support or one of the four school climate dimensions, it may indicate that these efforts are not worth continuing. The dimensions are included for reference in The Governor's Prevention Partnership's Evaluation Planning Worksheet, which you'll use in the next step.

Once you've done some "pruning" of your list, you'll be ready to evaluate the remaining efforts. The pruning process may also have uncovered some gaps that you intend to fill with future programs, practices and strategies.

4. Plan to Evaluate Everything on the List

In order to paint the most complete picture, a comprehensive school evaluation plan should cover every effort on your list (hopefully you've shortened this list with some thoughtful pruning). The Governor's Prevention Partnership's Evaluation Planning Worksheet can help you get the process started. The worksheet walks you through identifying the purpose of each effort and choosing short-term and long-term methods of measuring and tracking its impact on students. Consider asking each member of your Data Team to complete the worksheet for a different type or group of programs, strategies and practices.

The sophistication level of evaluation for each effort should reflect the sophistication level of the program, practice or strategy. Here is an example: Evaluating a formal effort such as the Olweus Bullying Prevention Program will involve administering long annual school-wide surveys to multiple informant groups and then analyzing the resulting data report. Another example: Evaluating a more informal practice of holding student advisory periods (to promote positive school climate and student social-emotional development) might require only a five-question student survey or quick student feedback session twice a year.

Efforts that you have grouped together in step 2 might share a common method of measurement. For instance, you might use the same school-wide survey to assess both the impact of staff bullying prevention training and the impact of a new set of school-wide rules that you implemented at about the same time.

Remember: It's important to take a baseline measure of a problem before implementing new efforts to address that problem. If this is not possible because an effort has been in place for some time without being evaluated, put the measurement instrument or mechanism in place as soon as possible and begin to plan ahead for the next data collection, which will provide a point of comparison.
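For teams that track simple counts or survey scores over time, the baseline comparison described above can be done with a few lines of arithmetic. Below is a minimal sketch in Python; the "hallway incidents" scenario and all numbers are hypothetical, invented purely for illustration:

```python
from statistics import mean

# Hypothetical example: weekly counts of reported hallway incidents,
# collected before (baseline) and after a new schoolwide practice began.
baseline = [14, 11, 15, 12, 13]   # five weeks before the change
followup = [9, 10, 7, 8, 9]       # five weeks after the change

baseline_avg = mean(baseline)
followup_avg = mean(followup)
percent_change = (followup_avg - baseline_avg) / baseline_avg * 100

print(f"Baseline average:  {baseline_avg:.1f} incidents/week")
print(f"Follow-up average: {followup_avg:.1f} incidents/week")
print(f"Change: {percent_change:.0f}%")
```

A simple before-and-after average like this can flag whether an effort appears to be working; as noted below, an evaluation expert can advise on whether a change of a given size is statistically meaningful.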

It's a good idea to consult an evaluation expert at the point that you're ready to choose your methods of measurement. An expert can help you choose good measurement tools (e.g., not all surveys are created equal, and there is some skill involved in conducting a focus group). He or she can also plan optimal data collection times and determine how data from different time points will be compared (e.g., will statistically significant differences be the standard for judging change?).

5. Complete Your Evaluation Plan

Keep in mind the best practices below as you use the Evaluation Planning Worksheet to flesh out your evaluation plan. You likely will not apply every best practice to evaluation of every effort, but taken as a whole, your plan should do the following:

  • Schedule the evaluation in advance and measure at the right time to “catch” good effects. If possible, conduct one or more “baseline” (pretest) measurements as well as several posttest measurements spaced over time.
  • Be feasible (someone has the time to collect necessary data on an ongoing basis; school data team is in place to help plan and analyze).
  • Measure at multiple levels (individual, small-group, class, grade and population/school levels). Measure at the group level to evaluate efforts that reach smaller groups; measure at the population level to evaluate efforts that reach the entire school. (Measurements at the population level tend to be done less frequently [e.g., annually or every other year] compared to measurements at other levels.)
  • Use multiple informants (students, parents and teachers).
  • Use multiple formal and informal data collection tools (e.g., observation, record review, survey, interview).
  • Track process indicators (reflect upon how implementation is going) so that corrections can be made early in the process.
  • Track both short-term and long-term outcome indicators (assess what immediate and longer-term effects efforts are having on students).
  • Collect subjective, self-reported and qualitative outcome data as well as objective, observable, quantitative outcome data, using instruments with established reliability and validity.

Related resources

See Part 2 of this article for The Partnership's training exercise. The exercise, sure to be a conversation starter, asks participants to critique a fictional school's evaluation plan and suggest improvements.


Article by Celine Provini, EducationWorld Editor
Copyright © 2011 Education World