by Dr. Renae Cohen and Dr. Shira Rosenblatt
We were pleased to read the August 29th issue of eJewishPhilanthropy, which focused on the need for data to inform communal dialogue in a number of areas and highlighted some of the good work of JData. Using data to inform decision-making has been at the forefront of the work of JESNA’s Berman Center for Research and Evaluation for nearly 20 years. Beyond basic data, we also emphasize the importance of quality evaluation research, which can tell us what impact programs have on participants. The results of evaluation at different stages in a program’s life can inform and improve program planning and decision-making, and may point to promising directions for funding. It is our responsibility to continue to make the evaluation data we collect relevant, meaningful, and accessible.
Our key point is that while good empirical data is crucial, it is strengthened by an accompanying commitment to evaluation. Evaluation can be implemented in a number of key ways, among them the following:
- Evaluation from the design stage: Optimally, key stakeholders consider evaluation even before a program gets off the ground. Thinking “evaluatively” from the inception of a program’s design gives organizations the potential for the greatest impact. By pairing the design phase with a clear outline of what one really wants to learn and how one will learn it, and by ensuring a feedback loop for incorporating those learnings, an organization capitalizes on the benefits of evaluation. Evaluation at this stage may include identifying and articulating outcomes: immediate, short-term, and even long-term. It can also set the stage for designing feedback forms and/or conducting focus groups and interviews that help the organization understand the immediate takeaways of its program, what is working well, and what can improve. Ultimately, this design would ideally include measures to evaluate impact on participants at critical junctures after the program ends.
- Evaluation through capacity-building: Foundations and funders often offer small- to mid-size grants that include an evaluation requirement for the program- or year-end report. Many grantee organizations lack staff with the skills to conduct meaningful evaluation, and may not have sufficient funds to hire an external evaluator. A model that can be particularly effective in this situation is to offer evaluation workshops combined with a limited number of coaching hours per grantee. In this way foundations and funders can be confident that they will receive more useful data about program impact. They will also know they are building their grantees’ knowledge, skills, and capabilities: more tools to design and conduct evaluations, a habit of thinking evaluatively overall, and more resources on hand for evaluation in the future.
- Evaluation of community initiatives: Evaluators usually focus their attention on individual programs and their impact. Sometimes, however, there is a desire to go beyond individual programs and broaden the question to the complexity of a community initiative made up of multiple programs, all aiming to affect one target population, such as families with young children. This innovative approach may involve frequent, collaborative conversations to better understand the data being collected, the way the parts of the whole interact with one another, and the impact on the whole, with the understanding that impact is coming from multiple directions.
These are just a few of the ways that program staff, funders, foundations, communities, and organizations can capitalize on the evaluation process. Let’s collect strong, reliable empirical data, but let’s also undertake rigorous evaluation. And let’s not do evaluation merely because we are required to – let’s maximize our learning along the way.
Dr. Renae Cohen is Director, and Dr. Shira Rosenblatt is Associate Director, of JESNA’s Berman Center for Research and Evaluation.