Opinion

A Behind-the-Scenes Look at an Evaluators Consortium

By Sandy Edwards and Stacie Cherner

In the midst of its tenth year, the Jim Joseph Foundation has created what might be called “a family of beneficiaries.” There are young Jews who have, for example, traveled to Israel on Birthright, lived in a Moishe House, enrolled at HUC-JIR, Mechon Hadar, or other educational institutions, and perhaps earned credentials as expert Israel educators through a program with the iCenter. All of these exceptional institutions and organizations are grantees of the Foundation.

Now, a new “family” is developing. And while these family members often operate behind the scenes, we believe they are equally important to the Foundation’s pursuit of its mission. This family comprises a small number of highly skilled evaluators and researchers, with whom the Foundation works as a “consortium” of evaluation advisors and providers. The “consortium” members, brought together as an experiment, complement one another’s strengths even as they all reach a consistently high level of execution. They operate collaboratively to share data, instruments, and best practices – and yet also compete as they bid for particular contracts to evaluate Foundation-supported initiatives and grantees.

There are very real benefits to forming this type of consortium that brings together – and keeps together – experts to work towards goals that undoubtedly will take years to accomplish.

At its inception, the Consortium was tasked with “framing and naming” the varieties of “Jewishness” and the parsing of those characterizations to develop a shared approach to measurement and documentation. This would ideally lead to individual evaluations becoming an extended family of connected and commensurable investigations. But, over a series of consultations, the Consortium’s goal evolved into something even bigger and more impactful: moving toward a common set of measures (survey items, interview schedules, frameworks for documenting distinctive features of programs) to be developed and used as outcomes and indicators of Jewish learning and growth for teens and young adults. No easy task, to say the least.

To further reflect on and pursue this new goal, the Consortium again convened last month. Members analyzed and discussed surveys of Jewish teens and young adults, which are being developed concurrently by a team that includes several members of the Consortium, with funding from the Foundation. This includes the American Institutes for Research’s collaborative work with The Jewish Education Project and Rosov Consulting to develop a teen survey, a groundbreaking piece of work that is part of the cross-community evaluation of the Community Based Teen Education Initiative.

During the meeting, each Consortium member shared their work – from the teen surveys, to a survey being developed for Hillel, to the extensive work measuring Jewish learning and growth that resides in the NYU Berman Center’s Jewish Survey Question Bank. Taken together, these efforts are the building blocks for a common set of constructs and survey items for Foundation grantees. Their development would be a significant step forward for the field of Jewish education and for those who seek to determine whether teen, college student, and young adult education and engagement initiatives – across different communities and different organizations – are truly effective.

Why is this important? Currently, even simple survey questions are not asked in uniform ways that would allow the Foundation (or the field in general) to look across populations (for example, participants in different programs, or teens in different communities) or to track participants across their many experiences (for example, Jews who participate in BBYO, then Hillel, then Moishe House). Complex outcomes related to Jewish learning and growth are not defined by similar metrics. All of this limits the Foundation’s ability to understand more deeply the outcomes achieved by the organizations it funds.

We are excited to report that the Consortium is moving the needle in this important direction. The Foundation is in essence relying on the Consortium to support an effort to develop a coherent, interesting, persuasive and evidence-based account of what they, as evaluators, have learned about the Jewish learning and growth displayed by the “family of beneficiaries” described above: the teens, college students, and young adults who participate in the programs supported by the Foundation. Chip Edelsberg recently discussed Leap of Reason’s Performance Imperative, which offers social-sector organizations information, metrics, and tools to both measure and achieve “high performance.” The Foundation, too, strives to achieve more meaningful, measurable change, and the Consortium’s success is critical to the Foundation’s ability to hold itself accountable and to determine the success of our grantmaking strategies.

Over the next few months, the Foundation, with the Evaluators Consortium, will think deeply about the “sausage making”-type work of developing cross-community and cross-age evaluation metrics and survey tools. The Consortium will draw on the remarkable collective expertise of its members to develop a plan that builds on the teen and young adult Jewish learning and growth outcomes already known. Big questions need to be answered: What does an ideal “report” on the Foundation’s contribution to Jewish learning and growth for teens, college students, and young adults look like? How similar do survey items need to be? How would the strategies, models, and programs be documented and described to enable an understanding of survey results?

As the Consortium moves forward, the Foundation will look to share insights and important lessons learned with the field about how the Consortium members work together – and how their work is progressing. Since the Foundation’s inception, it has awarded almost $9 million towards evaluation of grants and initiatives. We hope to see tangible outcomes from the Consortium’s efforts that will leverage these dollars as effectively as possible, including:

  • A plan for researchers, funders and practitioners to agree on common constructs;
  • The development of a set of standardized questions that can be utilized across the Foundation’s portfolio of grantees;
  • Field testing of a “universal toolkit” for collecting data on common outcomes and demographics;
  • A plan for longitudinal testing, along with recommended resources to disseminate and encourage the use of universal sets of tools.

This is an exciting moment of opportunity, bringing together numerous organizations and initiatives. We are in a better position than ever before, thanks to the Evaluators Consortium, to develop the right mechanisms and systems for doing this work. Too much is at stake to let the moment pass.

Members of the Evaluators Consortium include Professor Steven M. Cohen, Hebrew Union College-Jewish Institute of Religion; Ellen Irie, Principal, Informing Change; Yael Kidron, Ph.D., Principal Researcher, American Institutes for Research; Ezra Kopelowitz, Ph.D., Chief Executive Officer, Research Success Technologies; Alex Pomson, Ph.D., Director of Research Evaluation, Rosov Consulting; Wendy Rosov, Ph.D., Principal, Rosov Consulting; Mark Schneider, Ph.D., Vice President, American Institutes for Research; and Lee Shulman, Ph.D., President Emeritus, The Carnegie Foundation for the Advancement of Teaching, and Charles E. Ducommun Professor of Education Emeritus, Stanford University.

Sandy Edwards is Associate Director of the Jim Joseph Foundation. Stacie Cherner is a Program Officer of the Foundation.