By Joshua Donner, Alex Pomson and Ilan Wagner
Are program providers from Venus and evaluators from Mars? Sometimes it seems that way, with educators focused on the quality of the experience and evaluators on quantifying the outcomes produced. Even when reality differs from this caricature, the pressing responsibilities of each party can form two separate universes of concern: one focused on process, the other on product.
We offer the case of Onward Israel as an example of how such relationships can be genuinely productive, how what is learned through evaluation can inform a program’s development, and how educators and evaluators can work hand in hand, held together by a shared concern with data.
Onward Israel is an innovative partnership between The Jewish Agency for Israel’s Unit for Educational Experiences, the Beacon and Shapira Foundations, and a host of local, national and international Jewish organizations such as Federations and Hillels. Onward Israel was conceived to provide a new type of Israel experience that would significantly influence the personal and professional trajectories of Jewish young adults from North America, Europe, and the Southern Hemisphere. Living in Israel for six to ten weeks, the participants, mainly university students, spend the majority of their time in professional internships with Israeli businesses and organizations, or in other résumé-building experiences. Their remaining time (about 20%) is spent participating in educational programs or using free time to travel and pursue interests and connections. The program’s “special sauce” – at least as hypothesized in its design phase – comes from the space made for an unmediated, authentic encounter with Israel, and with Israeli colleagues and peers.
Since its launch in 2012, the program’s enrollment has doubled each year; nearly 900 young adults from North America, Europe and the Southern Hemisphere participated in 2014.
From the start, even before the first participants arrived in Israel, Onward Israel’s organizational architecture included a strong evaluative component. Over the program’s first three years, the equivalent of 10% of program development and management costs was devoted to evaluation. This significant investment provided the first opportunity to fully apply The Jewish Agency’s outcomes evaluation matrix – measuring a program’s impact on participant affect, behavior and cognition regarding Israel, Jewish heritage, Jewish community and Jewish peoplehood. The matrix is a conceptual framework which, thanks to lessons from the Onward Israel experience, is now being applied to an increasing number of programs organized under the Agency’s umbrella, allowing for clarity of goals and consistency of indicators across various programs.
Evaluation work has been conducted by a team from Rosov Consulting, handily located both in the US (where most program “partners” are located) and in Israel (where all of the program providers do their work). The Rosov team entered the partnership with extensive experience in working with clients interested in nurturing data-informed program cultures.
At one level, the team’s remit was conventional: Provide evidence of the program’s impact on participants’ Jewish identity formation. The small group of entrepreneurial philanthropists who provided seed dollars for the program’s start-up phase – as well as many local communities who similarly allocated limited staff time and dollars to this grand experiment – sought and deserved a straightforward answer to this big question. (Fortunately, the answer was yes.)
If all of this sounds like conventional evaluation work, what is far from commonplace is how data from each cycle of evaluation have fed the progressive development of the program.
The evaluation team was tasked with exploring who Onward Israel participants are, and how variables such as motivation for participation, prior Jewish education, and previous time in Israel relate to identified changes. These understandings were used by Onward Israel management at The Jewish Agency, and shared with partner Hillels and Federations, to adjust marketing and program design and even to launch a new line of Onward Israel products.
The evaluation has repeatedly found that participants attribute heightened significance to self-initiated experiences, time spent at work placements, and interactions with Israeli peers. Based on this finding, program staff rebuilt the job description of counselors and facilitators. In addition to the traditional skills of group-building and conducting group sessions, Onward Israel now offers special training for staff to serve as guides and enablers of individual educational journeys.
Furthermore, the program’s emphasis on guiding and training participants to “tell their Israel stories” to others after the program has been fueled by clear evidence that confidence and readiness to engage others about Israel is one of the program’s most significant impacts.
When the evaluation revealed that outcomes related to Jewish heritage lagged behind those directly concerned with Israel, program organizers engaged in (still ongoing) deliberation and experimentation on how to close the gap: for example, by producing guidebooks to local synagogues or by making available host families with whom to spend part of Shabbat. These offerings are designed to help participants who are interested in deepening the heritage dimensions of their time in Israel, without compromising the experience – and therefore outcomes – for other participants.
The most recent cycle of evaluation included a second post-program survey, administered 10 months after alumni returned from Israel. It revealed greater changes in alumni attitudes and behaviors than were evident immediately after their return, even though participation in formal post-program experiences was limited. In particular, the study showed that while gains in knowledge occur right after the program, positive shifts in attitude and behavior become more pronounced at the 10-month juncture. The study also underscored the critical importance of social networks – comprising both peers and Israeli contacts – as a means of post-program engagement. This insight has led to a new Jewish Agency initiative of post-program engagement based on social media and affinity groups.
In our first three years, we not only confirmed our initial hypothesis – that Onward Israel works – but also used data to make the program better. Having achieved “proof of concept,” the team could have reduced its investment in evaluation. Instead, the Onward Israel advisory committee, led by seed funders including the Shapira, Koret, and Steinhardt Foundations and other anonymous donors, has decided to invest more in evaluation.
Onward Israel has set an ambitious goal of growing to 5,000 participants by 2019. To achieve this growth, it is important to understand which design elements contribute to impact, so these features can be retained and enhanced as the program scales. Having answered whether the program works, our new three-year evaluation strategy focuses on better understanding how it works. Continued investment in impact-related evaluation is critical to ensuring that the program’s aims continue to be advanced as it grows numerically.
In this new venture, The Jewish Agency for Israel and the Beacon and Shapira Foundations, together with Rosov Consulting, have established a data-informed, evaluation-based culture, designed to ensure their investments go that much further. The case of Onward Israel suggests that evaluation is in fact worth the money.
Joshua Donner is Executive Director of the David S. and Karen A. Shapira Foundation.
Alex Pomson is Director of Research and Evaluation at Rosov Consulting.
Ilan Wagner is the Director of Onward Israel and the Unit for Educational Experiences at The Jewish Agency for Israel.