By Adam Naftalin-Kelman
and Beth Cousens
So, you run a program, and your colleagues ask you the next morning, “How did it go?”
What do you answer?
We wager that most of us give the following answers, in the following order:
- We got more people than we thought we would.
- The energy in the room felt good.
- The food worked well.
- The speaker did what we asked.
- No one complained! Not even Mr. and Mrs. Kvetcher, who always have something to say.
Maybe, maybe, at some point, we say, “And I think the people who came really got something out of it.”
Each of those answers is an important answer. It matters how many people were there, and how the room felt, and if the environment was good and the participants happy. Ultimately, though, that last point – at least, in Jewish education – matters most: What was the program’s influence on participants? Did we do what we set out to do? Did we change anyone’s Jewishness – did anyone grow Jewishly because of us?
Unfortunately, that last point is also the hardest to measure. Establishing what we set out to do in formal Jewish education settings is often complex, and evaluating it can be slippery as we try to develop measures for what seems highly personal. Adding the variable of informal Jewish settings, with their socio-emotional and other affective agendas, only compounds the problem. Still, in an increasingly demanding philanthropic marketplace, with board members, foundations and supporters caring deeply about the impact of their investment, it is our responsibility to demonstrate that impact. We need to move beyond our ‘feelings,’ anecdotal assessments or purely numerical accounts of people in chairs. We need to be able to say with authority, integrity, and even some degree of empirical certainty that we are doing great work.
We also have a responsibility to ourselves and to those who participate in our experiences – in our case, students – to understand whether what we’re doing is working. We move so quickly during a program year, continually generating and implementing new ideas, pushing programs out the door, often without more than a few minutes to reflect. We collect intuitive and anecdotal evidence – we might leave a conversation with a student, for example, and feel that we nailed it. But comprehensively, scientifically, are we really making a difference? And what is it that we do that makes a difference?
Impact evaluation is not foreign to Jewish education. Philanthropic organizations – the Jim Joseph Foundation, the Schusterman Foundation, the Covenant Foundation, and others – have been exploring the influence of their programs on participants (among other questions) for many years. Hillel as a movement is beginning to do this work with its Measuring Excellence project. Still relatively rare, though, is an organization taking on this responsibility itself, asking these questions not just about one program but about all of its programs, all of its annual efforts. Driven by our own need for data and by our responsibility to supporters and participants, Berkeley Hillel took on such a project.
Working together, as Hillel’s Executive Director (Adam) and a consultant with expertise in this area (Beth), and with lay leaders and students, we launched a three-fold program of measurement, including:
- An annual survey of students, intended for Jewish undergraduate and graduate students but that would collect some data from all students;
- A series of focus groups, impaneling a group of first-year students each year and asking them to return annually, no matter their Jewish involvement, allowing us to build a longitudinal study of their Jewish experience through their UC Berkeley years (so that by the fourth year, we would have four series of focus groups); and
- Registration of students at as many programs as possible, “tracking” students’ participation.
The survey and focus group protocols were based on Berkeley Hillel’s organizational theory of change, which unites everything that Berkeley Hillel tries to do, all of its strategic efforts and programs. Together, these three methods constitute a comprehensive program of measurement and evaluation, allowing Berkeley Hillel to understand the totality of the student experience (through the survey), the texture of the student experience (through the focus groups), and student participation trends (through student registration). These methods allow for an understanding of influence and also for an understanding of the programs and efforts that create that influence.
What have we learned through this measurement project – not about Berkeley Hillel’s influence (for a report on the data, feel free to email us!), but about the project of measurement itself?
- Response rate means (almost) everything. We can have the most thorough survey and the strongest focus group questions, but without a tremendous effort to find Jewish and non-Jewish student respondents, we cannot analyze the data in deep and interesting ways. Our survey has had a decent response rate, and we have even had respondents both with and without involvement in Hillel and prior Jewish experiences. However, while we can do a great deal of analysis with the entire dataset, we often want to study small sub-populations (such as in-state students or participants in Alternative Breaks), and the response rate is too low to yield samples of meaningful size in each of these areas. This prevents more complex learning about these various populations.
- This effort benefits from collaboration. It might seem like a staff-driven effort, with deep subject matter expertise from outside of Hillel and leadership from the staff whose work is most affected. But Hillel’s volunteer leaders and students have their own questions about Hillel that they want to pursue, along with intuition and experience about how to ask them. And the more perspectives involved in looking at the data, making connections, and asking questions, the richer the analysis can be. We also worked with subject matter experts on campus – professors with decades of sophisticated experience in studying psychological attitudes and in polling – and we consulted with the university’s expert in student data. (We borrowed some questions from the UC Berkeley student survey, wanting to know the answers for Jewish students and using them to gauge the strength of our sample.)
- The work – not just of measurement, but the work itself – is complex. We have produced empirically-based “answers” for supporters and we have learned a great deal for ourselves, about the success of Hillel’s work and about where Hillel can be more effective. At the same time, the focus groups make particularly evident the many twists and turns the college experience holds.
There is no certainty during the college years. College marks a significant time of personal exploration and self-determination. For many students these years mark the first time away from home, the first time making personal choices about who they are and what defines them. Developmental psychologist Jeffrey Arnett writes, “Sweeping demographic shifts have taken place over the past half century that have made the late teens and early twenties not simply a brief period of transition into adult roles but a distinct period of the life course, characterized by change and exploration of possible life directions.”
That is all the more reason why this project is important: it reveals how critical the college years are to individuals’ Jewish journeys, and thus how much we need to truly measure the effectiveness of our work.
Jeffrey Arnett, “Emerging Adulthood: The Winding Road from the Late Teens Through the Twenties” (Oxford University Press, 2004).
Adam Naftalin-Kelman is Executive Director of Berkeley Hillel and Beth Cousens is Principal of Beth Cousens Consulting.