by Amy Braier (née Philip)
Like many foundations, we wrestle on an ongoing basis with issues of evaluation, and the Jewish community is no exception. The Holy Grail of evaluation seems to be evidence: we want evidence that the work we are funding is having a positive impact, and evidence that we as funders are making a difference.
However, in order to obtain this evidence, we and the organisations we fund are increasingly required to interpret, and sometimes commission, quite complex research. For funders who do not dedicate a staff post to impact and evaluation, and for those of us who do not have a background in social sciences or research methods, this can be a daunting prospect.
Do we really know what we are looking for? Do we know whether the methods used to gather that evidence were appropriate? How do we interpret it? And can we tell the difference between good and bad evidence?
We also need to consider the relative value and quality of different types of evidence, for example academic research versus research carried out by think tanks or independent consultants.
Finally, we should ask ourselves whether we are looking for genuine evidence of impact, which may or may not exist and may not tell us what we want to hear, or for what Dennis Collins describes in The Art of Philanthropy as ‘proof that things worked out the way we intended’. Do we fully understand what kind of research will produce the former, and what kind will lead to the latter?
A few years ago we worked in partnership with a grantee to commission an impact analysis of their work. The tender was won by a British university. At the end of the project, we were presented with a 216-page report with recommendations, 120 pages of statistics and a volume of appendices. It took the grantee many months to go through such a large data set, pull out what was important to them and communicate it to their various stakeholders.
The volume and complexity of the data raised important questions for me too: How much time should I as the grant manager devote to getting to grips with the report? To what extent should I rely on the grantee’s presentation of the evidence? And how confident did I feel to interpret the data myself?
We sent a copy of the research to another academic expert who challenged our researchers’ conclusions, and again I had to question whether I felt equipped to look critically at the evidence and decide between contradictory interpretations.
The project was still a great success. The charity was better able to articulate the benefits of its work, the research strengthened its reputation and helped embed evaluation within the organisational culture. However, this experience illustrates some of the difficulties we face when it comes to analysing evidence.
So here are my three recommendations for how we can become more ‘evidence literate’.
- Create more opportunities for professional development
I have learned through experience, but there should be more opportunities for professional development focused on the skills non-researchers need: commissioning research, interpreting and critically analysing evidence, and using it to inform philanthropic decisions.
- Make research more accessible
We should insist on good dissemination strategies as part of any research brief. We should also ensure that existing research is communicated to funders and grantees in an accessible way, making it easy for them to understand what it is telling them and to apply it in order to improve their practices and impact.
- Encourage dialogue between funders, academics and charities
We must do more to bridge the gap between research and practice. Funders and charities should create more opportunities to get together to present evidence and learn from existing research in shared areas of interest. This can be difficult to achieve in a context where charities are competing for resources, but evidence should be about improving outcomes for a whole sector and we as funders are in the perfect position to convene such conversations.
By equipping ourselves with the skills to commission robust evidence, critically examine the data presented to us and ensure that it is fed into practice on the ground, we will be able both to advance the causes we are funding and contribute more confidently and effectively to the impact debate.
Amy Braier (née Philip) is Deputy Director of Pears Foundation.