Paying the piper, calling the tune

It’s an old adage: who pays the piper, calls the tune. In other words, if it’s your money, you get to choose. In general, it might be a good rule, but there are times when it’s worth reconsidering. Towards the end of last year, PSJP published a paper based on conversations with 14 civil society organizations in a search for El Dorado – or rather that other seemingly mythical quest, measuring social change. In webinars on the topic, three questions were discussed:

  • How do you measure social change in your organization?
  • Is there a difference between what you want to measure and what your funders think should be measured?
  • Are there things you would ideally like to measure but can’t?

Not surprisingly, the discussions revealed that question 2 is often the key one – and the general answer is ‘yes’.

Inside and out

Donors’ expectations are what one participant called ‘ridiculously simplistic’. They want something that seems easy to measure (even if it’s misleading in terms of what the project is trying to do), both because it is easy and because it looks as though it will be comparable with other programmes, which helps them to evaluate their overall effect – do the communities they work with have better healthcare, more crops to sell, schools for their children to go to, and so on? There is nothing wrong with those things, but this kind of measurement is too prone to treat people as units – components to be acted upon, assessed only for the external effects on them – whereas social change is often to do with internal differences: what people believe, how they act, and so on.

This is clearly illustrated by the PSJP webinars. A grantmaker supporting a mushroom-growing project with marginalized women in North East India is not so much interested in the number of mushrooms they produce as in ‘the change in attitudes and behaviour in the women and return on investment in terms of economic independence.’ What they are looking for is quality of life rather than units of production.

A long engagement, not a hasty marriage

It’s become a truism to say that social change is a complex, non-linear process – ‘a bumpy ride’, as one participant put it – and it takes time. We shouldn’t be surprised – and we probably aren’t – that donors want the fast food rather than the slow-cook. And, of course, it’s not just donor requirements that pose a problem. Giving an account of the impalpable is difficult anyway. ‘How do we measure [dignity] and adapt our framework accordingly?’ wondered one participant.

Two examples from the webinars illustrate these considerations. One of the participant CSOs in Mozambique is working in communities which are ‘certainly patriarchal and mostly polygamous’. The project is both affecting and affected by this state of affairs in many and intricate ways: ‘what we realize is that every time we peel off a layer of the onion of our understanding of that social dynamic, there is another layer underneath. This just means that we are getting deeper in our understanding but it’s extremely challenging.’

In terms of length of involvement, one environmental and social justice grantmaker works on a 10-year timescale, surveying grantees at different points in that period so it can assess both the immediate difference a grant makes and what happens five or 10 years later. These periodic surveys ‘are beginning to tell a story about why grassroots initiatives succeed and why they don’t.’

Just because it’s hard….

…doesn’t mean it’s impossible or that it shouldn’t be done. It helps, of course, if you’re looking in the right place. Too often, as noted, donors are asking for the wrong things from the organizations’ point of view. It’s like asking them to measure an earthquake with a barometer.

Nor does it mean that quantitative indicators should be ignored. Most of those who took part in the webinars combine quantitative indicators with qualitative ones, taking a cross-bearing so that one form of index can be checked against another. One INGO that works with marginalized communities in the Himalayas and in East Africa says ‘it is critical that we combine the two because we find that the quantitative elements are removed from the lived experience of the beneficiaries, so we need the human stories to give more detailed snapshots of how projects are really enhancing the well-being of beneficiaries.’

Stories

Among the more accessible of the qualitative means of cross-measurement, the importance of stories was highlighted by webinar participants, especially among grassroots and marginalized communities. ‘Stories in the words of the people that we support are also a very important aspect for us in measuring change,’ says one. ‘We invest in storytelling where we ask communities to write or work with a local storyteller, journalist or photographer who creates very short stories.’ It’s a point worth emphasizing. Stories convey images rather than explanation and, especially in communities where the predominant modes of transmitting and receiving information aren’t based on the written word and abstract thought, they speak more clearly. More than this, they humanize ideas in a way that neither arguments nor statistics do.

Quantitative approaches

As to the quantitative measures, a number of methods of social change measurement are outlined in the PSJP paper. These include outcome mapping, social return on investment (SROI), setting SMART goals (specific, measurable, achievable, relevant, time-bound) and a system devised by a number of Russian community foundations which assesses the work of community organizations on building assets and trust in local communities and strengthening their capacities and agency. Most of these get round the difficulty of making the intangible tangible by using indices of progress rather than attempting to measure progress itself. The Russian community foundation method, for example, uses the following as indices for the development of assets: the number of sources of funds of different types, the availability of different methods to the community foundation, and the number and variety of organizations and people contributing to the foundation’s work.

And there are others, too, among them methods which proceed by steps – suggesting a specious inevitability of result that hardened social change organizations might view with suspicion. Measurement Resources’ website, for example, proposes ‘five steps to measure social change’ which ‘will portray to your funder that your program is a safe investment and will produce what they want.’ US and UK-based advisory service InFocus proposes seven steps to measuring social change.

None of these systems is flawless or can be imported without modification. Both the five- and seven-step approaches, despite their deceptively simple logical set-up, raise a lot of questions.

For example, Step 3 of the InFocus model urges you to ‘Select/develop indicators that will identify what has taken place as a result of running your activities and to what extent.’ This is likely to be precisely what organizations are struggling with. In other words, while the steps methods make it look as though you can’t go wrong if you follow the recipe, finding the ingredients is likely to be hard. The SROI method works by assigning a monetary value to non-monetary phenomena – but who is to decide how much behavioural change £100, $100 or 100 rupees should buy? The Sopact website admits that assessing the indicators and outcomes necessary to determine what SROI proponents often call the social impact ratio is ‘a daunting mapping task’. (It’s also tempting to think that sometimes criteria are introduced simply for the sake of making an acronym.)

The point here – one that is also made in the PSJP paper – is that organizations (and funders) need to be flexible in the approach to measurement they use. Apart from the intrinsic difficulties of measuring social change, the situation in which organizations are working is often unstable, so you can’t impose an inflexible linear form of measurement. A group of donors working in conflict situations acknowledges the need ‘not to get ourselves tied down into a very rigid measurement of change because the macro politics in our situation change at such a rate.’ And again, ‘We don’t always know at the beginning what we want to measure so having some flexibility and not being stuck is important.’

There are other points that stand out. The central one for the purposes of this post is this: webinar participants stress the importance of asking communities what changes they want to see and what’s significant for them. A corollary of this is making sure you aren’t just listening to the more vocal and articulate, even within marginalized groups (‘We make sure that the voices we hear are not just of the leader activist,’ says one participant). And, as a blog on the Etherga Social Change network argues, any measurement has to be usable by and understandable to those being measured.

The key relationship?

To those being measured… these things may seem self-evident, yet they aren’t always done. It’s the reappearance of the old bogey – the nexus between money and authority. As with any number of other issues in philanthropy, the main head-to-head in the measurement question tends to be between funder and implementing organization, yet the arena where this plays out is the affected community. The two relationships – funder and grantee, grantee and community – are being treated as though they were somehow equivalent. If the funder is happy with the results, then the community will be too… won’t it?

Those at the sharp end of a problem know best how to deal with it. At the very least, they can tell when it’s being dealt with effectively. This is not news. Almost everybody in philanthropy now says so, but if change is to happen where it is needed, funders have to start acting on this maxim, too. It’s no accident that one of the webinar participants’ principal recommendations is donor education. Even if you’re paying the piper, there are times when others should call the tune.

 

Andrew Milner is a freelance writer, researcher and editor specializing in the areas of philanthropy and civil society. He is a consultant to Philanthropy for Social Justice and Peace. He is also a regular contributor to, and associate editor of, Alliance magazine.