How much data and monitoring of water services is too much? How can we ensure that data collection and monitoring efforts bring about improved results and don’t end up absorbing resources that could otherwise be used to improve services? This week’s story explores these questions. It was originally published by Engineering for Change. To read the original article, click here.
By Susan Davis
“A little less conversation, a little more action, please” – “A Little Less Conversation,” Elvis Presley
In recent years, governments and development partners have greatly increased their focus on monitoring water services, producing an array of associated tools, indicator frameworks, and platforms. But we may be spoiling a good thing with too much attention. (As I was working on this article, Stef Smits of IRC published one with a very similar theme, “You cannot manage what you do not measure; but should you measure what you cannot manage?” His focus is on governments. I’m focused on development partners and their funders here.)
More and more monitoring
A few questions arose (again) during recent WASH monitoring webinars and events.
[Image caption: An orientation of RapidWASH for monitoring impact in rural WASH service delivery at WASH Futures]
The questioners were getting at a problem that is arising in global development projects. A large part of the issue is summed up in this analysis of the costs of monitoring, from the white paper Effective Philanthropy by the non-profit organization Giving Evidence:
“Monitoring and evaluating, and reporting to donors is a significant cost: in the UK, about £1bn annually, i.e., 2.7% of charitable sector income, and about 2% in the US. There is evidence that much of it is too low quality to guide good decisions, and that neither charities nor funders find it terribly helpful. So either the quality should be improved, or this work should cease: that £1bn – and its equivalents in other countries – could fund a lot of trachoma operations or diarrhoea prevention. Furthermore, the purpose of this tracking is to influence behaviour, and it is known that some ways of presenting and receiving information are better at achieving this than others. It is not clear that charities’ tracking research is being best used to that end.”
Who is the monitoring data for?
Often development partners collect data to meet donor requirements. Perhaps because we don’t really know what works over time, we try to make up for it by collecting as much data as we can, snapshot style, in hopes that the answer will magically appear. Last week, I started auditing Acumen’s “Lean Data Approaches to Measure Social Impact.” In the articles on the organization’s site, acumenideas.com, Tom Adams and Jer Thorp share an anecdote about a farmer who had been asked the same questions – often personal – by different people with clipboards several times over the past few years. He never saw any of the reports or conference presentations that were generated, and nothing was done about any of the problems he might have described.
“Put yourself in that farmer’s shoes. If someone turned up to your place of work asking for several hours of your time to answer a string of questions, many of which were personal, how would you feel? … would you feel valued?”
I’m guilty of this. I’ve held that clipboard and asked those personal questions. “Do you use your toilet?” “Do you wash your hands after?” I’ve created tables and charts and reports with that information. But I don’t think I’ve ever tried that hard to make sure the results got back to those people I interviewed, much less make sure that something constructive is done based on the data.
What will be done with monitoring data?
There seem to be far fewer webinars on how the data are being used to improve services; meanwhile, a lot of time, innovation, and funding is going into apps, gadgets, and platforms to monitor water services, and not as much into supporting the systems that fix the problems that are found. Improve International and partners looked into this issue, which we called resolution, a few years ago. The main feedback we got from development organizations was that they just couldn’t get funding for that, and/or that it wasn’t their responsibility. This debate is addressed by the Agenda for Change, which shifts attention from building infrastructure to supporting the systems that keep services going.
A few months ago, I was visiting rural health facilities where, in most cases, WASH conditions had deteriorated after previous interventions. I was thrilled when the implementing organization came back to me with a list of their planned actions for current and future programs based on the findings and recommendations. This is, unfortunately, rare in my experience. Many organizations do not have the funds, time, or mandate to actually use the data they are collecting.
From measuring to managing: the Lean Data approach
Acumen asks its clients to predict results before conducting surveys. In one example, a social enterprise thought that 40 percent of its customers were earning less than US $5 per day, but surveys showed that only 20 percent were. What are we assuming in WASH?
One common WASH assumption is that people will, of course, use the improved water source provided by an intervention in lieu of other sources. In a post-implementation evaluation that I led in Zimbabwe, the assumption was that people would be happily using the new or rehabilitated water sources provided by the development partners. What we found was that people were using several sources – including hand-dug wells, harvested rainwater, boreholes, bottled water, and the piped water in their homes – depending on what was available that day (none of the sources was reliable 24/7) and most convenient. Their perceptions of which of those water sources was safest for drinking varied as well. What was done in response to this information? Honestly, I don’t know.
Another long-standing assumption is that protected groundwater sources are generally safe. The NGO Water Missions found that over half of the groundwater samples taken from drilled “protected” boreholes tested positive for fecal contamination. Their response to these data? Making sure all water sources are treated before consumption.
Less data, more action
“It’s all too easy to forget that data is about human beings and their behaviors,” Adams and Thorp write on Acumen’s site. “Who has ever jumped out of bed, punched the sky and cried out ‘Hooray, today I’m being monitored!’ No one.”
Water use is tied up in complex ways with people’s mindsets, body strength, wallets, jobs, hobbies, noses, and taste buds. A Lean Data principle is to collect only data you can or will act on. More data doesn’t equal more impact. Lean Data surveys are often conducted by phone or SMS and consist of five to seven questions. Mobile tools and technology can be part of this, but they should not drive the process. This one general question could tell you a lot about the effectiveness and impact of your WASH program: “Which changes have been most important to you?”
Just a reminder to development partners and donors: Don’t forget why you’re monitoring. Make sure funds, time, skills, and commitment (your organization’s or others’) are in place for acting on those data. This can take the form of resolving problems directly, strengthening local organizations to address challenges or build on strengths, or adapting future programming or funding principles, among other activities. On the bright side, if the water services meet standards and everything is awesome, share that information broadly – along with how you got those results – so that others can replicate them. If you’re not going to act on the data, and no one else can act on the data, why are you collecting them?
Susan Davis is founder and Executive Director of Improve International.