Top 5 Learnings from CLP. Blog Written By: Mat Pritchard, CLP Team Leader
1. The Sustainable Livelihoods Approach Works! (But…)
All the evidence collected by CLP shows that its Sustainable Livelihoods approach has worked. Graduation data – tracking a range of vulnerability metrics to illustrate the movement out of extreme poverty – shows that 90% of CLP’s participants have ‘graduated’; i.e. they are on the right journey out of extreme poverty.
Looking at individual metrics such as income, expenditure and ownership of productive assets also shows that CLP has succeeded:
[Charts: average income per month (Tk); average expenditure per month (Tk); average total value of productive assets (Tk)]
There is a wealth of other data that shows similar improvements in other areas: access to improved water; reduced vulnerability to floods; ability to eat three meals a day and have a relatively balanced diet; and so on.
The ‘But…’ comes in when one looks at the range of results rather than the averages. On the productive assets indicator, for example, we know that there are some ‘soar away’ households that are doing brilliantly. So well, in fact, that they’re dragging the overall average values upwards. In October 2014, for example, 28% of households from Cohort 2.2 (an early group that received assistance from 2010 to 2012) had asset values of Tk. 30,000 or less. However, over 46% of Cohort 2.2 had values of Tk. 70,000 or above. We investigated some of the causes of success and challenges in productive assets: click here for the report.
As for the graduation figure: while 90% might be considered ‘good’, it isn’t 100%. We know from research that some households do slip economically; some go right back down to the kind of level they were at when they joined CLP (i.e. back to being ‘extreme-poor’).
The question is: what can or should be done about it? Should we just accept that some households might benefit more than others; and that some households will slip back? If not, what’s the answer?
This is a difficult question with many possible responses, but my learning over three years at CLP is that there could be a partial answer: using a markets-based approach (such as M4P – Making Markets Work for the Poor) as well as the sustainable livelihoods approach.
This isn’t a magic bullet – regrettably, in development work there are almost never any magic bullets. However, CLP’s markets-related projects have seen steady improvements in productivity and profits from livestock in the meat and milk sectors.
While much of this learning was available to ‘normal’ CLP participants through our core assistance capacity-building, the markets components themselves were limited in geographic scope and scale. Just under 5,700 producers took part, of which 34% were previously CLP participants.
There’s no problem piloting approaches, of course – in fact, it’s a very useful way to ‘fail quick and learn quicker.’ However, it’s fairly clear that the ability to raise households from extreme poverty exists, but keeping them moving out of poverty – and giving them strategies to recover if they do fall back into poverty – requires additional approaches.
And this is where the markets components come in. CLP’s market projects focused on facilitating market linkages, encouraging market players to come together and recognise the potential for business, and also put in place institutions (Char Business Centres) to help ‘anchor’ these linkages at the local level.
These approaches could easily have been rolled out into the ‘core’ package of CLP assistance and delivered additional benefits, helping to give small-scale agricultural entrepreneurs additional help and pathways to a better living. Taking a small example, the network of Chars Inputs Dealers that the markets components facilitated could have relatively easily been replicated in all CLP ‘core’ areas. Not only would this have made accessing inputs such as concentrate feed, fodder and medicines easier, it could have created additional livelihoods opportunities.
This would, of course, have come at a cost. And it’s too simplistic to say that two approaches like M4P and Sustainable Livelihoods can just be merged without serious consideration and thoughtful design approaches.
Nevertheless, CLP’s overlapping livelihoods and markets results seem to indicate the potential for real returns to be generated from this combination.
2. Comprehensive Support Is Best… Or Is It?
There’s no doubt that CLP’s support has been comprehensive. The Programme had 18 separate assistance sub-projects (actually 19, before one project was integrated into two others) covering a variety of topics from livelihoods to infrastructure and social development. These are separate from other components such as monitoring and evaluation (more on that topic later!), Partnerships, emergency assistance and so on.
So in the space of a month, a CLP participant might learn about topics such as: diseases of poultry; the ‘right’ amount of concentrate feed to give to a lactating cow; the law relating to dowry and child marriage; how best to feed a six-month-old child; and the importance of washing hands before and after everyday activities. Along the way they’ll also hear about how to cultivate red amaranth (a very nutritious leafy green – although it’s red); how to plan for and respond to a flood; and why regular cattle vaccinations are a good thing.
How do we know that this comprehensive support delivers results? The research we’ve conducted shows that many of the areas in which we’ve delivered training and other support have improved, or participants have reported improvements. For example, this study on disaster preparedness shows that CLP participants increased their ability to manage disasters, while our headline outcomes report shows a number of areas where participants are doing better after CLP support.
Much of this assistance is mutually supportive. Plinths, for example, help people stay above the flood line during the rainy season, when previously they might have had to move or simply live with water in their house for weeks. However, the plinth also houses the latrine, the improved water supply and the income-generating cattle, so it addresses more than just vulnerability to disasters.
While the training on livestock management helped producers protect their cattle and increase productivity, an overall effect of the training and social development capacity-building in general was to increase the confidence and, most likely, the overall empowerment of women. This early CLP summary reports that a husband was overheard saying that CLP has helped participating women ‘become clever’ – showing that this training does have an impact on women’s standing in the household and community.
So while CLP believes that the comprehensive support has been important, and has plenty of data to support many aspects of it, the Programme has never researched exactly how this overlapping support works.
If the budget had been cut by 25%, what elements would the Programme team have cut or modified first? Hard choices would have been needed. Would some of the ‘supporting projects’, like poultry rearing and homestead gardening, have survived? We know they were useful – they contributed to additional income and increased food security. But were they returning enough benefit?
The social development curriculum was comprehensive – 47 modules! – but could it have been cut down? All the elements were carefully thought through and thoroughly justifiable – but what were the absolutely essential bits if the team had needed to cut it by 25%?
Could CLP have achieved very similar results – say 95% of the same impact – while spending only 75% of the budget? This question is hard to answer because these internal ‘returns on investment’, and the extent to which the elements of the package support each other, are not known.
What impact would a more comprehensive approach to markets using the M4P approach have had? Would the additional funding required have generated a good return in terms of additional income and reductions in vulnerability, over and above the livelihoods approach?
All of these questions are hard to answer definitively; and it must be recognised that trying to investigate this kind of complexity would, in itself, have been a difficult and no doubt expensive proposition. Taking a hard-headed approach, would the research have paid for itself in terms of ability to be translated into management action, savings and possibly increased returns for participants?
Nothing in the Programme was there without having been extensively debated and justified. Because CLP was a ‘mature’ Programme – implemented for 12 years in two phases – it had plenty of time to test and modify approaches. So it’s very possible that changing the ingredients of CLP could have led to less impact, rather than more: 95% of the impact with 75% of the budget would have been a great result; but 50% of the impact for 75% of the budget… not so much.
3. Two Years of Support Is Good For Many Things… But Not All
CLP’s support was programmed in two-year chunks. At the start of the two years, a cohort of participants was identified and verified, and contracts were then drawn up with our Implementing Organisations (see below) to provide the ‘core package’ of assistance. This process took between three and four months in total, with the result that the technical assistance package lasted, on average, between 18 and 21 months.
For any given level of budget, any programme will have to make choices between depth (level of support), breadth (the amount of different kinds of support) and timescale over which support is provided.
As shown above, CLP opted to provide relatively generous support across a wide variety of activities, but only for 18-21 months.
The numbers above and our graduation figures indicate that this level and timescale of support is enough to get households on the right track out of poverty; it possibly gets them over a ‘poverty tipping point’ so they have a greater chance of leaving extreme poverty behind.
However, CLP recognised during its implementation that some aspects of assistance required more time. This may partly explain why only 19% of households met CLP’s (self-imposed) target of having Tk. 3,000 in savings. It also appeared to hold true for assets, where less than 60% of households had productive assets of more than Tk. 30,000.
But there were other areas where more time may have been an important factor. Under CLP’s Direct Nutrition Project, for example, figures showed that, while some nutrition indicators got better (women’s BMI; rate of stunting), others didn’t (haemoglobin status of women, rate of wasting). It’s also possible that some indicators in CLP’s markets projects would have shown better results with additional time, given that even the longest component had only just over three years of implementation.
Institutionalising social development changes, such as efforts to eradicate dowry and early marriage; as well as projects such as the Khas Land project (helping families apply for government-owned land that can be distributed to poor families) were also areas identified as possibly needing more time for deeper impacts.
So while CLP’s assistance was definitely impactful, future extreme poverty programming could certainly examine the link between period of assistance, impact and cost in more depth.
For more information on various social protection issues, please see this Lessons Learnt Brief on CLP’s contribution to social protection debates.
4. The Importance of How You Know What You (Think You) Know
It may sound trite or a ‘motherhood statement’, but projects and programmes need good monitoring and evaluation (M&E) systems and resources. Proper M&E is pretty much the only way that project implementers (and evaluators) know what is happening, what is working, what could work better, and the direction the project should take in future. So it’s surprising, in my experience, that many projects don’t have good M&E systems.
CLP, thankfully, did have a good M&E system. We still learned many useful lessons, such as:
- to avoid conflicts of interest and conscious or unconscious manipulation of data, collect evaluation data using non-project enumerators; i.e. do not have frontline service delivery staff collecting data which will be used to assess either their performance or the performance of the activities to which they contribute;
- accept that regular operational monitoring data may suffer from various flaws (biases, mis-remembering, mis-reporting etc) which, while OK from an operational perspective (approximately right is better than exactly wrong), is not good enough to judge outcomes and impact;
- run various kinds of verification activities, such as checks on outputs, customer satisfaction surveys and spot-checks to reduce various problems, from data collection issues to misappropriation or fraud;
- it is probably better to thoroughly analyse a more limited data set than it is to do ‘shallow’ analyses of much-larger datasets, so make sure you have an appropriate amount of analytical capacity, rather than solely emphasising data collection;
- what’s ‘nice to know’ shouldn’t supersede or displace what is ‘critical to know.’
I could go on because M&E is a topic close to my heart, but will finish with this cautionary tale. During a meeting, one of my markets staff delightedly told me that average milk production had more than doubled over the baseline – from 1.3 to 2.75 litres per cow. I was duly impressed and may even have repeated this figure.
However, when the real outcomes study was published, it showed that this optimistic operational stat simply wasn’t supported by independent data collection and analysis. While there had been some impressive gains, they weren’t quite as stellar as this colleague had reported: closer to a 25% increase than a 100% increase.
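The gap between the two claims is easy to make concrete with a quick back-of-the-envelope check. The sketch below is purely illustrative, not CLP’s actual M&E tooling; the 1.63-litre independent figure is an assumed value chosen only to represent a roughly 25% gain over the 1.30-litre baseline.

```python
# Hypothetical sketch: comparing an operational monitoring claim against
# independently collected outcome data. Only the 1.30 baseline and the 2.75
# operational claim come from the text; the independent figure is assumed.

def pct_change(baseline: float, current: float) -> float:
    """Percentage change from baseline to current."""
    return (current - baseline) / baseline * 100

baseline = 1.30      # litres per cow per day at baseline
operational = 2.75   # figure quoted from operational monitoring
independent = 1.63   # assumed independent-survey mean, ~25% above baseline

print(f"Operational claim: +{pct_change(baseline, operational):.0f}%")  # +112%
print(f"Independent data:  +{pct_change(baseline, independent):.0f}%")  # +25%
```

Quoting 2.75 litres as the average implies a rise of over 110% – more than double the baseline – while the independent dataset supported something closer to a quarter. Building this kind of sanity check into routine reporting makes optimistic sub-sample figures much easier to catch before they are repeated.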
There could be all sorts of reasons why this colleague had such a rosy view. Maybe the data they had been reviewing wasn’t entirely statistically representative; maybe they had neglected to mention that it wasn’t actually the average, overall data, but from a particular sub-set of a particular period of data collection. And maybe, because they were reporting on work that they were passionate about and that they really WANTED to be amazingly positive, they misinterpreted or succumbed to one of the many unconscious biases that can skew people’s views.
This happened relatively recently, after many iterations of data collection and reminders to the team that the independent market outcomes data was THE dataset to quote. It is proof positive that cool, calm, well-collected and well-analysed data is the only answer, and the only real way to be (fairly) sure that you DO know what you think you know.
5. Local Implementing Partners
CLP was implemented on the ground through a network of local Implementing Organisations or IMOs (because no-one ever passes up the chance to use an acronym!).
This mode of working was very successful. The local NGOs often had deep roots in the communities in and around the chars, meaning that they were able to operate effectively very quickly, and they brought local ‘prestige’ and authority thanks to their recognised community-based work. Their experience of implementing donor-financed programmes also meant they were familiar with the general principles of operations in a wide variety of areas, both technical (e.g. livelihoods, community development, infrastructure) and managerial (procedures, financial systems, ‘zero tolerance to corruption’ policies, and so on).
One lesson that CLP learned early on was that, partly due to its procurement and contracting approach, it was treating its IMOs much more like hands-off contractors than as fully-fledged partners that could bring useful learnings. While the contracting approach had benefits, nevertheless it was recognised that a more open and communicative partnership was required. CLP made changes to its operating structure and procedures, for example instituting quarterly feedback and learning meetings, to facilitate this way of working. Liaison between District offices and IMOs was strengthened and deepened and became a very useful way of maintaining oversight as well as encouraging constructive and learning-based interactions.
The contract-based relationship was a useful framework to ensure clarity. Outputs, outcomes and expected standards and procedures were clearly laid out in these contracts, and duly enforced (as documented in this Lessons Learned brief on Corporate Culture). As an example, the ‘zero tolerance to corruption’ approach was fully documented and expectations set in the contracts. Through good monitoring systems and verification processes, any examples of potential corruption were fully investigated. Where it was proven that some kind of fraud had taken place, the contractual requirement to respond was absolutely clear.
Part of the approach, however, was to ensure that IMOs had the standard of skills required to operate satisfactorily. Defining the standards is one thing; ensuring that they can be reached is another. So the IMO finance staff, for example, were extensively trained when they joined and provided with refresher training. Standardised accounting software was installed in all IMOs and staff trained in its use to ensure consistent and high-quality financial reporting. Finance and procurement manuals were provided to ensure that all procedures were clear.
The 17 IMOs and two Special Service Providers that CLP has worked with have been an indispensable part of the CLP team. Without them, it is no exaggeration to say that CLP may not have been possible: huge thanks to all of them!
Note: average values from all Cohorts, October 2015 Annual Survey data.