From analysis to action: operationalising learning and adaptation in Savings at the Frontier

A Savings at the Frontier blog post

Despite the 'emerging international consensus' that adaptive programming approaches are effective in addressing complex development challenges, a recent joint report by two DFID-funded programs (LASER and SAVI) notes that there remains a "recognised lack of practical case study material" on adaptive methods.

That is partly due to the difficulty of balancing competing needs for flexibility and accountability. Measuring performance based on adherence to an adaptive implementation process swims against the conventional tide of input- or even output-based accountability frameworks, and must be reconciled with the equally prominent results agenda.

Where does Savings at the Frontier stand?

Through the Savings at the Frontier (SatF) program, we seek to harness a shared commitment with our partner, The MasterCard Foundation, and our other key stakeholders to learn and adapt the processes that would best link informal savings mechanisms (ISMs) with formal financial institutions. In each of our countries of focus (Ghana, Tanzania and Zambia), financial inclusion poses a sufficiently complex set of challenges that learning and adaptation are more appropriate paradigms than results alone.

That doesn't mean we aren't committed to delivering results (see this 2015 FSD Africa Consultation Document for how Impact Oriented Measurement actually means refocusing the lens away from results per se); it just means our path to results is problem-driven, rather than solution-driven. Our model is based on an iterative research process, underpinned by a strong investment in monitoring, research, evaluation, and learning (MREL): a set of small-scale business models will be tested and thoroughly analysed, the learning captured, and a new set of models designed and tested in turn. This learning process will in part be tracked against our ongoing effort to answer (and re-answer) a set of four 'learning questions' (LQs), including, for example, 'How do ISM client segments differ from each other?' and 'How do the financially excluded/underserved respond to linkage opportunities?' As our evidence base builds throughout the program, our responses to these LQs will delve deeper, capturing our progression in understanding over time.

SatF training for demand side research, Kumasi, Ghana. 

Making learning processes explicit

Perhaps this iterative analytical process will inevitably result in both learning and adaptation, but the SatF project team is cautious about making any such assumptions. According to Sukhwinder Arora, the SatF team leader, analysis is relatively safe territory; the harder task is to move from analysis to action. SatF isn't there just to provide new insights. We actually need to demonstrate that our learning works in practice: that it's both useful and used.

But how can you tell if your learning is resulting in action? Robert Stone, the SatF project director, argues that it's about making our learning processes explicit. We could develop a great monitoring framework, and draw well-informed conclusions based on our evidence, but we need to go further. We need to demonstrate the ongoing application of those discoveries: not only how our answers to the LQs are changing, or deepening, over time, but how this is influencing our programming.

SatF's aim is to go beyond monitoring and evaluating delivery; we plan to monitor our own decision processes. Are we acting on our information? Can we show we're adapting based on evidence? By recording that process (the process of how we draw on our monitoring information to inform our decisions), we go from M&E to MREL; from analysis to action. For example, one monitoring tool we're using, called the 'LQ tracker', maps our emerging understanding of the LQs against two decision points: first, where we target our next research activities, and second, what new interventions we select. In other words, our monitoring tools force us to demonstrate causal links between our new insights (analysis) and our program decisions (action).

Redesigning the management process

Our commitment to demonstrating adaptation is even changing the way we conduct our team meetings. Saltanat Rasulova, the MREL manager for SatF, has started an initiative to redesign our project management process, to turn even SatF's regular project meetings into platforms for rehearsing and recording an adaptive learning process. In this process, we're identifying some important adaptations, for example from our first country assessment in Ghana to the subsequent assessments in Tanzania and Zambia, where we moved from simultaneous to consecutive supply and demand side team visits, having recognised the opportunity for one team (whether supply or demand) to build on learning from the other.

This is a course correction we'd have made intuitively, but in SatF's case, we're taking steps to identify, justify and record any such changes, to train ourselves to make our adaptive thought processes explicit and organised.

And we'll also try to drive this same behavior in the partners we work with: more than any other performance measurement, we're interested in assessing how our partners and ourselves are acting upon the learning we generate. We'll look for feedback not just on the success of test models, but on whether our monitoring and research processes are influencing the development of better models over time.

It will be a complicated and reflexive process, and it will inevitably get derailed at times. Sometimes acting on new information is a challenge: you have to convince others of the need to change course; your partners, or you yourself, might be locked into decisions that have already been taken; or the space for adaptation might close up. Nonetheless, by ingraining and documenting our adaptive learning process in an organised way, we also hope to contribute to the LASER/SAVI report's call for examples of adaptive programming in practice, or at least tell a story of how challenging it really is!

Yusef Salehi is the organisational learning lead for OPM, focusing on problem-driven iterative approaches to organisational development. His background is in financial tax law and policy, and he has worked in project and team management roles across the development sector for the past four years.

This article originally appeared on the Seep Network website.