What does it take to be a successful third party monitor?

How to increase the speed and usefulness of programme evaluations – we look at the ‘dos’ and ‘don’ts’ of third party monitoring.

As purse strings tighten and scrutiny grows, donors are under mounting pressure to demonstrate impact and value for money. To assess the impact of the programmes they fund, donors have traditionally turned to quasi-experimental impact evaluations. However, such evaluations have important limitations. First, by the time the results are known, the programme is finished, leaving no scope for course correction. Second, confounding contextual factors make it difficult to attribute impact to the programme itself. Finally, these evaluations are expensive.

Donors such as FCDO are increasingly engaging third party monitors (TPMs) as an alternative, or a supplement, to traditional impact evaluations. By providing ongoing monitoring and verification of programme results, TPMs can help implementers identify ways to adapt and achieve greater impact. This, in turn, gives donors greater confidence in results and visibly demonstrates their commitment to adaptive programming, building the case for future funding. Although TPM has only recently gained popularity, it is not an entirely novel concept: UNICEF’s real time assessment (RTA) approach, for example, follows very similar principles.

The role of a TPM is defined on a programme-by-programme basis: no two are exactly alike. Some independently verify results, whether for a sole-party contract or across multiple lots. Some conduct independent research about the programme. Others provide technical guidance to clients/donors and implementers on the design and set-up of monitoring and evaluation (M&E) systems. Often, a TPM serves a combination of these roles.

We are the third party monitor for the Women’s Integrated Sexual Health (WISH) programme, a major FCDO programme promoting equitable access to family planning and the sexual and reproductive health and rights of women and girls in 27 countries across sub-Saharan Africa and Asia.

Along with Itad, our partner on the programme, we’re responsible for fulfilling three objectives: verification of results, generation of additional evidence, and supporting learning. As such, the WISH TPM has both an ‘auditing’ function and a ‘technical advisory’ function. As with many other TPMs, successfully fulfilling these roles requires flexibility, nimbleness, diplomacy – and determination.

Over the last two years we have learnt some invaluable lessons about the dos and don’ts of third party monitoring. Here we summarise some practical advice on how to make it work.

  1. Do invest in relationships. Although a TPM is ultimately accountable to the client/donor, the success of its work relies largely on its relationships with the programme implementers. There is therefore a delicate balance between maintaining the TPM’s independence and developing a productive, collaborative relationship with the implementers. Implementers’ priority will always be delivering their own workplan, not the TPM’s. They are also the ones who provide access to the data and information the TPM needs, and they are the gatekeepers to implementation sites and staff. An understanding of implementers’ pressure points and a willingness to be patient and flexible are critical ingredients in building these relationships – while also ensuring that clients’/donors’ needs are fully met.
  2. Do take the time to understand implementers’ systems. An ideal setup is for the TPM to be contracted prior to, or simultaneously with, the implementers, so that the TPM can participate in system design discussions and potentially even advise on them. There should then be a period for these systems to be operationalised, after which the TPM’s role ramps up. This may be unpopular with donors/clients, who may want the TPM to verify data right from the start. However, giving the implementers time to get up and running, and involving the TPM in the process, ensures the TPM understands implementers’ systems from the outset. This obviates the need for implementers to orient the TPM to their operations – and makes the TPM’s approaches and tools much more fit-for-purpose and less likely to need major revisions over time. It also means the TPM can identify where it can add the most value, complementing a programme’s existing M&E or helping to address a gap.
  3. Don’t ignore timing. Linked to the point above, it’s important for the TPM and the donor to consider from the outset how the timing of the TPM’s and implementers’ contracts will affect delivery. Despite best efforts, it’s unlikely the TPM and the implementer(s) will have the same contract period. If the implementer’s contract ends sooner than the TPM’s, the TPM needs to consider whether, and to what extent, it can carry out its work once the programme is closing out or certain aspects are no longer operating. For example, in a service delivery programme, if the TPM is verifying implementers’ site-level data, that can only be done while those sites are still in operation. Or, if the TPM is conducting an end-of-programme review and wants to interview implementers’ staff, those staff need to still be in place and accessible when the review is underway. If, on the other hand, the implementer’s contract ends after the TPM’s, the extent to which the TPM is expected to contribute to legacy studies and final reports must be considered.
  4. Do consider separating the TPM’s ‘auditing’ function from its other functions. If a TPM has both an auditing or verification function and a technical support function, establishing a clear ‘partition’ between them can ease implementers’ concerns that findings from the evidence and learning work will be ‘fed’ to the verification side for scrutiny and reporting to donors/clients – a concern that could otherwise hinder implementers’ willingness to collaborate on evidence and learning initiatives. One way to achieve this partition is to have different organisations lead different functions. Alternatively, it can be achieved by creating clear ‘leads’, each with their own team, for each function. This doesn’t mean the TPM’s functions should never work together – indeed, it’s important that the TPM have, for instance, a shared repository for background documents and a way of tracking requests to implementers (to avoid duplication) – but making these separations clear will help to legitimise the TPM’s functions in the eyes of the implementers.
  5. Do be realistic about evidence generation. Some TPMs, including the WISH TPM, are tapped to generate new evidence to inform adaptive programming. The donor/client may expect the TPM to develop a research agenda that helps implementers improve programming. But it can be challenging to identify research that is useful to implementers, of strategic interest to clients, and achievable within the project timeframe. This is especially difficult in projects with short timeframes (three years or less), where only relatively small-scale operations research is feasible – and even that requires time for the implementers to design interventions that can be researched, for the research to be conducted, and for the findings to inform programming in practice. From the outset, it is important that all parties set realistic expectations and are fully on board with the delivery of the research agenda. To accomplish this, a TPM should agree with the client and implementers on the size, scale and number of research activities that can feasibly be accomplished within the project timeframe, and include these activities in the programme’s overall research agenda. Establishing joint accountability mechanisms for research delivery can also help to ensure the TPM’s research is not deprioritised.

Establishing and implementing a successful TPM function can be challenging but, done well, it can give donors greater confidence in their programmes’ results and be a rewarding, value-adding experience for implementers. We hope the lessons we have learned as the WISH TPM will help donors to design effective third party monitoring contracts, and help TPMs and implementers to work collaboratively to improve programming.

Author: Meghan Bishop is the Team Leader for the WISH Third Party Monitoring contract. An expert in global health, with a focus on sexual and reproductive health and rights, Meghan has over 20 years of experience helping governments and mission-driven organisations improve the quality of health systems and services through the use of evaluation techniques, monitoring and learning uptake methodologies.
