The Economic Impacts of COVID-19 Will Force Us To Do More, Better, Faster (Again!) Luckily, We’ve Become Much Better at Performance Monitoring

COVID-19 and its economic impacts will challenge international cooperation, development, and humanitarian action to demonstrate value with an unprecedented level of precision. The World Bank predicts that the global economy will shrink by 5.2% in 2020, with a continued decline of between 3.9% and 4.2% in 2021. That constitutes the deepest recession since World War II.[1] Not since the 2007/2008 financial crisis have the implications been so stark. Governments will tighten budgets while demanding that we do more, better, faster, and more cheaply.

Luckily, we’ve learned a lot since 2008. We have been developing performance monitoring systems that can get into the nitty-gritty of results, even in the most complicated operating environments. This has been built on the convergence of cheap mobile telephony, the proliferation of on-line data aggregators and dashboards, and the grudging acceptance that, actually, concepts around competitive advantage and core competencies, around economies of scale and cost ratios, even around return on investment and value, are perfectly and helpfully applicable to our work.

We have moved steadily toward a recognition that we work in the most complicated operating contexts in the world and that these demand exceptionally fine-tuned real-time analytics to address the problems that inevitably arise and to do so quickly and effectively. We’ve matured and a more sober assessment of performance is a big part of what we’ve learned.

Figure 1: Screenshot of the UK performance monitoring platform showing results of field monitoring of health and nutrition facilities across Somalia, indicating those with adequate and inadequate water and sanitation facilities.

The most advanced performance monitoring system in the world is in Somalia. The UK Government has created a system that has all the technological bells and whistles but that offers, more importantly, a turbocharged way to increase performance (www.mesh-somalia.net). The UK made this investment out of necessity. During the 2011/2012 Somalia famine, an atrocious failure of food systems and of the international community, the UK Government lost approximately £2 million outright: gone, along with the "briefcase" NGOs that were funded. Her Majesty's Government was never going to allow that to happen again.

Figure 2: Screenshot of UK performance monitoring platform showing cumulative results from partner data.

So, the UK invested heavily in a leading-edge system—MESH. In the last four years, MESH has conducted over 150,000 surveys of direct beneficiaries through a dedicated call centre, carried out over 12,000 field site visits assessing everything from child protection to agricultural livelihoods to resilience and urban integration, and produced over 40 briefs and evaluations offering real-time insights into the most pressing performance issues. MESH collects and verifies all UK-supported partner micro-level data and leads quarterly performance reviews. All of this is organized and displayed in a consolidated dashboard (screenshots are included in this brief) that allows for real-time performance assessment.

All of this allows people to pinpoint problems early and then adjust, adapt, improve, and reach better results overall. It is a key driver for doing more, better, and faster.

How did they do this? What are the basic systems and lessons that could be applied as more and more people scamper around to build up performance management systems?

Figure 3: Process for developing a performance monitoring system

Analyse, with cold, sober thinking, the causal pathways associated with results frameworks and theories of change. Too often, performance professionals take the results framework and theory of change designed at the inception of a programme as the be-all and end-all of performance. They fail to analyse the precise actions/dependencies/constraints/opportunities/risks associated with activities, let alone the assumptions and gaps (known unknowns) about how a programme expects to convert inputs into outputs and how those may contribute towards outcomes and expected impact. Effective monitoring pinpoints the "crunch points" where the whole theory can come tumbling down because of poor delivery and then beefs up its monitoring in those areas.

For instance, in health and nutrition programming, we focus on supplies and staff. That's it. Because we know that, however broad the programme might be, however much it is trying to influence the supply of and demand for critical health services, if the right medical supplies are not there at the right time or if a facility is understaffed, then nothing is going to work.
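To make the "crunch point" idea concrete, a minimal sketch of such a check might look like the following. The facility records, field names, and thresholds are invented for illustration and are not MESH's actual schema.

```python
# Hypothetical crunch-point check: flag facilities where the theory of change
# is most likely to break down -- missing supplies or inadequate staffing.

facilities = [
    {"name": "Facility A", "supplies_in_stock": 0.95, "staff_on_post": 6, "staff_required": 5},
    {"name": "Facility B", "supplies_in_stock": 0.60, "staff_on_post": 2, "staff_required": 5},
    {"name": "Facility C", "supplies_in_stock": 0.90, "staff_on_post": 3, "staff_required": 4},
]

SUPPLY_THRESHOLD = 0.80  # minimum share of essential supplies that must be in stock

def crunch_point_flags(facility):
    """Return the performance-critical problems found at a single facility."""
    flags = []
    if facility["supplies_in_stock"] < SUPPLY_THRESHOLD:
        flags.append("supply stock-out risk")
    if facility["staff_on_post"] < facility["staff_required"]:
        flags.append("understaffed")
    return flags

for facility in facilities:
    problems = crunch_point_flags(facility)
    status = ", ".join(problems) if problems else "no crunch-point issues"
    print(f"{facility['name']}: {status}")
```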

Getting to this level of precision about what can impede performance requires a lot of clear-eyed thinking about results frameworks and theories of change. People often get it wrong. For instance, billions of dollars were invested across Lebanon, Turkey, Jordan, and Iraq to get children into school and avoid a "Lost Generation" from the Syria crisis. Unfortunately, the metric used to measure performance was attendance on the first day of the school year. It ignored the tremendous social and economic pressures that prevented families from keeping their kids in school. Most kids left school for good after a few weeks. Tragic. It was also a failure to understand the results framework and what factors relate to performance.
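A worked example makes the difference between the two metrics plain. The sketch below uses invented enrolment figures to contrast the day-one attendance rate with an end-of-year retention rate.

```python
# Illustrative comparison of two metrics for the same programme.
# Numbers are invented: children attending at successive check-ins.

attendance_by_month = {
    "September": 1000,  # first day of the school year -- the metric actually used
    "October": 620,
    "December": 410,
    "June": 250,        # end of the school year
}

enrolled = attendance_by_month["September"]

# Day-one metric: looks like 100% success.
day_one_rate = attendance_by_month["September"] / enrolled
print(f"Day-one attendance: {day_one_rate:.0%}")

# Retention metric: what actually matters for avoiding a "lost generation".
retention_rate = attendance_by_month["June"] / enrolled
print(f"Retained to end of year: {retention_rate:.0%}")
```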

Develop an analytical framework. Once the results framework and theory of change have been thoroughly interrogated, performance monitoring needs a framework for how each element critical to performance will be monitored. This should specify cohorts, data sources, data collection tools, analytics, and anything else critical to ensuring that these aspects of a programme are performing well: statistically valid samples, the mix of qualitative and quantitative data, the optimum frequency and timing of performance-related data collection, and so on. It is the "road map" for how each and every monitoring activity will be conducted, while ensuring valid links to the assumptions and issues identified during the causal pathway analysis of results frameworks and theories of change. The more time put into this, the better the monitoring results will be.
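As an illustration only, one fragment of such a framework could be captured as structured data that a monitoring team can review, automate against, and keep tied to the causal pathway analysis. The elements, data sources, and frequencies below are hypothetical.

```python
# Hypothetical fragment of an analytical framework: one entry per element of the
# theory of change that is critical for performance.

analytical_framework = [
    {
        "element": "Essential medical supplies available at health facilities",
        "cohort": "All supported health and nutrition facilities",
        "data_source": "Field site visits (stock checklist)",
        "collection_tool": "Structured observation form",
        "frequency": "Monthly",
        "sample": "Census of supported facilities",
        "analysis": "Share of facilities below the supply threshold, by region",
        "linked_assumption": "Supplies reach facilities on time despite access constraints",
    },
    {
        "element": "Beneficiaries actually receive cash transfers",
        "cohort": "Registered cash-transfer recipients",
        "data_source": "Call-centre survey of direct beneficiaries",
        "collection_tool": "Short phone questionnaire",
        "frequency": "Quarterly",
        "sample": "Statistically valid random sample per partner",
        "analysis": "Receipt rate and average delay, by partner",
        "linked_assumption": "Partner distribution lists match people actually served",
    },
]

# A quick overview of what gets monitored, how, and how often.
for entry in analytical_framework:
    print(f"{entry['element']}: {entry['frequency']} via {entry['data_source']}")
```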

Avoid “garbage-in-garbage-out” data. It is very important that monitoring surveys focus on the precise issues determined in the analytical framework: the issues that have a tangible and important relationship to results. Too often, surveys get swamped with everyone’s precious indicators and grow into monsters where straightforward analysis of performance becomes impossible because everything must be reported on. Keep the surveys focused, and the ensuing data and analysis will be focused.
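In practice, this means every survey question maps back to an element of the analytical framework, and anything that does not is kept out. A minimal sketch, with hypothetical questions and framework elements:

```python
# A deliberately focused survey: every question maps back to an element of the
# analytical framework, and nothing else is allowed in.

focused_survey = [
    {"id": "Q1", "question": "Were essential medicines in stock today?",
     "framework_element": "Essential medical supplies available"},
    {"id": "Q2", "question": "How many clinical staff were on post today?",
     "framework_element": "Facility adequately staffed"},
    {"id": "Q3", "question": "Is safe water available at the facility?",
     "framework_element": "Adequate water and sanitation"},
]

framework_elements = {
    "Essential medical supplies available",
    "Facility adequately staffed",
    "Adequate water and sanitation",
}

def validate_survey(survey, elements):
    """Reject questions that do not map to a performance-critical element."""
    orphans = [q["id"] for q in survey if q["framework_element"] not in elements]
    if orphans:
        raise ValueError(f"Questions not tied to the framework: {orphans}")
    return True

validate_survey(focused_survey, framework_elements)
print(f"{len(focused_survey)} questions, all tied to performance-critical elements.")
```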

Display the data and analysis in concise, performance-oriented graphics and reports. There is a lot of focus on on-line dashboards, from Tableau to Palantir, from Premise to Ona. Of course, these on-line dashboards are cool—we can access them from our phones! We can put pretty pictures of them in reports, even as above!

But this misses the point. Most people running programmes and projects, especially in complicated operating environments, don’t have time to comb through massive data sets and indicators, no matter how impressively arrayed in on-line dashboards. They need to know what’s working, what isn’t, and how to address problems quickly. That’s why, as described above, it is so important to have a good analytical framework that leads to good data collection forms, which in turn lead to concise visualizations showing the good, the bad, and the ugly. The graphic of health centre facilities above is a good example: any casual observer can see which facilities have problems and which don’t.
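As a rough illustration, the good, the bad, and the ugly can even be conveyed without a dashboard at all. The sketch below, using invented facility data in the spirit of Figure 1, reduces monitoring results to one line per facility with the problem cases listed first.

```python
# Minimal text "visualization": one line per facility, problems first,
# so a busy programme manager can see at a glance where to act.

results = [
    {"facility": "Facility 1", "water_sanitation_adequate": False, "supplies_adequate": True},
    {"facility": "Facility 2", "water_sanitation_adequate": True,  "supplies_adequate": True},
    {"facility": "Facility 3", "water_sanitation_adequate": False, "supplies_adequate": False},
]

def summary_line(r):
    """Build a one-line summary naming only the problems found."""
    problems = []
    if not r["water_sanitation_adequate"]:
        problems.append("water/sanitation inadequate")
    if not r["supplies_adequate"]:
        problems.append("supplies inadequate")
    marker = "!!" if problems else "ok"
    return f"[{marker}] {r['facility']}: {', '.join(problems) or 'no issues found'}"

# Sort so the facilities with the most problems appear first.
for r in sorted(results, key=lambda r: r["water_sanitation_adequate"] + r["supplies_adequate"]):
    print(summary_line(r))
```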

Follow up with actors to see how they interpret the results and what actions should be taken. This step is important, yet it is often either ignored, with some leaning back in admiration of their cool analysis and graphs, or reduced to ‘learning,’ as if learning by itself were the performance outcome that matters.

All of this data is instead meant as a powerful tool for enacting change. It should act as an “alarm bell” warning of problems that, if not addressed, could become a fire that burns the whole enterprise down. The fire alarm is a good analogy: the monitoring data is the alarm bell. You then need to establish the extent of the fire, how to put it out, and how to prevent future fires. Concise monitoring data visualizations provide the focus to do this.

Use all of this to identify issues that require more in-depth analysis, and use that analysis to improve programming and operations, not to produce academic tomes. Of course, learning is important. Yet it needs to be based on the issues that impact performance. Too often these analyses drift toward academic tomes, somehow trying to fit into academia, rather than being positioned as additional performance tools: ways to dig into thorny issues in more depth while arriving at practical actions to remedy them. As such, they should typically be linked to workshops or other forums where people can discuss the implications, what still may need to be analysed, and, just as with basic monitoring follow-up, what actions should be taken to improve performance. Of course, all of this is about learning—learning how to do better work.





[1] “COVID-19 to Plunge Global Economy into Worst Recession since World War II.” The World Bank, 8 June 2020.
