Monitoring and evaluation: Lessons from a year like no other

In this edition of NousCast Shorts we speak to Nous Director Annette Madvig about major trends in monitoring and evaluation and consider the big three questions everyone undertaking M&E needs to think about.

You can also read Annette’s article, “Monitoring and evaluation can help us act on the lessons from a year like no other”.

About NousCast Shorts

The NousCast Shorts podcast series brings you fresh thinking on some of the biggest challenges facing organisations today. Each episode of NousCast Shorts will feature a rapid-fire interview with a Nous consultant about an emerging issue in their area of expertise – in about the time it takes to have a cup of coffee. 

2021 is a year for making sense of the continuities and disjunctures of 2020 and learning what they mean for this year – and the years to come.

Annette Madvig 

Director, Nous Group

Ari Sharp: Hello, and welcome to NousCast Shorts, a podcast that brings you short and sharp insights from the team at Nous Group, an international management consultancy. I’m your host, Ari Sharp, and today on NousCast Shorts, we’re talking to Annette Madvig, a Nous Director who specialises in monitoring and evaluation.

Annette previously worked for the Australian Government Aid Program in both Australia and Timor-Leste, as well as for non-government organisations. Recently, Annette wrote an article for Nous Insights explaining how monitoring and evaluation can help us act on the lessons from a year like no other. Looking back over 2020, Annette considered how the pandemic changed our understanding of the world and how that can help organisations to rethink their approach to monitoring and evaluation. Annette Madvig, welcome to NousCast Shorts.

Annette Madvig: Ari, thank you so much for having me.

Ari Sharp: Annette, you’ve written for Nous about what the disruption of 2020 means for monitoring and evaluation. Tell us, what did you find?

Annette Madvig: My sense is that there’s a lot we can learn during this year from what happened last year, and from the ongoing changes that we’re undergoing; and monitoring and evaluation are really great tools to help us do that. When I think about last year, I think about how tough it was for so many people and so many organisations. The pandemic has obviously had devastating impacts in Australia and around the world, and many people are still experiencing those effects and still experiencing that devastation. At the same time, many exciting things happened in the flux that the pandemic created. People came together with a renewed sense of community. Hard policy decisions were made fast, and – certainly in Australia – they maintained people’s access to essential services so that people could stay healthy, keep working, stay safe at home or have housing.

And so, from what was quite a confounding year and the ongoing pandemic, I think there’s going to be a lot that we can learn – not only about what doesn’t work or what was hard in the pandemic, but about what did work, what’s changed in how we see ourselves, what’s changed in how we see our major policy challenges, and what we have learned from those new ways of doing things. Those adaptive decisions can give us clues as to how we might keep delivering great policy and programs in the years to come, in a way that really respects the fact that we’ve been through this enormous turmoil. There are both challenges and opportunities in that, rather than expecting that the world will go back to the way it was.

Ari Sharp: In the piece, you bring up three big questions that you think organisations need to be asking about monitoring and evaluation. I’m keen to run through those with you. The first question is: what do we want to learn?

Annette Madvig: There are probably three things that I think are important to learn about this year in the context of the pandemic. These are longstanding monitoring and evaluation questions, but I actually think we can look at them slightly differently because of the pandemic.

Firstly, we want to learn about context. What are the evolving circumstances that we’re operating in, and what do these changes mean for what we as an organisation or a program want to achieve? It’s that sense of understanding that the world doesn’t stand still: we need to monitor what’s happening, think about it and adapt what we’re doing.

The second is delivery. Obviously the pandemic interrupted a lot of programs last year. The kind of achievements in education or health and human services that an organisation may have expected to make last year, they may have been delayed or they may be different from what they thought they were going to be. And so there’s a question to ask: where are we at compared to where we thought we’d be and why? And also, what is the right thing to do now, given the evolving circumstances?

And finally, we want to look to outcomes. This is about keeping our eye on what is changing for the people who have the most at stake in an issue. And I think particularly with COVID-19, we know that its impacts and opportunities are differentiated: not everybody has had the same experience, and not everybody will have the same recovery. So we really need to understand what’s happening for whom and why through our program. What can we influence, and how do we make sure that we’re designing and delivering our intervention to have the best possible outcome for the people that we care about?

Ari Sharp: Moving on from what we learned, the next question comes to how we’ll learn, and from whom. What did you find there?

Annette Madvig: Thinking about working out robust methodologies is always a part of monitoring and evaluation. So we know that the people who are going to use our findings need to be confident that we’ve used a method that is delivering credible findings that they can rely on. So there’s obviously a long history of careful research design, and we want to keep on bringing what we know about good research design into our monitoring and evaluation. But I think something that last year has taught us is that there’s room for adaptation and innovation in how we collect data. In one of the evaluations I worked on last year in education, we weren’t able to do the research with students that we had planned to do, but what we could do that was useful for our client and their stakeholders was to use real-time data about the evolving economic and social situation, to understand the context in which they were implementing the program and to confirm the relevance of the program in that changing context.

And so we adapted what we focused on and how we got that information. We also ran a series of online workshops to get information quite quickly from the people who were implementing the program, to understand how they had adapted it to the circumstances. This year, as things stabilise, we hope we’ll be able to start talking to students, and we’ll use some of the methods that we designed to do that. So I think the question is: in the kind of situation we’re in, how do we marry up robust design with flexible, iterative, innovative research design?

Ari Sharp: Looking to the future, the third question you ask is: “what will we do with what we learned?”. What conclusions did you reach?

Annette Madvig: We must keep in mind: what’s the end purpose of monitoring and evaluation? It’s all very well to put time and money into collecting data and talking to stakeholders, but it’s not really a great use of your time and money (and it’s not a particularly great way to endear yourself to your stakeholders) if you do nothing with the data that’s collected – if the report just sits on the shelf. So really we’re interested in monitoring and evaluation because it informs the decisions that organisations make about what to do next. It’s really about bringing that evidence-base into program planning and program design, and using what we’ve learned to improve our programs and do something better and differently in future.

And so, again, in this changing world that we’re living in, there’s such a great opportunity for organisations to say, “things have changed: we are acting in the world, we’re doing the best we can. We’re being adaptive and proactive. What can we learn from all of that?” – and even better – “What does that mean for us next? What can we do to help our communities and organisations thrive this year and in the following years, when we know that we will still be adapting to the effects of the pandemic?”.

Ari Sharp: Annette Madvig, thanks so much for talking to NousCast Shorts.

Annette Madvig: Thanks Ari, it’s a pleasure.

Ari Sharp: That was Annette Madvig, a Director at Nous Group. You can find Annette’s article, “Monitoring and evaluation can help us act on the lessons from a year like no other” on the Nous website. You can also contact her directly via email and LinkedIn. We’ll provide links in the episode notes.

Before we go, some information on us at Nous Group: for more than 20 years, Nous has offered a broad consulting capability that allows us to solve our clients’ most complex strategic challenges and partner with them through transformational change. We’ve contributed to significant agendas in Australia, the United Kingdom and Canada, including shaping the future of higher education, advancing Indigenous reconciliation, digitally transforming service delivery and developing models of regulation for the future economy. You can find out more about Nous, meet our people, and read our insights at our website.

That’s all for this edition of NousCast Shorts. You can subscribe on your favourite podcast app so you don’t miss an episode. And if you like what you hear, please rate and review us to help other people find the show.

Thanks for listening. We’ll catch you next time.