
Meaningful stories for mutual benefit: The importance of participatory evaluation for community organisations


Idea In Brief

The impact of place-based initiatives is hard to measure

Participatory evaluation allows local insights and experience to inform data collection and analysis.

Funders should consider five factors

Co-design with community, tailored engagement and support, shared value created through exchange, inclusive sense-making, and a willingness to listen are all crucial to evaluations.

Prepare for evaluation during the design phase

Embedding the principles of participatory evaluation early is vital for capturing valuable insights and driving continuous improvement.

Karen and Janine chatted to us over a cuppa in Bruce Rock, a small farming community 250km east of Perth. The pair co-founded Wheatbelt and Beyond Youth Mentoring (WBYM), an initiative connecting Wheatbelt District High School secondary students with mentors from outside their community.

WBYM received grant funding through the Future Drought Fund’s Helping Regional Communities Prepare for Drought (HRCPD) Initiative. This flexible funding has helped them to expand their program to reach more young people.

The funding came with all the usual grant reporting and evaluation requirements. Karen and Janine emphasised that busy community leaders like themselves value simple and efficient reporting processes, especially given everything else on their plates at any given time. They also stressed the importance of outcomes, telling us that grant reporting and evaluation should be rooted in stories about the impact of funding, rather than treated as mere administrative requirements.

Such stories of impact do more to justify funding than simplistic box-ticking exercises do. They also allow organisations to reflect on what they have achieved through their work. We have learnt that participatory evaluation is the best way to unearth these meaningful insights, which community organisations themselves can learn from, while minimising the administrative burden on those same organisations.

Prioritising place by listening to locals

There is growing scrutiny around government grant spending and mounting pressure to demonstrate value and effectiveness. At the same time, governments are increasingly implementing place-based policies that rely on community organisations to deliver localised outcomes.

These policies are commonly designed to address the specific circumstances of a region, particularly in response to complex, systemic issues. For example, the HRCPD Initiative is a place-based program intended to allow agriculture-dependent communities to design and deliver projects that meet local needs and build social resilience in the face of drought.

Although place-based initiatives are not new, their impact remains hard to measure. This is due to their complexity, the range of stakeholders involved, the diversity of activities they fund, and the long-term nature of the outcomes. To overcome these challenges, adaptive and participatory monitoring, evaluation and learning frameworks are essential.

Participatory evaluation is an approach that actively involves the stakeholders of a program or policy in the evaluation process. This approach allows local insights and boots-on-the-ground experience to inform both data collection and analysis. In addition to yielding more relevant and meaningful data – locals like Karen and Janine know what to look for, who to talk to, how to define their own success, and how best to measure it – it also fosters local capacity and ownership over the results.

Evaluators, whether funders or independent contractors, can then aggregate this locally collected, locally analysed, project-specific data to assess the overall success of the program. This collaborative process must be designed and conducted with a great deal of respect and patience, so that it elicits genuine input from community members and delivers genuine, meaningful returns to them. The last thing you want, for them or for the project, is for the people you rely on for insight to feel that your engagement is tokenistic.


Co-designing, tailoring engagement, and bringing communities along for the ride

We are currently working with the Foundation for Rural and Regional Renewal (FRRR), the Australian Rural Leadership Foundation (ARLF), and the Commonwealth Department of Agriculture, Fisheries and Forestry (DAFF) to conduct a participatory evaluation of the HRCPD Initiative. This is what took us to Bruce Rock, and to that cuppa, in the first place. Recognising that the initiative's participants are busy regional community leaders, we designed an evaluation approach that minimised data collection burdens while delivering genuine value.

We have done similar work with a variety of organisations, including a recent evaluation of place-based investments in youth programs across NSW. Our participatory approach blends robust evaluation with capability building, placing outcomes for regional participants at the centre of the process. Based on our experiences with place-based program evaluation, we believe there are five factors that funders need to pay attention to.

Co-design with community – Involving community in the evaluation design process is essential as it leverages their practical knowledge and lived experience. By incorporating co-design practices, funders can tailor program outcomes to specific local contexts while ensuring alignment with their strategic policy goals. We conducted capability-building Evaluation Working Sessions with HRCPD Initiative delivery partners to define what project-level success looks like and to test our evaluation approach. This collaborative process ensured that the evaluation accurately reflects the diverse, place-based objectives of each project.

Tailored engagement and support – Community organisations have varying levels of monitoring, evaluation and learning maturity. Funders should provide tailored engagement and support to accommodate these differences, respecting the multiple roles regional community leaders juggle. Adequate training is essential to build capabilities, promote efficient data collection and reporting, and foster a sense of autonomy and self-governance. We developed tailored data collection tools and guidance for HRCPD Initiative delivery organisations, empowering them to accurately and efficiently collect project-level outcome data. The materials aligned with the overall evaluation outcomes while offering guidance on tailoring approaches for local contexts and specific project impacts.

Shared value created through exchange – Many community organisations may initially see evaluation as yet another administrative burden. To overcome this perception, funders must demonstrate how monitoring and evaluation activities create value. For example, we worked with FRRR and ARLF to streamline processes to avoid duplication, illustrate how participants can use the data they collect to drive program improvement or inform future initiatives, and share the evaluation reporting with participants so they can leverage the findings for future funding.

Inclusive, transparent sense-making – Participatory evaluations have the potential to bring diverse stakeholders together. In the case of the HRCPD Initiative evaluation, we facilitated outcome summit workshops to collaboratively test, validate, and refine the evaluation’s findings with participants. The workshops allowed participants to reflect on what worked well and identify areas for improvement. This process achieved two key outcomes: it promoted knowledge sharing and program improvement among the funded organisations, and it made the evaluation insights more relevant, meaningful and actionable.

Willingness to listen to partners and adapt – It is essential to listen to partners and then adapt approaches based on their feedback. By fostering a culture of active listening, funders and evaluators can gain valuable insights and make necessary adjustments to maximise the effectiveness of monitoring and evaluation. Nous has worked in close partnership with FRRR, ARLF and DAFF to deliver an iterative approach that has enabled continuous improvement. For example, we have adapted when and how we visit regions to collect data based on feedback from the local delivery partners. Across this process we have shared lessons learned and collaborated with diverse partners to minimise the burden on regional stakeholders.

Feedback has confirmed the mutual benefits of a participatory evaluation approach. The evaluation has focused on what matters most to measure while supporting community organisations to effectively gather the data they need. All organisations now have access to a repository of information and examples they can draw on for the HRCPD Initiative and future programs, enhancing their capacity to evaluate their impacts meaningfully.

You can read more about our evaluation in the mid-term evaluation report for the HRCPD Initiative. The report confirmed that the initiative is making good progress towards strengthening community capacity and the social resilience of agriculture-dependent communities to prepare for the impacts of drought. The full report is available online.

Embedding at the beginning

Adopting a participatory approach such as ours requires a significant investment of time. We recommend starting at the very beginning, embedding monitoring, evaluation and learning during the program design phase. This foundational step is vital for fostering a culture of evaluation that captures valuable insights and drives continuous improvement. Having a trusted partner with evaluation expertise can ensure that you’ve got your settings right before you get properly underway.

Our work on the HRCPD Initiative evaluation and other evaluations like it highlights what can be achieved through community-led, participatory, and evidence-informed action. While there is still much work ahead for people like Karen and Janine, with HRCPD Initiative projects rolling out across Australia until June 2025, energy and motivation remain high as communities actively collaborate to achieve lasting outcomes. The pair have told us that their role in the evaluation has given them valuable insights into outcomes. More importantly, they feel more confident in their ability to advocate for their communities.

Get in touch to discuss participatory evaluation and how it can benefit your organisation.

Connect with Carlos Blanco and Sally Higgins on LinkedIn.