R4D authors article on ‘getting rigor right’ when it comes to evaluation

May 3, 2023

WASHINGTON, D.C. — Results for Development (R4D) has published a new paper, “Getting Rigor Right: A Framework for Methodological Choice in Adaptive Monitoring and Evaluation,” in the journal Global Health: Science and Practice. The article presents a framework for choosing the right level of rigor when testing and measuring program designs and approaches.

The field of international development has embraced the idea that programs require agile, adaptive approaches to monitoring, evaluation and learning. But considerable debate still exists around which methods are most appropriate for adaptive learning — an approach in which real-time evidence is generated to improve program impact.

Researchers have a range of proven and novel tools — including lean testing, rapid prototyping, formative research, and structured experimentation — all of which can be used to generate feedback to improve social change programs. But with such an extensive toolkit, how should one decide which methods to employ?

“Our paper builds on more than eight years of experience working with partners to get them the right evidence to make decisions at the right time,” said Christina Synowiec, R4D’s Evaluation & Adaptive Learning practice lead. “We’ve learned a lot through our work about what ‘getting rigor right’ means in practice — and we believe this is what makes our approach so unique. We care about investing in the upfront process to design research and learning questions our partners care about, and then working with them to design the methods that will help to answer those questions. And this paper summarizes what that decision-making process looks like in practice.”

The article illustrates the framework’s use in three different programs — one program focused on deploying dozens of small, short-term grants in response to the COVID-19 pandemic (COVIDaction Resilient Health Systems), one program focused on supporting a data analysis and visualization platform for health systems in the Pacific Islands (Tupaia), and one focused on a USAID initiative in Cambodia aimed at preventing unnecessary child-family separation (Family Care First Cambodia).

Each case study describes the feedback methods used and why, how the approach was implemented (including how R4D conducted cocreation and ensured buy-in), and the results of each engagement. The case studies also outline lessons learned and how to select the right kind of responsive feedback mechanism to improve social change programs.

The article was written by members of R4D’s Evaluation & Adaptive Learning practice, including Ms. Synowiec; Erin Fletcher, a former economist at R4D; Luke Heinkel, a program director; and Taylor Salisbury, an associate director.

###

About Results for Development
Results for Development (R4D) is a leading non-profit global development partner. We collaborate with change agents — government officials, civil society leaders and social innovators — supporting them as they navigate complex change processes to achieve large-scale, equitable outcomes in health, education and nutrition. We work with country leaders to diagnose challenges, co-create, innovate and implement solutions built on evidence and diverse stakeholder input, and engage in learning to adapt, iterate and improve. We also strengthen global, regional and country ecosystems to support country leaders with expertise, evidence, and innovations. R4D helps country leaders solve their immediate challenges today, while also strengthening systems and institutions to address tomorrow’s challenges. And we share what we learn so others around the world can achieve results for development too. www.R4D.org

Photo © Results for Development/Antoine Raab
