
Learning Checks: Connecting evidence, implementation and action

During a workshop with Read to Kids partners in 2017, the R4D team threw this rather unorthodox graphic up on the screen.

Two data points struck one of the participants as she surveyed the lollipops of information: an event that just four parents had attended produced three readers who were engaging with the Read to Kids reading app on a regular basis; by contrast, an event with a crowd of 80 parents had yielded not even a single reader who engaged regularly with the app. The participant pulled out her diary and looked back at the activities her team had done with the parents during each of these events.

“I knew it!” she exclaimed, as she nudged her colleague in excitement. She saw a pattern no one else had recognized: The higher-quality interaction with the smaller group of four parents seemed to have outperformed the larger scale, but lower-quality, interaction with the group of 80 parents.

Because she had been in those rooms with those different-sized groups of parents, this participant had a hunch about what the data could be reflecting, and she felt confident in sharing that insight because she was doing so in a setting created just for this purpose: A Learning Check.

As R4D’s Evaluation and Adaptive Learning practice team developed its methodology, we noticed a gap in how our partners learned together. Our projects have always included presentations to our funders about outcomes and presentations to other stakeholders to disseminate research findings. They have always included time for our partners to pause and reflect on implementation successes and challenges. What these discussions lacked, though, was space for participants to build strong relationships, to react to data as soon as it became available and to turn insights into actions.

We developed Learning Checks to fill this gap. A Learning Check is an intentional and specifically timed opportunity to figure out what we know and decide what to do about it. In this context, we also try to help stakeholders think differently than they do in their day-to-day work.

For example, research teams who only look at data may have blind spots about what happens during implementation. Implementation teams may have blind spots connected to their own intuitions about why people act the way they do during project implementation – intuitions that may not be supported by the data. Giving equal weight to evidence generated by the research team and tacit knowledge shared by the implementation team builds trust, connects the dots between learning and implementation and ensures evidence-based decision making.

In this blog, we’ll walk through the different elements of successful Learning Checks.

1. The ‘What’

Reflection activities are a standard part of most project timelines. Learning Checks move beyond reflection in two ways: they present specific data tailored to a few focused learning questions that partners can react to, and they carry an expectation that partners will act on the insights the data provide.

A strong Learning Check always includes three key components:

  1. Sharing evidence. Our team presents findings from data collection and analysis, typically focused around three or four key questions and takeaways that are central to our partner’s upcoming decisions. The goal is not to disseminate or review every single finding the team has generated. This data sharing should avoid technical jargon and be accessible for all stakeholders in the Learning Check.
  2. Sharing tacit knowledge from implementation. Tacit knowledge – information that implementers gain from their personal experience that may not be visible in data – is essential when considering a program’s operations. Implementers should share this knowledge and experience from carrying out program activities during a Learning Check. Rather than summarizing their entire workplan, the implementing partner should respond to the three or four questions posed by the learning partner.
  3. Action planning. After all stakeholders have been able to honestly and vulnerably share their progress, the team can begin to coalesce around actionable and impactful next steps. At the end of a Learning Check, the project team should have a concrete, realistic plan for how to move forward.

2. The ‘Why’

Learning Checks, by design, are much more resource-intensive than ordinary data presentation or reflection meetings.

Even at a higher cost, we believe Learning Checks add value to a project because they:

  1. Provide a framework for decision making. Because Learning Checks center implementing partners, they generate conversation about both the existing questions that the learning partner might have uncovered and additional unknowns that implementers raise. The action planning that occurs during a Learning Check also creates additional value for implementers because it serves as a guide for future program reflection and planning.
  2. Shift power to people who don’t always have a voice in implementation. Implementing partners are often left out of the conversation about data collection and analysis, especially when funders use evidence review as a means for holding implementers accountable. Learning Checks engage community stakeholders and frontline implementers in making sense of the data, creating a more grounded and equitable analysis process.
  3. Bring alignment and understanding among partners. Learning Checks are not a forum for holding partners accountable but rather an opportunity to have honest conversations about project successes and challenges. Creating transparency between partners in this way can help to build trust in a project team and make collaboration more seamless.

3. The ‘Who, When, and Where’

Learning Checks’ focus on honesty facilitates realistic action planning, which is part of what makes them so effective. Creating this unique environment requires a discerning eye toward who is included in a Learning Check, as well as when and where it takes place.

Learning Checks should:

  1. Mix and match key stakeholders throughout the Learning Check to get the most out of each session. It’s important to curate the right mix of funders, implementers, community leaders and community members in a Learning Check. Selecting different participants for certain sessions shifts whose voices are prioritized and helps create honesty in conversations around accountability, program effectiveness, implementation uptake and more. For example, if a funder is present, the tone of the room changes and significant deference may be paid to their ideas, while junior staff at the implementing partner organization who deeply know the intervention may hold back their highly relevant experiences.
  2. Convene stakeholders at the specific moment when data can inform decision-making. The Learning Check will require prioritizing just a few key data points that provide insights into timely questions. The specific data and learning questions to focus on should be selected based on what decisions program staff and funders need to be making at that time. For example, in the lollipop diagram that began this blog post, the team was deep into implementation, but they were not seeing readership hit the outcome levels they had targeted, so that Learning Check was all about effective engagement strategies. Earlier on in that same pilot, the Learning Check focused on getting the right content onto the mobile app in the first place. It is all about choosing the right question for the right time.
  3. Occur in-person. Learning Checks are an important time for all program stakeholders to meet face-to-face. Virtual Learning Checks often hamper implementing partners’ ability to contribute due to connectivity challenges, limited access to presentation software and inherent barriers to virtual relationship-building. If a Learning Check must be held virtually, we recommend including an orientation on virtual facilitation tools and ice breaker activities. It is also useful to identify an in-person champion, ideally one from the implementation team, to reinforce virtual participation.

4. The ‘How’

Learning Checks should be participants’ most fun calendar event of the month. But for a Learning Check to be fun, it must be well-organized, equitable and allow enough time for meaningful conversation.

  1. Set clear expectations but allow for flexibility. Everyone involved in a Learning Check should know who is participating, why and what will be accomplished. Facilitators should be prepared to pivot, sometimes mid-session, to keep the conversation moving in a productive direction.
  2. Prioritize local partner engagement and accessibility. Local partner facilitation and accessibility are key to engaging Learning Checks. In addition to sharing facilitation duties between stakeholders, Learning Checks should include clear data visualizations that allow teams to react to evidence and generate their own takeaways.
  3. Allow enough time for meaningful connection and work. It can take time for partners to understand the purpose of a Learning Check, gel with the group of people present and digest research findings. If possible, allow for multiple sessions across a few days to allow participants enough time to think deeply and respond richly to the Learning Check.

At the beginning of this blog, I shared an anecdote about a Learning Check that inspired an important realization for one of its participants: that facilitating more intimate conversations was more effective for their program than holding large sessions. Junior staff at the partner organization were able to help decision-makers understand the data and redesign engagement activities.

That partner pivoted to holding smaller events and focusing on follow-up conversations with attendees, and this modified intervention yielded sustainable community engagement and led to the establishment of long-term relationships with local community organizations.

This is the power of a Learning Check: Convening a variety of stakeholders — including those who are often overlooked — at the right time to share learnings and data, to reflect on implementation and to plan for the future can be a driver for long-term program success.
