3 ways to ensure data doesn’t sit on a shelf

Insights from Our Work in Early Childhood Development

When parents and caregivers provide nurturing care, they create the conditions and environments needed for young children to grow up healthy, happy and safe. The global community increasingly agrees that parenting programs — or interventions that provide guidance and support to caregivers around communication, psychosocial stimulation, non-violent discipline and play, among other topics — are critical investments.

And while we might agree on the why, we are still figuring out the how: how best to communicate key messages to parents, how to change attitudes and behaviors and how to do this all at scale and in a cost-effective way.

As researchers and evaluators, we’re eager to tackle these big questions — but we also know how hard it can be to produce data that doesn’t just “sit on a shelf.” Given the scarce resources currently dedicated to early childhood development services, it is particularly important that the research and evidence we produce be as useful, timely and actionable as possible. It is also important to consider who shares that evidence, since new research shows that evidence translators play a key role in helping policymakers make informed decisions.

Results for Development (R4D) recently engaged with two promising parenting programs on the cusp of expansion in Serbia and Peru.

  • In Serbia, the Program for Children and Families, Strong from the Start — Dam Len Phaka (Give Them Wings) seeks to increase children’s school readiness through support for both parents and children in home-based and community-based workshops. R4D was hired to conduct a quantitative impact evaluation to explore whether school readiness outcomes were correlated with participation in the program. Parents responded to a survey on their attitudes, perceptions, and behaviors around positive parenting practices while children participated in a direct assessment measuring topics such as emergent literacy, emergent numeracy, socioemotional development, and persistence in tasks.
  • In Peru, Cuna Más operates a home visiting service in rural communities where, each week, volunteers visit and work with caregivers and children under 3. An impact evaluation showed promising results, but the program has encountered obstacles in scaling up and sustaining its model, in part because of high turnover among its workforce. R4D, together with GRADE, conducted a largely qualitative analysis by reviewing Cuna Más documents and conducting interviews and focus group discussions with the professionals and volunteers delivering these services, with the goal of understanding workers’ motivation and satisfaction, training, support, workload, and compensation, and how these contributed to or hampered the program’s sustainability.

While carrying out these evaluations, we grappled with how to produce and present evidence that parenting programs can and will actually use. Here are a few lessons from our work in Serbia and Peru on how to collect the right evidence and share it in the most useful and appropriate ways for decision-makers.

1. From the outset, co-design research and evaluation with the end users.

Research and evaluation are frequently treated as an afterthought when programs are funded and designed, but collaboration between implementers and researchers in the early stages can pay dividends later. Programmatically, researchers can help craft a Theory of Change that is measurable and testable against program goals. And from the evaluation side, early engagement typically increases the rigor a research team can bring, for example by making it possible to build in a comparison group or collect baseline data before implementation begins.

In Serbia, for example, the Theory of Change crafted by the program team was difficult to test, and researchers were unable to randomly assign families to the parenting program, which limited the rigor of the impact evaluation. Our study in Peru had a global audience and objective in mind (to grow the knowledge base around the early childhood workforce), but we could have designed the study in closer partnership with the program, taking into account its specific challenges and information gaps at the time. This would have helped ensure that our findings and recommendations fed more directly into the program’s decision-making processes.

2. Use both quantitative and qualitative approaches.

Our quantitative evaluation in Serbia produced exciting results, but it also raised immediate questions: why, for example, did parents report a decrease in negative parenting practices (like spanking) alongside a decrease in positive parenting practices (like praise)? Qualitative data would have been expensive to collect, but we should have pushed harder to ensure that this evaluation was both qualitative and quantitative from the beginning. The good news is that program funders are currently designing a qualitative scope of work to answer some of the questions that emerged from the quantitative analysis.

In Peru, we collected a wealth of qualitative data through interviews and focus group discussions, which gave rich insight into how motivated the program’s volunteers and staff were, how prepared they felt to carry out their work, and what daily challenges made it hard for them to do their jobs well. We also analyzed budget data to understand current program costs and to estimate the costs of eventual scale-up and of the reforms we were recommending; this costing exercise proved instrumental in making our findings actionable. But we did not have hard data on workers’ performance or child outcomes to complement the qualitative data and tell us which aspects of workers’ experiences most affected program quality, what would improve their productivity and reduce turnover, and which inputs (such as in-service training, wages, or materials and resources) were more cost-effective than others. This is not to minimize the value of the data we did collect; it is critical, since we so rarely know what interventions actually cost or what frontline workers actually experience.

3. “Workshop” the report.

Publishing a report does not mean the recommendations will get used. Funders, policymakers and implementers often need additional support translating the research into action.

In Serbia, we were fortunate to have the opportunity to present these findings to the donor community and to discuss the results in detail with implementing partners to inform the upcoming year of implementation. In Peru, we presented to ministry and program officials and spent hours discussing the fine details so that they understood exactly what our research meant for them and could ask follow-up questions that our report didn’t necessarily answer.

Working with these partners in Serbia and Peru has helped us reflect on and refine our strategies for generating and disseminating evidence on parent engagement. These lessons, along with our work engaging parents in their community school’s development in Tanzania and supporting parents reading to their children using mobile technology, are building a deeper understanding of the power of parent engagement in tackling some of our thorniest development challenges.
