Ask an Evaluator: Turning Evaluation Findings into Action
At James Bell Associates, we believe evaluation should do more than generate reports—it should help programs make informed decisions, strengthen services, and improve outcomes for children and families. Here, we explore how evaluation findings can move beyond data collection and become practical tools for learning and improvement.
We spoke with Kim McCombs-Thornton, a senior evaluator at JBA, Principal Investigator on multiple projects with the Parents as Teachers National Center, and a researcher on studies of family engagement in home visiting, about how evaluators and program staff can work together to ensure findings are meaningful, usable, and connected to day-to-day practice.
Interviewer: In your experience, why is it important to integrate evaluation findings into practice?
“I love working with program staff because they are deeply committed to the families they serve and genuinely want to help children and families have a better life,” Kim explains. “At the same time, evaluation can sometimes feel intimidating. Evaluators rely on program staff to provide data, which adds to their workloads. And evaluation can feel like a review of their performance or extra work rather than a collaborative opportunity to strengthen their work with families.”
For Kim, the key is ensuring evaluation offers something valuable in return. “Good evaluation is a commitment to staff. We are not just asking for their time, ideas, and effort; we are also working to give back useful information they can use to do their work even better. That’s the hook. That’s the area of overlap in the Venn diagram. That shared goal of improving family experience and outcomes is what brings us all together.”
When findings are intentionally shared back with program staff and discussed collaboratively, evaluation becomes far more relevant. Staff can help interpret what the results mean, explain why certain patterns may be emerging, and identify realistic programmatic changes that move programs closer to their goals.
Interviewer: What happens if we don’t make an intentional effort to do so?
Family-serving organizations often operate under significant pressure, balancing ambitious missions with limited time and resources. Reflecting on findings and identifying improvements can easily get pushed aside. “If we do not intentionally make time to review and apply findings,” Kim says, “then all of the effort that went into collecting data—staff time, trust, and findings—may not lead to any real benefit for families. Without a deliberate process for applying learning, organizations are missing important opportunities to strengthen services and advance their mission.”
Interviewer: How can evaluation feel more approachable for busy implementation staff?
Kim emphasizes that partnership is essential. Programs are far more likely to engage with evaluation when staff are included throughout the process, from planning to data collection to data interpretation. “In the planning phase, direct service staff often have insights that program management may not have,” she notes. “Engaging staff in discussions about what will be important to learn in an evaluation—and what outcomes are feasible—can yield a study that gains their buy-in from the beginning.”
The same principle applies to data collection. Understanding what staff already collect, how systems operate, and where data collection naturally fits into workflows helps reduce burden and improve quality. “When evaluators make the effort to understand staff realities, people respond positively,” Kim says. “They know their work has been seen and respected.”
That investment often pays off later when findings are shared. “I have seen staff light up when findings reflect what they have been observing all along, and they are equally eager to help explain results that differ from expectations.”
Interviewer: How can we create a culture of inquiry, where people value improvement?
According to Kim, leadership sets the tone. “When organizational and program leaders openly ask questions, reflect on what is working, and acknowledge what is not, staff feel permission to do the same.” She recalls one executive director who brought a bright blue feather boa to staff meetings. Each week, that leader would wear it while sharing a recent mistake and what she learned from it, then pass it around the room so others could do the same. “It sounds simple,” Kim says, “but it created a culture where mistakes became learning opportunities instead of something to hide.”
Interviewer: How can JBA help programs who want to use evidence from their evaluations?
Kim describes JBA’s approach as collaborative, practical, and grounded in program priorities. “We start by listening. What does the program want to learn, and why does it matter to them? Ideally, we also include family perspectives in shaping the evaluation focus. Then we work together to craft evaluation questions and a study design. JBA frequently uses mixed-methods designs that offer a 360-degree perspective—the combined quantitative and qualitative data provide a fuller picture of outcomes and experiences. We work closely with the program along the way. As findings emerge, we prepare informal briefings to share results and work with staff to understand what they mean and identify next steps for further analysis. Once results are in, discussions may focus on which findings the program wants to celebrate and which ones they want to improve upon. This is where continuous quality improvement, or CQI, becomes especially valuable: using findings not simply to understand performance, but to test changes and strengthen implementation over time.”
From Evaluation to Action
Kim’s perspective closely reflects the themes explored in JBA’s new brief, Turning Evaluation Into Action, which outlines four practical strategies for ensuring evaluation findings lead to meaningful change:
- Build strong evaluation-practice partnerships
- Cultivate a culture of inquiry
- Align evaluation efforts with program priorities
- Apply findings through CQI
Together, these strategies help organizations close the gap between learning and implementation—ensuring evaluation does what it is intended to do: support better decisions, stronger programs, and improved outcomes for families.
This piece reflects JBA’s broader approach to translating evidence into actionable strategies for implementing agencies through evaluation, technical assistance, and continuous improvement support. To learn more about partnering with JBA, visit our Working With Us page.