Turning Evaluation Into Action
Turning evaluation findings into practical, actionable change is essential to strengthening programs and driving better outcomes for families. Despite evaluation’s focus on real-world applicability, it can take more than a decade for new knowledge to influence practice (Morris et al., 2011). Programs are likely to miss valuable opportunities for learning and growth unless they take action to bridge this gap. This insight brief shares four strategies to ensure evaluation findings are both usable and used to catalyze change.
Strategy 1. Build Strong Evaluation-Practice Partnerships
When program staff are involved in planning, implementing, and interpreting evaluations, findings are more likely to be relevant and actionable. Program staff can provide valuable insight into evaluation findings, contextualizing and applying meaningful results to practice. In contrast, when program staff are disconnected from evaluation processes, they may see data collection as a burden and miss opportunities for growth.
Evaluators must maintain objectivity and independence throughout the partnership-building process to ensure that findings are credible, unbiased, and useful. By establishing clear roles and maintaining professional boundaries, evaluators can preserve the integrity of their work while engaging program staff in meaningful ways. In other words, strong partnerships do not compromise an evaluator’s impartiality; rather, they foster mutual respect, transparent communication, and a shared commitment to improvement among everyone involved.
Strong partnerships can be fostered by—
- Implementing regular touchpoints between evaluators and program staff, including evaluator participation in team meetings.
- Fostering open dialogue and feedback loops between evaluators and practice staff to ensure consistent communication and shared learning opportunities.
- Including program staff perspectives during all phases of the evaluation process to reflect their priorities, expertise, and voice, as appropriate.
These partnership principles are echoed in Ask an Evaluator: Turning Evaluation Findings Into Action, which highlights how engaging program staff throughout evaluation planning, implementation, and interpretation helps ensure findings are relevant, trusted, and more likely to inform practice.
Strategy 2. Cultivate a Culture of Inquiry
Evaluation findings are most likely to be used when generated in a culture that values learning, experimentation, and reflection. When program leaders embrace such a mindset, staff can follow their lead. Practical strategies for building a culture of inquiry include the following:
- Host monthly data or evaluation huddles for teams to review recent findings and discuss implications for improvement.
- Encourage staff to start with small change ideas and to view those that do not work as planned as opportunities to learn.
- Train staff to ask critical questions like “Do these findings match our experience?” or “What assumptions are we testing?” to build curiosity and ownership.
- Create a safe environment for experimentation by highlighting lessons learned rather than focusing solely on outcomes.
- Incorporate reflective supervision practices that encourage staff to think critically about what’s working and why.
Strategy 3. Align Evaluation Efforts and Program Interests
Programs must use intentional strategies from the outset to ensure that evaluations explore questions program staff find meaningful. Evaluation questions generated through dialogue with program staff are more likely to uncover findings that are useful and applicable to local practice than questions developed without staff input. Practical strategies for alignment include the following:
- Develop a shared learning agenda that connects evaluation questions with continuous quality improvement (CQI) priorities and performance measures.
- Include program staff, including CQI staff, on evaluation planning teams to ensure the relevance of evaluation questions and to consider the application of findings from the start.
- Host joint evaluation-CQI planning sessions for staff to codevelop actionable learning goals and metrics.
Strategy 4. Apply Evaluation Findings Using CQI
Programs that routinely use CQI to test new strategies—including those generated through evaluation—and that view failures as opportunities for learning are better equipped to adapt and improve than programs that do not. CQI can help accelerate the adoption of evidence-based changes. Practical strategies for applying evaluation findings through CQI include the following:
- Cultivate leadership support for a structured process to apply and refine strategies identified through evaluation. This process helps programs actively use their data not just to monitor progress but also to drive change.
- Involve program staff in reviewing evaluation findings and designing a CQI project to apply and test these findings in their work.
- Use rapid Plan-Do-Study-Act (PDSA) cycles to test and adapt ideas based on evidence in real time; for example, a team might pilot one small change, review the results, and refine the approach before expanding it.
Translating evaluation findings into practice is essential for programs to continuously improve and remain responsive to the needs of families and staff. CQI provides a structured, data-driven framework to do just that—turning abstract insights into tested, measurable improvements. Success, however, depends on more than tools and strategies; it requires strong partnerships, a culture of inquiry, and intentional alignment between evaluation and improvement efforts.
Reference
Morris, Z. S., Wooding, S., & Grant, J. (2011). The answer is 17 years, what is the question: Understanding time lags in translational research. Journal of the Royal Society of Medicine, 104(12), 510–520. https://doi.org/10.1258/jrsm.2011.110180
Additional Resources
Dunst, C. J., Trivette, C. M., & Cutspec, P. A. (2002). Toward an operational definition of evidence-based practices. Centerscope, 1(1), 1–10.
Leeman, J., Birken, S. A., Powell, B. J., Rohweder, C., & Shea, C. M. (2017). Beyond “implementation strategies”: Classifying the full range of strategies used in implementation science and practice. Implementation Science, 12(1), Article 125. https://doi.org/10.1186/s13012-017-0657-x
Reynolds, S., & Bradie, B. (2023). Implementation science toolkit for clinicians: Improving adoption of evidence in practice. Dimensions of Critical Care Nursing, 42(1), 33–41. https://doi.org/10.1097/DCC.0000000000000556
Related Resources
For a practice-based perspective on translating findings into improvement, see Ask an Evaluator: Turning Evaluation Findings Into Action, which explores how evaluators and program staff can work together to make evidence actionable.
To learn more about partnering with JBA, visit our Working With Us page.