Capabilities

Program Evaluation

Our mixed-methods evaluation designs incorporate both quantitative and qualitative data to produce rich findings. Designs may include surveys, case studies, small-sample studies, document content analysis, secondary data analysis, structured observations, case record abstractions, and more. Qualitative methods such as interviews and focus groups allow us to probe beneath the surface and answer questions that cannot be addressed with quantitative methods alone.

Evaluability Assessment

We conduct evaluability assessments to help clients understand programs: their objectives, processes, and indicators of successful performance. Using logic models and theories of change, we identify measurable goals and map the relationships among activities, outputs, and outcomes. We assess key areas of evaluability, including service design, information and data availability, and organizational context, against objective performance measures. The results indicate whether a program has plausible objectives, is suitable for in-depth evaluation, and would meet the standards of leaders and policymakers.

Study Design

We help clients answer key questions using mixed methods tailored to their unique needs and circumstances. We customize each study design based on contextual factors such as timeline, resources, availability of baseline data and other data sources, data collection constraints, and the potential for creating a control or comparison group.

Implementation Evaluation

Implementation evaluation bridges the gap between design and implementation, revealing how a program in practice may deviate from its model and why it succeeds or fails in achieving intended outcomes. We consider the context in which programs operate and how different settings influence implementation and results. The findings explain differences in outcomes across sites in multisite evaluations, strengthen internal validity, and help prevent errors of interpretation, such as attributing outcomes to a program that was never implemented as designed. Implementation evaluation may also be used to examine and refine new programs or to identify program components that would benefit from modification or additional training.

Outcome/Impact Evaluation

We design and conduct evaluations that measure impact on participants, communities, and even providers. The findings demonstrate whether a program is achieving its intended results. Activities range from developing a theory of change and logic model to creating an evaluation plan, collecting and analyzing data, and disseminating findings. We also employ longitudinal studies to measure impact over time.

Applied Research

Applied research finds practical solutions to real-world problems. Our applied social and behavioral research activities include customized study design, data collection, data quality assurance and management, statistical analysis, and knowledge dissemination. Our specialized, multidisciplinary content experts work in flexible teams to deliver practical, actionable knowledge.
