The promise of hybrid designs for the evaluation of implementation efforts




Poster session 4 Saturday: Evidence implementation and evaluation


Saturday 16 September 2017 - 12:30 to 14:00


All authors in correct order:

Mildon R1, Shlonsky A2
1 Centre for Evidence and Implementation, Australia
2 University of Melbourne School of Health Sciences, Australia
Presenting author and contact person

Presenting author:

Paul Ronalds

Contact person:

Abstract text
Background: The gap between the promise of empirically supported interventions and their successful implementation in the real world persists across a wide variety of contexts. Implementation science (IS) plays an important role in identifying barriers to address these gaps in the translation of evidence into policy and programmes. Yet high-quality, empirically valid research into which implementation strategies are effective, and which can be used to achieve more rapid translational gains in the uptake and use of these interventions, is only beginning to emerge. Improved methods are needed for the design and evaluation of implementation efforts in real-world programme delivery.

Objectives and Methods: This paper will outline methods for better testing dissemination and implementation strategies by: 1) briefly outlining common theories and frameworks in Implementation Science that serve as conceptual guides for evaluation design; 2) proposing a number of 'hybrid' effectiveness-implementation evaluation designs that can be applied to these; 3) outlining the design decisions needed to select among these; and 4) providing real-world examples.

Results: The common components of frequently used and well-published Implementation Science frameworks will be presented, coupled with proposed hybrid evaluation designs that test both intervention outcomes and implementation interventions/strategies. These hybrid designs are intended for use in real-world settings, testing implementation strategies while observing and gathering information on the intervention's impact on relevant outcomes.

Conclusions: Hybrid evaluation designs for testing dissemination and implementation strategies have the potential to speed the translation of high-quality evidence into routine practice.