CHAPTER 16
Frameworks and Methods for Implementing Interventions

Health interventions that demonstrate effectiveness (also called evidence-based interventions) have to be integrated into practice in order to benefit clients experiencing the health problem. Implementation is the actively planned and deliberately initiated effort to promote the uptake (i.e. adoption) and use of evidence-based interventions in real-world practice (Pfadenhauer et al., 2017). Implementation must be carefully planned and well executed to facilitate health professionals' adoption and delivery of evidence-based interventions and, consequently, clients' engagement in and enactment of treatment; the ultimate goal is improvement in clients' experience of the health problem, general health, and well-being. To this end, the implementation plan should account for the range of factors, inherent in the real world, that influence the actual use of evidence-based interventions in practice. Knowledge of the most influential factors informs the selection of strategies to disseminate evidence-based interventions to health professionals and to support them in delivering these interventions to clients. The implementation plan is executed in a multistage process that actively involves different stakeholder groups, including clients, health professionals, and decision-makers. The success of the implementation initiative is evaluated with designs that enable the collection of data, using different methods and from multiple sources, on the factors that affect the use of the evidence-based interventions, on the fidelity with which the interventions are delivered, and on the outcomes achieved in practice. Implementation initiatives are informed by frameworks that provide a useful structure for planning, executing, and evaluating them. In this chapter, implementation frameworks are briefly reviewed, practical guidance for applying the implementation process is provided, and research designs for evaluating the success of implementation initiatives are highlighted.

16.1 IMPLEMENTATION FRAMEWORKS

Recognizing the importance of theory in guiding implementation initiatives, a large number of implementation frameworks have been, and still are being, developed. Examples include the Practical Robust Implementation and Sustainability Model (PRISM), the Consolidated Framework for Implementation Research, Normalization Process Theory, the General Theory of Implementation, Diffusion of Innovation Theory, the Ottawa Model, the Behavior Change Model, and Promoting Action on Research Implementation in Health Services. Recent reviews of implementation frameworks (Harris et al., 2017; Moullin et al., 2015; Nilsen, 2015) classified them into three main categories: determinants, process, and evaluation frameworks. Each category consists of frameworks that emphasize one aspect of implementation more than the others.

16.1.1 Determinants Frameworks

Determinants frameworks identify the range of factors, operating at different levels within the healthcare system, that may influence the adoption and implementation of evidence-based interventions, as well as the effectiveness of these interventions in practice (Damschroder, 2020). The factors represent a set of characteristics, inherent in the context of implementation, that enable (i.e. facilitators) or hinder (i.e. barriers) the use of evidence-based interventions in practice. The factors manifest at multiple levels. At the client level, clients seen in practice may experience the health problem, and present with sociodemographic and health characteristics, in ways that differ, to varying extents, from those reported for clients who participated in the research studies that demonstrated the interventions' effectiveness. These differences in the experience of the health problem and in characteristics between the two cohorts of clients raise concerns about the applicability of the evidence generated in research to practice (as mentioned in Chapter 14); the limited relevance of evidence to practice can potentially influence the adoption of the evidence-based intervention. Clients seen in practice may also be aware of the evidence-based intervention and its benefits and therefore demand or seek it, which motivates implementation initiatives aimed at making the intervention available and accessible in the local practice. Alternatively, clients may perceive an evidence-based intervention as unacceptable and consequently decline its uptake when offered in the local practice setting.
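To make the notion of determinants more concrete, the brief sketch below shows one way that facilitators and barriers identified during an assessment of the local context could be recorded and grouped by level. It is a minimal illustration only; the level labels, example entries, and class design are assumptions made for this sketch, not elements of any particular determinants framework.

```python
from dataclasses import dataclass, field

@dataclass
class Determinant:
    """A contextual factor that may enable or hinder implementation."""
    description: str
    level: str       # e.g. "client", "health professional", "organization" (assumed labels)
    direction: str   # "facilitator" or "barrier"

@dataclass
class DeterminantAssessment:
    """Collects the determinants identified for a local practice setting."""
    determinants: list[Determinant] = field(default_factory=list)

    def add(self, description: str, level: str, direction: str) -> None:
        self.determinants.append(Determinant(description, level, direction))

    def barriers_at(self, level: str) -> list[Determinant]:
        """Return the barriers recorded at a given level."""
        return [d for d in self.determinants
                if d.level == level and d.direction == "barrier"]

# Hypothetical entries, for illustration only
assessment = DeterminantAssessment()
assessment.add("Clients perceive the intervention as acceptable", "client", "facilitator")
assessment.add("Clients differ from research participants in symptom severity", "client", "barrier")
assessment.add("Limited time to deliver the intervention during visits", "health professional", "barrier")

print(len(assessment.barriers_at("client")))   # -> 1
```

In practice, such an assessment would be compiled with stakeholder input; the structure simply reflects the idea that determinants operate at different levels and in different directions.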
16.1.2 Process Frameworks

Process frameworks specify the stages, and the steps within them, for planning and executing implementation initiatives (Damschroder, 2020; Nilsen, 2015). The process aims to facilitate change in practice (i.e. adoption and use of evidence-based interventions) at the individual (i.e. health professional) and collective (i.e. healthcare team members, support staff, decision-makers) levels (Hartveit et al., 2019). Despite some variability in its description, the implementation process comprises two main stages: pre-implementation and implementation (Moullin et al., 2015). Recent emphasis on the importance of accounting for the features of the local context in which the evidence-based intervention is implemented has contributed to the embrace of a collaborative process, in which stakeholder groups encompassing clients, health professionals, and decision-makers are engaged in preparing for and monitoring the implementation of evidence-based interventions (Aarons et al., 2017; Glandon et al., 2017; Pfadenhauer et al., 2017; Wandersman et al., 2016). The steps of this collaborative approach are reviewed next, and practical guidance for carrying them out is presented in Section 16.2.

16.1.2.1 Stage 1—Pre-implementation

The pre-implementation stage focuses on preparing for the implementation of an evidence-based intervention. The preparation involves steps to: (1) explore the local need for change in practice and local stakeholder groups' views on the evidence-based intervention; (2) determine the need for adapting the evidence-based intervention to the local practice context, and engage stakeholder groups in the adaptation; (3) assess the facilitators of and barriers to implementation; and (4) engage local stakeholder groups in selecting strategies or techniques to facilitate implementation. Adaptation of the evidence-based intervention is advocated to improve its relevance and applicability to the local practice context. Adaptation promotes favorable perceptions of, and buy-in for, the evidence-based intervention among local stakeholder groups, which enhances its adoption (Aarons et al., 2017; Bach-Mortensen et al., 2018; Harvey & Gumport, 2015). Choosing training and support strategies that meet health professionals' learning needs and styles is expected to build their personal competence in implementing the evidence-based intervention.
Selecting techniques that address the facilitators and barriers specific to the local context is anticipated to promote collective implementation of the intervention. The end result is implementation of the evidence-based intervention with flexible fidelity (see Chapter 9).

16.1.2.2 Stage 2—Implementation

The implementation stage involves the actual roll-out of the plan. This entails providing training and support to the health professionals responsible for implementing the evidence-based intervention; carrying out other implementation strategies (e.g. changes in policy); and monitoring the delivery of the evidence-based intervention in practice (Nilsen, 2015; Pfadenhauer et al., 2017). A process evaluation is conducted to monitor the delivery of the evidence-based intervention and to identify challenges in implementation, which should be appropriately and promptly addressed.

16.1.3 Evaluation Frameworks

Evaluation frameworks delineate the outcomes expected of implementation initiatives. These outcomes form the basis for designing research studies to evaluate the success of the implementation initiatives (Damschroder, 2020; Nilsen, 2015). Generally, two sets of outcomes are specified. The first set entails outcomes expected as a result of the implementation strategies aimed at promoting the adoption of the evidence-based intervention in practice. These outcomes manifest as changes in health professionals' perceptions of the evidence-based intervention (e.g. its acceptability), their knowledge of and competence in providing the intervention, their adoption of the intervention (i.e. its actual use in practice), and its appropriate delivery (i.e. with fidelity) in daily practice. Changes in these outcomes occur, and are assessed, at the individual health professional level and at the collective level; the collective level is often represented by the number or percentage of health professionals implementing the evidence-based intervention within the local practice setting. The second set of outcomes reflects the benefits to clients anticipated from the evidence-based intervention. These include improvement in clients' experience of the health problem, general health, and well-being. These outcomes are assessed at the individual client level and may be reported at the practice level as the percentage of clients exhibiting improvement (Fixsen et al., 2019a; Proctor et al., 2011).
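As a minimal numerical illustration of these two sets of outcomes, the sketch below computes a collective-level adoption rate among health professionals and a practice-level percentage of clients showing improvement. The counts and variable names are hypothetical and serve only to show how the outcomes described above could be summarized.

```python
def percentage(numerator: int, denominator: int) -> float:
    """Convert a count into a percentage; returns 0.0 when the denominator is zero."""
    return 100.0 * numerator / denominator if denominator else 0.0

# Hypothetical monitoring data, for illustration only
professionals_in_setting = 40                 # health professionals in the local practice
professionals_delivering_intervention = 28    # those who adopted the evidence-based intervention
clients_offered_intervention = 150            # clients who received the intervention
clients_showing_improvement = 96              # clients exhibiting improvement on the targeted outcomes

adoption_rate = percentage(professionals_delivering_intervention, professionals_in_setting)
improvement_rate = percentage(clients_showing_improvement, clients_offered_intervention)

print(f"Collective-level adoption: {adoption_rate:.1f}% of health professionals")
print(f"Practice-level improvement: {improvement_rate:.1f}% of clients")
```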
16.1.4 Selection of a Framework

Although implementation frameworks are categorized as focusing on determinants, processes, or outcomes, the categories overlap; that is, some frameworks cover combinations of determinants, processes, and outcomes. Selecting one particular framework may therefore not provide a full or comprehensive understanding of the complexity of implementation, thereby limiting the careful planning, conduct, and evaluation of implementation initiatives (Nilsen, 2015). Accordingly, it is advisable to develop, in collaboration with stakeholder groups, a logic model that specifies the determinants, processes, and outcomes of relevance to the local practice context of implementation. As suggested by Smith et al. (2020), the implementation logic model identifies these elements and the relationships among them.

The implementation logic model focuses on the implementation strategies. It should be supplemented with the logic model of the evidence-based intervention to comprehensively evaluate the success of the implementation initiative in improving clients' experiences. The intervention's logic model operationalizes the theory underlying the intervention, as described in Chapter 5. It guides the specification of the client outcomes (i.e. the second set of outcomes mentioned in Section 16.1.3) to be assessed in the implementation evaluation initiative, and it informs the design and conduct of the process and outcome evaluation of the evidence-based intervention. These evaluations are embedded within the implementation evaluation project, forming a hybrid or dual evaluation of the implementation strategies and of the evidence-based intervention.
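To illustrate how the elements of such a logic model might be recorded, the sketch below captures determinants, implementation strategies, and the two sets of outcomes in a simple structure. The field names and example entries are assumptions made for this sketch; they are not drawn from Smith et al. (2020) or from any specific logic model template.

```python
from dataclasses import dataclass, field

@dataclass
class ImplementationLogicModel:
    """Minimal sketch of an implementation logic model for a local practice context."""
    determinants: list[str] = field(default_factory=list)              # facilitators and barriers
    implementation_strategies: list[str] = field(default_factory=list) # strategies addressing the determinants
    implementation_outcomes: list[str] = field(default_factory=list)   # first set: professional-level outcomes
    client_outcomes: list[str] = field(default_factory=list)           # second set: client-level outcomes

# Hypothetical content, for illustration only
model = ImplementationLogicModel(
    determinants=["limited time during client visits", "strong leadership support"],
    implementation_strategies=["interactive training workshops", "revised scheduling policy"],
    implementation_outcomes=["acceptability", "adoption rate", "delivery fidelity"],
    client_outcomes=["reduced symptom severity", "improved general health and well-being"],
)
```

Keeping the implementation outcomes and client outcomes as separate fields mirrors the hybrid or dual evaluation described above.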
16.2 GUIDANCE FOR APPLYING THE IMPLEMENTATION PROCESS