

CHAPTER 7






Refining the Professional Practice Model Within a Health System: Adaptation


 





KEY WORDS






Adaptation, refining, adapting a professional practice model (PPM), microsystems, reflection


 





OBJECTIVES






By the end of this chapter, readers will be able to:


1.  Evaluate the process of using evidence to refine and adapt a professional practice model (PPM) for local use


2.  Analyze leadership’s role in engaging clinical microsystems


3.  Describe how the cycle of action and reflection contributes to rapid integration of PPMs


REFINING AND ADAPTING PROFESSIONAL PRACTICE MODEL IMPLEMENTATION BASED ON EVIDENCE


Using the best available evidence to inform the implementation process takes the subjectivity out of decision making, allowing the ensuing processes to evolve from the data rather than from individual opinions.







In a project such as the integration of a professional practice model, summative data are crucial but insufficient to meet end users’ periodic needs for information that speaks to real-world feasibility, identifies the potential influence of contextual factors, gauges how project participants (patients and nurses) respond, and informs the refinements and adaptations necessary to achieve optimal integration.






Using formative evidence helps to address potential implementation weaknesses. For example, failure to implement a strategy as planned, or the concurrent implementation of another system-wide initiative, may create unintended barriers that impede short-term goal attainment. Used this way, formative evaluation data help to identify discrepancies between the plan and how it was operationalized, surfacing influences that may not have been anticipated at the outset. As Hulscher, Laurant, and Grol (2003) note, formative evaluation data allow the actual exposure to the strategy to be measured, describe the experience of those exposed, and focus attention on the dynamic context within which implementation is taking place. Finally, formative evaluation data provide information to communicate to stakeholders, allowing the project to better “tell its story” during implementation rather than waiting until the end.


Although modifications and adaptations are common during large-scale change projects, using data to make judgments about how the implementation process works in practice is not necessarily commonplace. Making decisions in health care, especially about professional practice, is complex and awkward, to say the least. Most decisions have important consequences and involve much ambiguity, sometimes leading to disagreements or, worse, the abandonment of a course of action. In general, large-scale projects and their outcomes are enhanced by responding to various types of data that lead to informed decisions about ongoing implementation strategies, including refining or adapting them to meet the requirements of the local environment. Using data in this way facilitates integration by efficiently enabling team members to assess progress toward goals, respond to stakeholder and organizational needs, reallocate resources, and enhance original strategies. Although the original strategies provided the impetus to initiate the integration, ongoing evidence helps to improve the implementation process by refining and adapting strategies as they are applied in varied contexts. Through this process, the implementation committee “learns” how the strategies are received, accepted, and used, and whether stakeholders’ expectations are met. In essence, the adaptation process generates new application data that, if used appropriately, inform the ongoing project.







The goal of refining implementation strategies is to improve or perfect them so that any risks or barriers can be acknowledged and addressed, enhancing the quality and efficiency of project outcomes. Adapting implementation strategies, on the other hand, refers to making informed decisions to change or alter them, to incorporate new ideas, or even to abandon original implementation strategies.






This decision-making process is enhanced when evaluation data are applied because their objectivity allows for more straightforward selection of alternatives, stimulates new ideas or lines of questioning, eliminates preconceived barriers (such as assumptions about unique patient populations), and engenders a sense of community around the professional practice model (PPM) through ongoing organizational learning.


Adapting implementation strategies is a dynamic and participatory process. It must preserve the integrity of the PPM despite differences in local circumstances that may legitimately require important variations. In adapting a particular strategy, consideration is given to local situations, such as specific patient needs, priorities, policies, and resources; to scopes of practice within the local system; and to the fit within existing models of care delivery in the targeted setting. For example, in the care of children and families, family-centered care is a common overriding conceptual framework. Adapting the selected PPM to fit within this framework is necessary to improve its uptake.







Being able to refine and adapt to local differences requires a willingness to critically evaluate ideas and performance as individuals and as teams.






Effective implementation committees must respond to feedback about the implementation process itself: how well it is being accepted and used by nurses, how well it is progressing, how well the team is collaborating, and how the context is facilitating or hindering the plan. Implementation committees also need to consider the project outcomes in terms of their quality and from the stakeholders’ perspective. Asking questions such as the following is necessary:



  Is value being delivered?


  Are implementation strategies feasible in the real world?


  Is integration of a reliable, adaptable PPM steadily progressing?


  Is the team working well together?


  How do organizational attributes contribute to or detract from the implementation plan?


Implementation committee members, particularly nurse leaders, are key to this process because of the information they supply as they regularly consolidate and combine evaluation data with their understanding of the situation (i.e., insights regarding particular units and their patient populations). Such information can then become actionable knowledge (as it is synthesized, judged according to its merits, and used to generate possible alternatives) that informs decisions (adaptations) about implementation strategies. Such decisions can take the form of remediation; tailoring implementation strategies to meet individual units’ needs (such as increased education or unit-specific processes); increasing or decreasing the involvement of staff members and patients in the process; setting new objectives; and identifying areas where nurses need to strengthen their own knowledge or skills. Feedback that is acted on in each of these areas, at the end of each evaluation period and at the end of the project, helps the team effectively refine strategies and adapt to changes required by units, individuals, and patients, thereby shaping optimal integration (Figure 7.1).


Considering evaluation data in this manner requires some agility: the ability to see stakeholder value, rather than the implementation process itself, as the goal.







Although implementation and evaluation are typically perceived as static, the context in which they function is dynamic and complex. Thus, project agility is necessary to implement strategies, to explore evaluation data, to mindfully consider this data in light of system knowledge, and to contemplate multiple alternatives to improve strategies or make actionable changes. Project agility reminds us that implementation strategies, although important, are not untouchable.






Implementation strategies are meant to be guides, allowing for some uncertainty, and should be elastic enough to permit refinement and adaptive action so that situational or changing requirements (including corrections to the process) can easily be accommodated. Implementation committees, then, continuously refine and adapt implementation processes while remaining true to the ultimate integration goals. Agile committees embrace and respond to contextual variations, are flexible and efficient, and focus on the end user.




Figure 7.1 Process of using data to refine and adapt the implementation plan.







Courage (to explore data, which is often an untested skill) and humility (to recognize mistakes, improve original ideas, and alter strategies based on the situation) are attributes of agile project teams.






Even when implementation committees use data to refine and adapt implementation plans, several organizational factors can impede the process. First, the accuracy and accessibility of data, together with the technical support available and the evaluative skill of members, affect the implementation team’s ability to turn data into valid information and actionable knowledge. Without high-quality data, and perhaps evaluation consultation assistance, data may become misinformation or lead to invalid inferences. As an example of the former, data from a questionnaire that was poorly administered (yielding low response rates) on a particular unit might misinform team members about patients’ satisfaction with the PPM. An example of the latter is an incomplete understanding of the statistics used in interpreting the evaluation data, leading to the erroneous conclusion that nonsignificant changes in pre–post-test scores were meaningful indicators.
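To make that second pitfall concrete, the brief Python sketch below is hypothetical; the survey scores and the 0.05 threshold are illustrative assumptions, not data from an actual evaluation. It shows how a committee, ideally working with a statistician or evaluation consultant, might check whether an apparent pre–post change is statistically meaningful before treating it as evidence of improvement:

# Hypothetical example only: unit-level PPM survey scores (1-5 scale)
# from the same eight nurses before and after an implementation period.
from scipy import stats

pre_scores = [3.8, 4.1, 3.5, 4.0, 3.9, 3.7, 4.2, 3.6]
post_scores = [3.9, 4.2, 3.6, 4.0, 4.0, 3.8, 4.1, 3.7]

# A paired t-test is appropriate because the same respondents were measured twice.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

if p_value < 0.05:
    print(f"Change appears statistically significant (p = {p_value:.3f}).")
else:
    # A nonsignificant result should not be reported as a meaningful gain.
    print(f"Observed change may be due to chance (p = {p_value:.3f}); "
          "interpret cautiously before refining or adapting strategies.")

The specific test and threshold would, of course, depend on the evaluation design; the point is simply that significance is verified before the data are acted on.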


Second, the process of refinement and adaptation is not necessarily as clean or easy as Figure 7.1 depicts. Administrative pressures and internal motivation contribute to tension about the use of evaluative data. For example, system policies for reporting results, as well as rewards and sanctions based on performance, create incentives and pressures to examine and use particular data, especially those related to resource use. Furthermore, the intrinsic desire to evaluate and improve individual performance may contribute to data use. One implementation committee member may be inspired by the evaluation feedback and become motivated to pursue it; conversely, another may feel overwhelmed by the data, diminishing his or her enthusiasm for applying it.


Third, the timeliness of data, particularly delays associated with receiving results, affects committee members’ ability to use the information for decision making. In contrast, the immediacy of results may enable their use throughout a project. The availability of evaluation data results at multiple points in time also enhances their utility relative to project objectives.


Fourth, committee members’ individual capacity and the support available to them enable data use: preparation and skill in interpreting data, formulating questions, and developing solutions; access to professional development; and support from individuals who are skilled in sorting data. Obviously, lack of time to synthesize and interpret data also limits data use. Deciding how to act on implementation results requires time that most nurses do not have and few health systems allocate. Protected time to regularly examine and reflect on implementation data is often missing from integration projects such as this.


Fifth, the culture and leadership within a health system also influence patterns of data use. For example, leaders with strong commitments to data-driven decision making, as well as norms of openness and collaboration, foster data use. On the other hand, in settings where beliefs about project feedback favor privacy, the collective examination and use of data are constrained. Many leadership implications follow from these factors, such as:



  The presence of data does not guarantee they will be used to drive refinement or adaptation of the implementation processes.


  Using various types of data collected at multiple points in time promotes informed decisions.


  Equal attention must be paid to data analysis and data decision making. These are two different behaviors, and decision making based on data is often more challenging, requiring more creativity and more time.


  High-quality data that is timely and accessible is necessary to create user enthusiasm.


  Internal motivation and system incentives may help or hinder data-driven decisions.


  Encouragement (capacity building) and technological support to aid in data use must be provided.


  A culture of inquiry must be fostered (Melnyk, Fineout-Overholt, Stillwell, & Williamson, 2009) in which nurses actively question nursing practice in a safe and encouraging atmosphere, the pursuit of best evidence is routinely expected, nurses are knowledgeable about and committed to the use of data for decision making, diversity of ideas is appreciated, practice improvement is part of the routine, and opportunities for reflective dialogue are provided (Duffy et al., 2015).







In an integration project of this nature, where ultimate practice change is the goal, comprehensive approaches are required at different levels in the system to adapt to new ways of thinking about and practicing nursing.






Such comprehensive approaches to implementation projects are difficult to accomplish, even when there is good evidence to support them; thus, creating the environment to receive and then act on evidence is essential. Leaders play a crucial role in this process and often are active participants in the successful implementation, refinement, and adaptation at the bedside.


Foremost is leadership’s comfort with and use of evaluation data to guide decision making. At times, this requires additional education, learning new skills, letting go of old ways, and accepting enabling resources such as consultation, training, and instrumental assistance. For example, working directly with a statistician or evaluation consultant during examination of the data can aid in its accurate interpretation.







Bringing evaluation data to meetings, focusing the committee on the data rather than on opinion, asking questions, and encouraging team members to create solutions based on the end users’ perspectives are all important leadership functions.






This last point is crucial. The strategies needed to adapt the PPM for use in the perioperative area may be quite different from those used in the neonatal intensive care unit.

