Selecting, adapting, and sustaining programs in health care systems
Received 30 December 2014
Accepted for publication 24 February 2015
Published 16 April 2015 Volume 2015:8 Pages 199–203
Editor who approved publication: Dr Scott Fraser
Leah L Zullig,1,2 Hayden B Bosworth1–4
1Center for Health Services Research in Primary Care, Durham Veterans Affairs Medical Center, Durham, NC, USA; 2Department of Medicine, Duke University Medical Center, Durham, NC, USA; 3School of Nursing, 4Department of Psychiatry and Behavioral Sciences, Duke University, Durham, NC, USA
Abstract: Practitioners and researchers often design behavioral programs that are effective for a specific population or problem. Despite their success in a controlled setting, relatively few programs are scaled up and implemented in health care systems. Planning for scale-up is a critical, yet often overlooked, element in the process of program design. Equally as important is understanding how to select a program that has already been developed, and adapt and implement the program to meet specific organizational goals. This adaptation and implementation requires attention to organizational goals, available resources, and program cost. We assert that translational behavioral medicine necessitates expanding successful programs beyond a stand-alone research study. This paper describes key factors to consider when selecting, adapting, and sustaining programs for scale-up in large health care systems and applies the Knowledge to Action (KTA) Framework to a case study, illustrating knowledge creation and an action cycle of implementation and evaluation activities.
Keywords: program sustainability, diffusion of innovation, information dissemination, health services research, intervention studies
Clinicians and health services researchers often design programs to address specific health problems. Despite evidence that a program may be effective, many programs become one-time, time-limited interventions. Even among interventions that are disseminated, the transfer of research findings into clinical practice is often a slow and haphazard process.1,2 This minimal translation into practice may be attributed to lack of planning for future scalability. Scaling up a successful program and broadly implementing it (or translating knowledge into action), perhaps in the context of a health care system, could maximize potential impacts on individual and population health.
We assert that translational behavioral medicine necessitates expanding successful programs beyond a stand-alone research study. Innovative programs must be intentionally developed with future scale-up and implementation potential in heterogeneous organizations. We discuss developing scalable programs, selecting an existing program that meets an organization’s needs, adapting an existing program to fit the unique organizational culture and patient populations, and sustaining interventions long-term. Once an intervention has been developed, translating it into the field (eg, turning knowledge into action) is a critical, yet often overlooked, step.1 We present an antihypertensive medication adherence and patient self-monitoring intervention as a case study because it represents two complex, but common, required behaviors.
The Knowledge to Action (KTA) Framework was developed from a review of 31 planned action theories, with the goal of providing a framework for thinking about the process and integration of knowledge creation and knowledge application.1,3 Stated differently, the theory outlines a process for exchanging knowledge among relevant stakeholders in a way that results in action.1 The KTA Framework comprises two components: knowledge creation and an action cycle.3 Each component has multiple phases. For example, knowledge creation encompasses knowledge inquiry, synthesis, and products/tools.1,3,4 A recent literature review concluded that the KTA Framework is used in practice; however, far more studies have applied the action cycle alone than have integrated the framework in its entirety.3 In the context of the KTA Framework, we present a case study demonstrating how it is possible to bridge the gap between a research intervention study (ie, knowledge creation) and action (ie, implementation in a health care system).
Scalability involves expanding a program that has been demonstrated to be efficacious on a controlled, small scale and implementing it under real-world conditions with the goal of reaching a larger population.5 The potential for scalability is important to consider when developing a new program or selecting existing programs for broader implementation. In making the transition to a broad rollout, it is critical to first assess whether the program is worthy of scaling up: how effective is it at achieving the targeted behavioral change? There may be problems with adopting an intervention before it has been clearly demonstrated to be advantageous for patients.1 If an intervention is adopted before its benefits are verified, patients may be exposed to ineffective or potentially harmful treatments.1,6 Even safe and effective interventions may require modification for scale-up. Most effectiveness trials involve samples of 300–600 people. What needs to be altered when scaling a program for 3,000 or 6,000 individuals? A program must not only accomplish the desired behavioral change, such as improving medication adherence; preferably, that change must also be maintained within an individual over time (Table 1).
Table 1 Determining appropriateness for scale-up
Next, it is important to consider the feasibility of scaling the program, given the resources required to implement and sustain it. These resources could include human, organizational, and technological resources as well as physical space. Contextual factors, such as the organization’s readiness to change, must also be taken into account. To ensure successful scale-up, addressing both resource needs and contextual factors during the planning stages of the scale-up process is imperative.
Cost is arguably one of the most important factors to consider when scaling up a program, yet many programs fail to adequately evaluate implementation cost. What are the per-patient intervention costs? Is the program cost-effective? As the intervention is scaled up, how do costs change? Even if the program is more cost-effective than standard clinical care, the health care system must be able to afford it. Finally, there must be a program evaluation plan, encompassing assessment of both cost-effectiveness and clinical outcomes.
Scalable programs must be designed with a goal of simplicity. Behavioral interventions often target complex problems with multiple health determinants, necessitating multifaceted solutions. There is inevitable tension between ensuring that a program is robust and keeping it within feasibility constraints. Complex interventions can be difficult to scale up and sustain, whereas less complex interventions tend to be less resource intensive and easier to scale up and sustain. Whenever possible, simplicity in design will increase the likelihood of maintaining fidelity.
Allowances must be made for flexibility and for changing course when appropriate (Table 2). One way to do this is through adaptive design, in which there are planned opportunities to evaluate a program while it is ongoing and to make changes to one or more specified design aspects.7 Although improper adaptations can lead to biased studies, when used properly the benefits of adaptive design may include a smaller sample size, a more efficient treatment development process, and an increased chance of correctly answering the clinical question of interest.7 Regardless of the study approach taken, gaining and maintaining stakeholder interest is another important element for ensuring sustainability. Identifying clinical and organizational leaders early on can help ensure that they share in the process and develop a sense of ownership.
Table 2 Allowing for flexibility in design
Often, because of time and resource constraints, it may be preferable to use an existing program rather than crafting one from scratch. In terms of the KTA Framework, this may mean limiting time spent in the knowledge creation stage and instead focusing on the action cycle.1 When choosing an existing program, it is important to consider the program's match, quality, and organizational resources.8 Regarding match, it is important to gauge how well the program's goals and objectives align with those of the implementing organization and its culture.8 Is the program complementary to others offered by the organization? Regarding quality, the intervention should be based on scientific evidence, including robust clinical and outcomes evaluation.8 Regardless of how attractive a program may appear, an organization must have adequately trained and available staff, financial support, physical space, and leadership support in order for it to succeed.
It is often necessary to adapt programs for unique situations. Adaptation involves modifying an existing program to make it more suitable for a particular population or to better fit with an organization’s capacity and needs without compromising its integrity.9 In the KTA Framework, this is conceptualized throughout the action cycle. For example, two phases of the action cycle are: 1) identify, review, and select knowledge; and 2) adapt knowledge to local context.1,4 Adaptation may be merited when applying a program to a new organization or community, to better fit with resources or budgetary constraints, or to better fit with local preferences or culture.9 When making adaptations, a balance must be struck between maintaining the fidelity of the original intervention while being flexible enough to meet the current needs and setting. As a starting point for successful adaptation, the core components of the original intervention must be acknowledged. Core components are essentially the active ingredients. Core components may be classified as the content of the intervention, the pedagogy of how that content is delivered, and the logistics of the implementation or delivery of the content.9 Because adaptation may apply an intervention to a more heterogeneous population, it may also increase the program’s external validity (Table 3).
Table 3 Adapting interventions
Regardless of whether an organization is developing a new program or identifying and adapting an existing one, the program's sustainability must be evaluated. Sustainability has several dimensions: sustaining the program itself over time and maintaining the improvement in outcomes. Because interventions occur in complex societal systems, sustaining an intervention may require action at many levels, ranging from knowledge use, to individual change, to community engagement, to institutional change.1,10 Several elements may predict program sustainability. Whelan et al11 assert that these elements include planning, gathering relevant evidence, seeking commitment and support, developing partnerships, identifying program champions, building capacity, embedding the program into core policy, evaluating effectiveness and outcomes, evolving and adapting, and securing funding (Table 4).
Table 4 Sustaining interventions
While many effective programs are developed, relatively few successfully transition from the research setting to real-world clinical practice. Using the KTA Framework as a guide, we assert that, with thoughtful planning, it is possible to implement and sustain interventions in clinical practice. Doing so requires designing, or selecting and adapting, interventions with the potential for future scale-up. Interventions must be reasonable to implement from a resource and cost perspective. Garnering stakeholder support and planning for long-term funding are among the critical elements for sustaining an intervention over time (Figure 1). This is critical not only for advancing the field, but also for ensuring that interventions have a lasting impact on patient care.
Dr Bosworth was supported by a research career scientist award (VA HSR&D 08-027). Dr Zullig was supported by Veterans Affairs (VA) Health Services Research and Development (HSR&D) Career Development Award (CDA 13-025).
The authors have no conflicts of interest to disclose. The views expressed in this article are those of the author(s) and do not necessarily represent the views of the Department of Veterans Affairs.
1. Graham ID, Logan J, Harrison MB, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24.
2. Agency for Healthcare Research and Quality. Accelerating Change and Transformation in Organizations and Networks II. March 2013. Available from: http://www.ahrq.gov/cpi/initiatives/ACTION_II/index.html. Accessed February 5, 2015.
3. Field B, Booth A, Ilott I, Gerrish K. Using the knowledge to action framework in practice: a citation analysis and systematic review. Implement Sci. 2014;9(1):172.
4. Graham ID, Tetroe J. Some theoretical underpinnings of knowledge translation. Acad Emerg Med. 2007;14(11):936–941.
5. Milat AJ, King L, Bauman AE, Redman S. The concept of scalability: increasing the scale and potential adoption of health promotion interventions into policy and practice. Health Promot Int. 2013;28(3):285–298.
6. Roumie CL, Arbogast PG, Mitchel EF Jr, Griffin MR. Prescriptions for chronic high-dose cyclooxygenase-2 inhibitors are often inappropriate and potentially dangerous. J Gen Intern Med. 2005;20(10):879–883.
7. Kairalla JA, Coffey CS, Thomann MA, Muller KE. Adaptive trial designs: a review of barriers and opportunities. Trials. 2012;13:145.
8. Small S, Cooney S, Eastman G, O’Conner C. Guidelines for selecting an evidence-based program: balancing community needs, program quality, and organizational resources. What Works – Wisconsin Research to Practice Series. 2007; Number 3.
9. Family and Youth Services Bureau. Making Adaptations Tip Sheet. July 15, 2011. Available from: http://www.acf.hhs.gov/sites/default/files/fysb/prep-making-adaptations-ts.pdf. Accessed October 8, 2014.
10. Swerissen H, Crisp BR. The sustainability of health promotion interventions for different levels of social organization. Health Promot Int. 2004;19(1):123–130.
11. Whelan J, Love P, Pettman T, et al. Cochrane update: predicting sustainability of intervention effects in public health evidence: identifying key elements to provide guidance. J Public Health (Oxf). 2014;36(2):347–351.