
Adapting, Implementing, and Maintaining a Group Cognitive Behavioral Therapy Program at an Inpatient Addiction Treatment Facility


Received 2 September 2023

Accepted for publication 25 October 2023

Published 6 November 2023 Volume 2023:14 Pages 119—130

DOI https://doi.org/10.2147/SAR.S433523


Review by Single anonymous peer review


Editor who approved publication: Dr Rajendra D. Badgaiyan



Jessica L Bourdon,1 Sidney Judson,1 Gabriella Caporaso,1 Monica F Wright,2 Taylor Fields,1 Nehal P Vadhan,3 Jon Morgenstern4

1Center for Addiction Science, Wellbridge Addiction Treatment and Research, Calverton, NY, USA; 2Zucker Hillside Hospital, Northwell Health, Glen Oaks, NY, USA; 3Institute of Behavioral Science, Feinstein Institutes for Medical Research, Department of Psychiatry, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA; 4Department of Psychiatry, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA

Correspondence: Jessica L Bourdon, Center for Addiction Science, Wellbridge Addiction Treatment and Research, 525 Jan Way, Calverton, NY, 11933, USA, Email [email protected]

Background: Quality training is an oft-cited barrier to effective implementation and ongoing delivery of high-quality evidence-based practice (EBP) across fields. This is especially true in the addiction field, but there is little cited evidence for optimal methods to improve EBP in inpatient addiction facilities with minimal resources.
Objective: The current paper focuses on evaluating the state of our facility’s group CBT manual and clinical training on the manual in a “realistic” (ie, non-RCT, non-grant-funded) inpatient addiction treatment setting.
Methods: Five full-time clinicians volunteered to take part in the study (women = 60%; mean age = 36.20 years). The study used a mix of semi-structured interviews and surveys designed to measure seven outcomes (barriers, feasibility, useability, appropriateness, acceptability, burden, trialability).
Results: Three themes emerged from the data that impacted the group CBT manual: training, timing, and functionality. Addressing these themes allowed for a new, optimal manual and training procedure to be put into place.
Conclusion: The current study highlights that under-resourced inpatient addiction facilities can still methodically utilize implementation approaches to study their EBP, namely CBT. Such an approach will ensure that the highest quality care is being delivered to patients and actively addresses known training barriers that prevent proper EBP delivery.

Plain Language Summary: Quality training is a barrier when implementing evidence-based practice to treat substance use disorders. CBT is a manualized evidence-based practice that is performed at inpatient addiction facilities. An inpatient addiction facility evaluated its CBT manual to identify barriers to implementing EBPs. Identified barriers included training, timing, and functionality.

Keywords: cognitive behavioral therapy, addiction, evidence-based practice, inpatient, substance use disorders

Introduction

Dissemination and implementation science is an active, ongoing process that allows evidence-based practice (EBP) to be delivered at the highest quality possible.1,2 Research confirms, however, that there has been low dissemination of science-based knowledge, practice, and capacity into community-based practice settings across fields and content areas.3–6 It is suggested that one of the greatest contributors to this low level of uptake is the lack of proper training and education among healthcare professionals to administer EBP within healthcare facilities.5,7–12 Therefore, implementation alone is simply not enough to ensure that new practices are being adequately adopted into routine care.13 Consequently, dynamic, multi-pronged, multi-stakeholder implementation strategies are necessary to enhance adaptation and sustainment of a certain innovation.14

There is a need for sustained high-quality EBP in the field of addiction science.9,15,16 This field faces the aforementioned barriers of training and education of healthcare professionals to a heightened degree.9,11,17,18 Further, little information exists about how well various facilities manage the uptake and dissemination involved in implementation science, resulting in many known barriers but little documented information about the processes for identifying and overcoming said barriers.4 The current study is additionally necessary due to the high morbidity and mortality among those with substance use disorders (SUDs).15 Developing simple, implementation-focused methods that can be used by small inpatient facilities will help increase utilization of EBP, increase fidelity tracking of EBP, and result in overall better care for patients.4 Cognitive behavioral therapy (CBT) is a “gold standard” EBP for SUDs across various practice settings (eg, inpatient, outpatient).15,19–22 To date, most published studies examining the implementation of CBT into SUD practice settings are controlled experiments (eg, randomized controlled trials (RCT)) funded by grants and are rarely the result of the internal efforts of a community-based inpatient rehabilitation center.23–28 This is not to say that such efforts do not occur, but they do not appear to be widely published. Therefore, there is little extant literature on the implementation and fidelity of EBP into community-based facilities, specifically inpatient addiction treatment facilities.16

Study Purpose and Aims

Most extant literature involves structured research with time and resources dedicated to ideal training, fidelity monitoring, and EBP maintenance.16,21,23–27 The purpose of the current study was to examine the process, barriers, and realities of implementing a commonly used EBP, CBT for SUD, by outlining the steps taken to optimize an inpatient addiction center’s group CBT manual and training. In doing so, this study highlights some of the external factors that naturally occur at an inpatient facility, affect implementation, and are thus far rarely discussed in the literature. This study utilized both the Practical, Robust Implementation and Sustainability Model (PRISM) and the Proctor model to measure outcomes of implementation as well as identify factors needed for successful implementation.29,30 This is the first part of a multi-stage project; later stages will focus on fidelity, the experiential aspect of delivering the group from clinical and patient stakeholder perspectives, and patient knowledge gained from the group. The sole aim of this paper was to evaluate and optimize the group CBT manual, make updates to it, and improve the training process for the clinical team.

Methods

Participants

A total of 5 full-time clinicians agreed to take part in the study (women = 60%; mean age = 36.20 years). All clinicians had worked at the facility for less than 2 years (6–11 months = 60%; < 6 months = 20%; 1–2 years = 20%), most had at least 3 years’ CBT training (3 years’ training = 60%; 1 year = 20%; 2 years = 20%), and all had varied practical experience with CBT (2 years’ experience = 40%; 1 year = 20%; 3 years = 20%; 5–7 years = 20%). No other demographic information was collected to protect participants’ identities, both because this was not human subject research and because the project was conducted in a small work environment.

Approach

A mixed-methods approach was utilized whereby participants engaged in a semi-structured interview that lasted approximately 50 minutes, followed by a series of quantitative questions that lasted approximately 5 minutes. Qualitative data were weighted more heavily than quantitative data, following an embedded design.31 The same person interviewed each participant separately and attempted to ask the qualitative questions in the same order; however, as participants took their answers in their own directions, the order of questions changed accordingly to improve the flow of the interview. All data were collected via REDCap.32,33 The interviewer typed qualitative responses into the REDCap survey on a laptop as they occurred, and participants used the same laptop and survey to select their responses to the quantitative questions. The interviews lasted approximately 1 hour in total and were voluntary. All clinicians understood that the information they shared would remain confidential, was for quality improvement purposes, would be summarized and shared with leadership in order to improve the CBT manual and group structure, and might be published. In total, there were 11 qualitative questions and 31 quantitative questions. Due to the internal education and quality control nature of the project, it did not constitute human subject research; ethics approval was not required.

Measures

The semi-structured interview was driven by seven outcomes (barriers, feasibility, useability, appropriateness, acceptability, burden, trialability) per Proctor et al and the PRISM model.29,30 See Table 1 for the full list of measures and questions. Burden and useability fell under a “development” umbrella, while feasibility, barriers, acceptability, and trialability fell under an “implementation” umbrella.

Table 1 Measures and Related Outcomes Included in This Project

Barriers are processes that hinder the implementation and use of the CBT manual and groups, and they were part of the qualitative section of the interview. These questions were created specifically for the project and included open-ended questions such as “Discuss three things working the best/worst within the CBT manual.”

Feasibility is how easily the manual is used and made up the majority of the qualitative section of the interview. These questions were also created for the project because an existing survey that was specific enough to this topic could not be found. Questions included asking clinicians how they used the manual, how much of the manual they delivered, and how helpful they found the manual.

Useability is how easily the CBT manual can be used, from access to design, and was part of the quantitative survey. Questions came from Shoemaker et al’s 2014 Patient Education Materials Assessment Tool (PEMAT) model.34 All questions were on a 2-point agree/disagree scale and covered content, word choice and style, use of numbers, organization, layout and design, and use of visual aids.

Appropriateness is whether the manual fits at the facility and was part of the quantitative survey, and acceptability is the ease with which clinicians incorporate the manual into their workflow. Questions for both outcomes came from Weiner et al’s 2017 model.35 There were five questions on a 5-point scale for each outcome.

Burden is the complexity of creating, maintaining, and monitoring the manual while trialability is how easily the manual can be tested and altered without disruption. Each outcome had 2 questions on a 5-point scale from Cook et al’s 2015 PCIS model.36

Analyses

Given the small number of responses, Microsoft Excel was used for simplicity to analyze both the quantitative survey responses and the qualitative data.37–39 For the qualitative data, a content analysis approach was utilized whereby themes emerged from the data.40 Analytical steps involved open coding, categorizing codes, and conceptually ordering codes into themes; multiple codes per response were allowed. Study staff engaged in constant comparison whereby codes and categories were checked throughout analysis to ensure appropriateness, and staff met frequently to ensure consensus on codes and categories. Quantitative data, including the demographic breakdown, were analyzed separately in Excel via averages, sum scores, and histograms, depending on the type of question.
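The quantitative roll-up described above is simple enough to reproduce outside of a spreadsheet. The sketch below is purely illustrative: the outcome names mirror the paper, but the rating values are hypothetical, not the study's actual responses or Excel workbook.

```python
# Illustrative roll-up of 5-point Likert items into per-outcome averages
# and sum scores, mirroring the Excel analysis described in the text.
# The rating values are hypothetical, not the study's data.
from statistics import mean

# Each outcome maps to one clinician rating per participant (1-5 scale).
responses = {
    "appropriateness": [4, 4, 5, 4, 3],
    "acceptability": [4, 3, 4, 4, 3],
}

for outcome, scores in responses.items():
    avg = mean(scores)    # average rating across clinicians
    total = sum(scores)   # sum score for the outcome
    print(f"{outcome}: mean = {avg:.2f}, sum = {total}")
```

A histogram of the same responses (the third method mentioned above) is just a tally of how many clinicians chose each scale point.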

Results

Barriers

Across the 4 questions related to barriers, 15 categories were identified. Overall, reported strengths of the clinical program and CBT manual include: 1) the fact that evidence-based practice is a focus of the facility, 2) the manual is structured in a logical manner and includes the most essential topics for long-term recovery, 3) there is a well-rounded nature to those who work at the facility as well as groups that are offered, and 4) the manual does a good job identifying the most important take-home points from each topic.

Reported weaknesses of the clinical program and CBT manual include: 1) widespread program inconsistency, 2) a schedule that largely prevents clinicians from conducting CBT with their own patients, 3) large group sizes, and 4) variable management of CBT groups across clinicians. Further, there is a lack of clarity around the concept of CBT as well as around specific topics, and take-home messages are often missing from the manual. Finally, clinical under-staffing affects the delivery of CBT and contributes to many of the aforementioned weaknesses.

Feasibility

Thirty-one categories were identified from these 7 questions. Overall, the feasibility (ie, useability) of the manual varies depending on the group. All participants reported creating their own activities and worksheets for at least one group, either because that resource did not exist in the manual or because the one provided was insufficient. Even with such variability across groups, participants noted that it is usually easy to re-interpret the manual for patients, and they usually find their own voice during this process. The manual provides a good foundation with useful rationale. Participants reported that their use of the manual varies by topic, but three of the five reported 75% or higher (self-reported) compliance. Thus, a major take-home point was to add more resources to the manual and reorganize certain sessions (see below).

Many participants were not trained on how to use the manual, and those who were trained were simply told to review the manual on their own with no follow-up. Only one participant reported having enough time at the facility to observe others’ CBT groups and reflect upon their own approach to CBT; this was a self-directed endeavor.

Useability

Useability was broken into several sections: total, content, word, numerical, organizational, layout, and visual. Total useability was in the moderate range, indicating more areas of strength than weakness. Specifically, sections where participants disagreed the most were, in order: visual, layout, organization, content, word, and numerical, with the latter having 100% agreement that it met relevant criteria. These responses indicate that the manual could improve specifically in the following (ordered by need): 1) using visual aids that reinforce content with clear titles; 2) using cues such as arrows, boxes, bullets, bold font, etc. to draw attention to key points; 3) breaking material into short sections or “chunks”, presenting information in a logical manner, utilizing headers, and providing a summary at the end of sections; 4) having a clearly stated purpose; and 5) using common, everyday language.

Appropriateness

Appropriateness had a single section, and participants rated it as a 4 out of 5, ie, “agreeable.” Areas that were rated the lowest and where the manual can improve include how “right” the CBT manual appears as well as how aligned it is with the goals of the facility.

Acceptability

Acceptability had a single section, and participants rated it as a 3.7 out of 5, ie, neutral-bordering-agreeable. Areas that were rated the lowest and where the manual can improve include participants approving of, liking, and not objecting to the manual as well as its general appeal.

Burden

Burden had a single section, and participants rated it as 3.75 out of 5, ie, neutral-bordering-agreeable. Areas that were rated the lowest and where the manual can improve include the ease of using the manual.

Trialability

Trialability had a single section, and participants rated it as a 4 out of 5, ie, “agreeable.” There were no clear areas for improvement in this section.

Manual Changes

The group CBT manual initially included 12 sessions (two of which were review sessions), and there was no standardized training for new clinical staff. Staff were told to read the existing CBT manual and to observe colleagues’ groups as often as possible. Clinical supervision was not designed to discuss CBT or train clinicians in it unless specific problems arose for which clinicians needed guidance. Following the results of this project, the manual still includes 12 sessions, but the review sessions were omitted and two sessions (Anger Management and Assertiveness and Communication Skills Training) were each broken into two parts. The manual was changed to address the specific needs noted above. An initial training and periodic refresher supervision sessions were also added to the clinical schedule. See Tables 2 and 3 for more details.

Table 2 Changes Made to the CBT Manual per Results in Aim 1 (Evaluation)

Table 3 Overview of the Updated Group CBT Manual

Discussion

The goal of the current study was to evaluate the group CBT manual for an inpatient addiction treatment facility and make appropriate updates to the manual. This will allow for future investigations of fidelity, clinicians’ delivery experiences, and patients’ knowledge gained from the group, all stages of this project that are planned for the near future. Overall, three themes emerged from the data that impacted the group CBT manual (training, timing, and functionality), and each will be discussed in turn. Finally, several changes have already been made to the CBT manual and clinical training which can be implemented by similar facilities if appropriate to improve the quality of their own clinical programming. This was all done in a real-world setting using established implementation models as guidelines with no internal or external funding and thus may be able to provide insight to other similar facilities.29,30

First, every participant mentioned the lack of training that they received on the CBT manual. This finding is supported by extant literature that yields similar results within facilities that treat SUDs21,41,42 as well as by gaps in wider mental health and psychiatric training and education.7–11 It should be noted that some participants had more prior experience with group CBT manuals than others before leading groups, either at the current or past facilities, which led to a deeper understanding of CBT principles. Nevertheless, the lack of training caused difficulties adapting to the needs and requirements of the manual. Following findings from this internal quality review, all new clinicians are now given two weeks before being assigned patients or skills-based groups. These first two weeks are spent observing fellow clinicians, reading group manuals, and attending an onboarding meeting designed to discuss group CBT. CBT trainings were also added to clinical supervision time, first at high intensity to carefully review the core concepts of CBT and the group CBT manual changes, and more recently on a semi-monthly basis to maintain skill and knowledge among the clinical staff.

Timing is a multi-faceted theme. Participants commented on the lack of time to prepare for CBT sessions as well as not having enough time to thoroughly review certain topics within the allotted group time. This finding is not limited to this study, as lack of time has been recorded as one of the greatest barriers to implementation in prior research outside of SUD treatment.43–45 However, this theme is not often published within the inpatient addiction field. More research needs to be done on how to organize clinicians’ time in residential facilities, taking both the clinical stakeholder and organizational stakeholder levels into consideration. In-house time management training is one option that may provide skills to clinicians at the individual level, but the wider work culture must also be examined systematically. Another option is to be mindful of clinician burnout, as not having enough time to complete necessary tasks for patient care (ie, to prepare for CBT or deliver the manual in full) is related to burnout and to a lack of control over one’s work culture.45–48 While the current study did not examine burnout, it is clear how the theme of timing relates to the ever-growing problem of burnout in clinical settings, whether inpatient, addiction-focused, or otherwise. Much of this research has been conducted among addiction medicine staff,48–51 and further work should examine clinical staff in the unique inpatient addiction setting as well.

The third repeated comment centered on the functionality of the manual itself. It is clear that the manual must be improved to be user-friendly for clinicians. For example, participants reported commonly having to interpret the meaning of topics themselves with little guidance, having to help patients understand worksheets/activities, and having to make their own handouts for CBT groups. The updated manual addressed this by adding a standard format to each session (pre-group needs, clinician flow and script, activity, homework), and the “flow/script” itself was made more consistent (check-in, background, key concepts and skills to be learned, activity and/or discussion, wrap-up and homework assignment). The activity for each session was redesigned to foster discussion and is now the guiding force behind the session. Additionally, an optional (but encouraged) PowerPoint was created with 2–4 slides per session. The slides are minimal, containing key words or definitions to guide clinicians and help them stay focused during the group.

Limitations

This study had several limitations. First, the facility in which the study took place is small, with limited staff and high turnover. This meant that the potential sample size was limited to a few clinicians, and few clinicians were employed throughout the entire study. It also meant relying on an unpaid undergraduate summer intern for help with the qualitative analyses instead of more highly trained professionals. In addition, while this project had no internal or external funding, it was led by a non-clinician researcher. While the processes employed in this study could be carried out by any management-level staff, clinical or otherwise, we recognize that few other inpatient facilities employ a full-time researcher. Second, this study lacks the controllability that other studies of EBP implementation have obtained from formats such as randomized controlled trials. However, we argue that the real-world nature of this study is important and highlights the struggle to adapt implementation principles to clinical settings. Relatedly, despite the amount of structure that was infused into this process, there were informal aspects of optimizing the CBT manual and training that were difficult to capture, such as a review of the manual by a co-author. Third, despite immense standardization efforts within group CBT, there are natural occurrences within groups that cause patients to stray from the topic, and the current CBT manual offers little on behavior management or general group management techniques.

Future Directions

Studies such as the current one are important, as they further the conversation about how to carry out EBP and implementation science in a “real world” environment, without grant funding, a large team of researchers, or other factors that are common in peer-reviewed studies of EBP and implementation. The immediate next step with this line of inquiry is to do a full fidelity study on the updated CBT manual. We also plan to study the clinicians’ experience delivering the CBT manual as well as query patient experience before entering an “ongoing maintenance” phase of this project. Other similar facilities can utilize these findings and experiences, and they can hopefully expand upon this work.

Conclusions

The purpose of this study was to highlight an internal quality control effort that aimed to improve the group CBT training and manual of a small, inpatient addiction facility that treats SUDs. The process and changes made in this study may hold value to other similar facilities, as typically such studies are done with well-funded research grants. We wanted to highlight that it does not take significant resources to address the quality of group CBT. Being able to monitor the quality of EBP is important, and all facilities should do what is within their capacity. For our setting, the primary barriers were training, timing, and functionality, which we did our best to address following this study.

Statement on Informed Consent

Based on the Health and Human Services’ Common Rule (45 CFR 46), this study does not constitute human subject research. Ethics approval was not required, and informed consent was not acquired. This project was done with the intent of internal education and quality control. The data collected were minimal and non-identifiable. Further, the purpose of publishing the results of this study is to focus on the process such that other facilities may benefit; we are not significantly contributing to the field of addiction science from a prospective, hypothesis-driven perspective.

Acknowledgments

We would like to acknowledge all who worked tirelessly to make Wellbridge a reality and who continue to realize its purpose on a daily basis. This includes our clinical, nursing, admissions, administrative, food service, housekeeping, and maintenance staff.

Author Contributions

All authors contributed to data analysis, drafting, or revising of the article, have agreed on the journal to which the article will be submitted, gave final approval for the final version to be published, and agree to be accountable for all aspects of the work.

Funding

This project did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Disclosure

Dr Nehal Vadhan reports personal fees from Cutback Coach, outside the submitted work. The authors report no other conflicts of interest in this work.

References

1. Brownson RC, Proctor EK, Luke DA, et al. Building capacity for dissemination and implementation research: one university’s experience. Implement Sci. 2017;12(1):104. doi:10.1186/s13012-017-0634-4

2. Lomas J. Diffusion, dissemination, and implementation: who should do what? Ann N Y Acad Sci. 1993;703:226–237. doi:10.1111/j.1749-6632.1993.tb26351.x

3. Bourdon JL, Davies RA, Long EC. Four actionable bottlenecks and potential solutions to translating psychiatric genetics research: an expert review. Public Health Genomics. 2020;23(5–6):171–183. doi:10.1159/000510832

4. Vinson CA, Stamatakis KA, Kerner JF. Dissemination and Implementation Research in Community and Public Health Settings. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; 2017:355–370.

5. McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments. A review of current efforts. Am Psychol. 2010;65(2):73–84. PMID: 20141263. doi:10.1037/a0018121

6. Patterson DA, McKiernan PM. Organizational and clinical implications of integrating an alcohol screening and brief intervention within non-substance abuse serving agencies. J Evidence Based Social Work. 2010;7(4):332–347. doi:10.1080/15433710903256880

7. Alatawi M, Aljuhani E, Alsufiany F, et al. Barriers of implementing evidence-based practice in nursing profession: a literature review. Am J Nursing Sci. 2020;9(1):35–42. doi:10.11648/j.ajns.20200901.16

8. Besterman AD, Moreno-De-Luca D, Nurnberger JI. 21st-century genetics in psychiatric residency training: how do we get there? JAMA Psychiatry. 2019;76(3):231–232. doi:10.1001/jamapsychiatry.2018.3872

9. Glasner-Edwards S, Rawson R. Evidence-based practices in addiction treatment: review and recommendations for public policy. Health Policy. 2010;97(2–3):93–104. doi:10.1016/j.healthpol.2010.05.013

10. Patterson Silver Wolf DA. Factors influencing the implementation of a brief alcohol screening and educational intervention in social settings not specializing in addiction services. Soc Work Health Care. 2015;54(4):345–364. doi:10.1080/00981389.2015.1005270

11. Patterson Silver Wolf DA, Ramsey AT, van den Berk-Clark C. Implementing outside the box: community-based social service provider experiences with using an alcohol screening and intervention. J Social Service Res. 2015;41(2):233–245. doi:10.1080/01488376.2014.980963

12. Patterson Silver Wolf DA, van den Berk-Clark C, Williams SL, Dulmus CN. Are therapists likely to use a new empirically supported treatment if required? J Social Work. 2018;18(6):666–678. doi:10.1177/1468017317743138

13. Landsverk J, Brown CH, Chamberlain P, et al. Design and analysis in dissemination and implementation research. In Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; 2018. 201–228.

14. Kirchner JE, Waltz TJ, Powell BJ, Smith JL, Proctor EK. Implementation Strategies. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; 2018:245–267.

15. US Department of Health and Human Services (HHS). Facing Addiction in America: The Surgeon General’s Report on Alcohol, Drugs, and Health. Washington, DC: US Department of Health and Human Services (HHS); 2016.

16. Louie E, Barrett EL, Baillie A, Haber P, Morley K. Implementation of evidence-based practice for alcohol and substance use disorders: protocol for systematic review. Syst Rev. 2020;9(1):1–6. doi:10.1186/s13643-020-1285-0

17. Arya S, Delic M, Ruiz BII, et al. Closing the gap between training needs and training provision in addiction medicine. BJ Psych International. 2020;17(2):37–39. doi:10.1192/bji.2019.27

18. Manuel JK, Hagedorn HJ, Finney JW. Implementing evidence-based psychosocial treatment in specialty substance use disorder care. Psych Addictive Beh. 2011;25(2):225–237. doi:10.1037/a0022398

19. de Andrade E, Quinn C, Allan J, Hides L. The effectiveness of residential treatment services for individuals with substance use disorders: a systematic review. Drug Alcohol Depend. 2019;201:227–235. doi:10.1016/j.drugalcdep.2019.03.031

20. McHugh RK, Hearon BA, Otto MW. Cognitive behavioral therapy for substance use disorders. Psychiatr Clin North Am. 2010;33(3):511–525. doi:10.1016/j.psc.2010.04.012

21. Morgenstern J, Blanchard KA, Morgan TJ, Labouvie E, Hayaki J. Testing the effectiveness of cognitive-behavioral treatment for substance abuse in a community setting: within treatment and posttreatment findings. J Consulting and Clinical Psych. 2001;69(6):1007–1017. doi:10.1037//0022-006x.69.6.1007

22. Morgenstern J, Morgan TJ, McCrady BS, Keller DS, Carroll KM. Manual-guided cognitive-behavioral therapy training: a promising method for disseminating empirically supported substance abuse treatments to the practice community. Psychol Addictive Beh. 2001;15(2):83–88. doi:10.1037/0893-164X.15.2.83

23. Amodeo M, Lundgren L, Fernanda Beltrame C, Chassler D, Cohen A, d’Ippolito M. Facilitating factors in implementing four evidence-based practices: reports from addiction treatment staff. Subst Use Misuse. 2013;48(8):600–611. doi:10.3109/10826084.2013.794838

24. Carroll KM. Lost in translation? Moving contingency management and cognitive behavioral therapy into clinical practice. Ann N Y Acad Sci. 2014;1327(1):94–111. doi:10.1111/nyas.12501

25. McGovern MP, Fox TS, Xie H, Drake RE. A survey of clinical practices and readiness to adopt evidence-based practices: dissemination research in an addiction treatment system. J Substance Abuse Treatment. 2004;26(4):305–312. doi:10.1016/j.jsat.2004.03.003

26. Morgenstern J, McKay JR. Rethinking the paradigms that inform behavioral treatment research for substance use disorders. Addiction. 2007;102(9):1377–1389. doi:10.1111/j.1360-0443.2007.01882.x

27. Morgenstern J, Kuerbis A, Shao S, et al. An efficacy trial of adaptive interventions for alcohol use disorder. J Substance Abuse Treatment. 2021;123:108264. doi:10.1016/j.jsat.2020.108264

28. Hepner KA, Hunter SB, Paddock SM, Zhou AJ, Watkins KE. Training addiction counselors to implement CBT for depression. Admin Policy Mental Health. 2011;38(4):313–323. doi:10.1007/s10488-011-0359-7

29. Feldstein AC, Glasgow RE. A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Jt Comm J Qual Patient Saf. 2008;34(4):228–243. doi:10.1016/s1553-7250(08)34030-6

30. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Admin Policy Mental Health. 2011;38(2):65–76. doi:10.1007/s10488-010-0319-7

31. Creswell JW. Choosing A mixed methods design; 2006. Available from: https://www.sagepub.com/sites/default/files/upm-binaries/10982_Chapter_4.pdf. Accessed October 27, 2023.

32. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap) – a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42:377–381. doi:10.1016/j.jbi.2008.08.010

33. Harris PA, Taylor R, Minor BL, et al. The REDCap consortium: building an international community of software partners. J Biomed Inform. 2019;95:103208. doi:10.1016/j.jbi.2019.103208

34. Shoemaker SJ, Wolf MS, Brach C. Development of the Patient Education Materials Assessment Tool (PEMAT): a new measure of understandability and actionability for print and audiovisual patient information. Patient Educ Couns. 2014;96(3):395–403. doi:10.1016/j.pec.2014.05.027

35. Weiner BJ, Lewis CC, Stanick C, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12:1–12. doi:10.1186/s13012-017-0635-3

36. Cook JM, Thompson R, Schnurr PP. Perceived characteristics of intervention scale: development and psychometric properties. Assessment. 2015;22(6):704–714. doi:10.1177/1073191114561254

37. Microsoft Corporation. Microsoft Excel; 2018. Available from: https://office.microsoft.com/excel. Accessed October 27, 2023.

38. Bree RT, Gallagher G. Using Microsoft excel to code and thematically analyse qualitative data: a simple, cost-effective approach. All Ireland J Higher Educ. 2016;8(2).

39. Meyer DZ, Avery LM. Excel as a qualitative data analysis tool. Field Methods. 2009;21(1):91–112. doi:10.1177/1525822X08323985

40. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–1288. doi:10.1177/1049732305276687

41. D’Ippolito M, Lundgren L, Amodeo M, Beltrame C, Lim L, Chassler D. Addiction treatment staff perceptions of training as a facilitator or barrier to implementing evidence-based practices: a national qualitative research study. Substance Abuse. 2015;36(1):42–50. doi:10.1080/08897077.2013.849646

42. Olmstead TA, Abraham AJ, Martino S, Roman PM. Counselor training in several evidence-based psychosocial addiction treatments in private US substance abuse treatment centers. Drug Alcohol Depend. 2012;120(1–3):149–154. doi:10.1016/j.drugalcdep.2011.07.017

43. Cohen J, Mannarino AP. Disseminating and implementing trauma-focused CBT in community settings. Trauma Violence Abuse. 2008;9(4):214–226. doi:10.1177/1524838008324336

44. Hazell CM, Strauss C, Cavanagh K, Hayward M. Barriers to disseminating brief CBT for voices from a lived experience and clinician perspective. PLoS One. 2017;12(6):e0178715. doi:10.1371/journal.pone.0178715

45. Lewis CC, Simons AD. A pilot study disseminating cognitive behavioral therapy for depression: therapist factors and perceptions of barriers to implementation. Admin Policy Mental Health. 2011;38:324–334. doi:10.1007/s10488-011-0348-x

46. Newell JM, MacNeil GA. Professional burnout, vicarious trauma, secondary traumatic stress, and compassion fatigue: a review of theoretical terms, risk factors, and preventative methods for clinicians and researchers. Best Pract Ment Health. 2010;6(2):57–68.

47. McPeek-Hinz E, Boazak M, Sexton JB, et al. Clinician burnout associated with sex, clinician type, work culture, and use of electronic health records. JAMA Network Open. 2021;4(4):e215686. doi:10.1001/jamanetworkopen.2021.5686

48. Patel RS, Bachu R, Adikey A, Malik M, Shah M. Factors related to physician burnout and its consequences: a review. Behav Sci. 2018;8(11):98. doi:10.3390/bs8110098

49. Bredenberg E, Tietbohl C, Dafoe A, Thurman L, Calcaterra S. Identifying factors that contribute to burnout and resilience among hospital-based addiction medicine providers: a qualitative study. J Substance Abuse Treatment. 2023;144:1–7.

50. Horner G, Dadonna J, Burke DJ, Cullinane J, Skeer M, Wurcel AG. ‘You’re kind of at war with yourself as a nurse:’ Perspectives of inpatient nurses on treating people who present with a comorbid opioid use disorder. PLoS One. 2019;14(10):e0224335. doi:10.1371/journal.pone.0224335

51. Shah MK, Gandrakota N, Cimiotti JP, Ghose N, Moore M, Ali MK. Prevalence and factors associated with nurse burnout in the US. JAMA Network Open. 2021;4(2):e2036469. doi:10.1001/jamanetworkopen.2020.36469

Creative Commons License © 2023 The Author(s). This work is published and licensed by Dove Medical Press Limited. The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution - Non Commercial (unported, v3.0) License. By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed. For permission for commercial use of this work, please see paragraphs 4.2 and 5 of our Terms.