Advances in Medical Education and Practice, Volume 8

SAFE QI – a framework to overcome the challenges of implementing a quality improvement curriculum into a residency program

Authors Cheung L

Received 3 September 2017

Accepted for publication 31 October 2017

Published 1 December 2017 Volume 2017:8 Pages 779–784

DOI https://doi.org/10.2147/AMEP.S150718

Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 2

Editor who approved publication: Dr Md Anwarul Azim Majumder

Lawrence Cheung

Department of Medicine, University of Alberta, Edmonton, AB, Canada

Abstract: Quality improvement (QI) is an essential component of medical practice. Medical students and residents must learn the skills to conduct clinical QI during their educational programs, and medical educators must create and implement a QI curriculum that empowers their students to develop these skills and knowledge. However, developing and implementing a QI curriculum may be challenging for some residency programs. Programs with a relatively short duration of training – for example, only 2 years – may be unable to implement an extensive QI curriculum without siphoning away time from other learning objectives. Small programs may lack faculty with the expertise to teach this topic. Programs with only a few residents may find it difficult to evaluate the success of a QI curriculum using robust statistical analysis. These residency programs need a QI curriculum with several features: it must be deliverable in a short period of time; there must be tools to assess the residents’ attainment of the curricular objectives; it must give the residents practical skills to develop their own QI initiatives; and there must be simple methods to evaluate its effectiveness. To address these goals, we developed the SAFE QI framework (a QI curriculum which is short, assessed, functional, and effective) for the 2-year subspecialty respirology residency program at the University of Alberta, which admits 2–3 entrants per year for a total of 4–6 residents. This framework helps medical educators overcome the challenges of implementing a QI curriculum in their educational programs. This article illustrates how the framework was used to develop and deliver an institution’s own QI curriculum.

Keywords: curriculum development, quality improvement, residency education, clinical practice audit

 

Background

Postgraduate medical residents must learn skills to implement quality improvement (QI) in their medical practices.1,2 To accomplish this, medical educators must develop and implement clinical QI curricula in their residency training programs.

For example, some residency programs have used an intensive 12-week process to teach QI.3 Other residency programs require their residents to study high-risk cases and document their assessment in a database which is reviewed by faculty.4 Mann et al found large variability in QI teaching amongst the pediatric residency programs they surveyed.5 QI curricula have been successfully integrated into many other residency programs including psychiatry,6 geriatrics,7 obstetrics-gynecology,8 radiology,9 and surgery.10

However, despite the success of many residency programs, a curriculum in QI may still be difficult to implement for certain residency programs.3,11 Some residency programs have a relatively short duration of training, for example, only 2 years. Their residents might have limited time to complete a QI project. Also, educators may want to keep the QI curriculum relatively short so that it does not consume too much of the total training period and siphon away time to learn other curricular objectives of the residency program. Some residency programs are small and have only a few residents. Compared to larger residency programs, smaller residency programs may have fewer faculty members. Thus, some residency programs may lack the faculty with the expertise to teach or demonstrate this topic.3 Quantitative curriculum evaluation, with robust statistical analysis, is often difficult to implement with small residency programs.12 They are often unable to accrue large numbers of subjects (i.e., residents) to study. Rather, curriculum evaluation often tends to focus on qualitative data collection. These data may include feedback from the learners about the quality of instruction or opinions from the faculty on the success of the QI initiatives.

To resolve these problems, some residency programs might combine their instruction with that of other residency programs to pool resources.13 However, this can lead to other problems.14 For example, the instruction may be too generic and less relevant to the residents’ needs. Alternatively, the residency program may rely solely on didactic teaching without giving the residents experiential learning to develop their own QI projects.

In 2011, our 2-year respirology residency program at the University of Alberta sought to overcome these challenges in implementing its QI curriculum. Two to three residents enter this program each year after completing 3 years of core internal medicine, for a total of 4–6 subspecialty residents at any given time. We reflected on the challenges facing a small residency program and sought to turn our limitations into advantages.15 To do this, we first conducted a needs analysis, surveying faculty as well as past and current residents to determine what we required to teach a QI curriculum. Based on this, we determined that our QI curriculum needed four essential features.

First, the curriculum needed to be short and focused, so that residents could attain its learning objectives during their 2-year residency while leaving sufficient time to teach the rest of the specialty’s training objectives. Second, we needed to assess whether the residents had achieved the curricular goals, which meant developing assessment tools for this purpose. Third, the curriculum needed to teach residents functional skills in QI. In other words, we deemed it insufficient to provide didactic instruction alone; our residents needed experiential learning.7,16,17 We aimed to provide this hands-on learning with feedback so that our residents could acquire the skills to complete QI projects in their future practice with the least waste of time and resources. Fourth, the curriculum delivery needed to be effective, achieving results that were valued and intended.18,19 Furthermore, our small residency program needed simple measures of effectiveness. In the case of QI, we sought to determine whether the residents’ QI initiatives led to improved clinical care, whether their work was valued by faculty, and whether the residents retained the skills to complete QI projects after residency.

We incorporated these four essential features into a pilot framework which we named SAFE QI. This acronym describes our QI curriculum, which is short, assessed, functional, and effective. The remainder of this article illustrates how other medical educators can use the SAFE QI framework to overcome the challenges of implementing their own QI curriculum in their training programs.

SAFE QI curriculum

The four elements of the SAFE QI curriculum are summarized in Table 1.

Table 1 Components of the SAFE QI curriculum

Abbreviation: SAFE QI, quality improvement curriculum which is short, assessed, functional, and effective.

Short

SAFE QI is a QI curriculum that can be delivered in a relatively condensed period. To do this, educators need to provide focused teaching. We chose to focus our teaching on the principles of conducting a clinical practice audit.20–23 This would give our residents the knowledge and skills to initiate QI in their own practices. QI can also examine large systems of practice, such as within a hospital or entire health care region. However, these types of projects require considerable time and labor to plan and implement. It would be difficult for our residents to complete such large-scale projects within a 2-year residency.

We delivered the curriculum in two stages. The first stage occurred near the beginning of residency. To maximize learning, we provided the initial instruction in a 1-hour buzz group, where the larger group was split into pairs to discuss the topic and then re-formed for a whole-group discussion.24 Residents participated in this small-group session and learned the 14 steps of performing a clinical practice audit, such as choosing a topic, choosing a criterion standard, and writing out the primary and secondary audit questions.20 We delineated each of these steps using post-discharge care of patients admitted with an exacerbation of COPD as an example. We then used brainstorming to apply these steps to other clinical activities.

In the second stage of curriculum delivery, we gave the residents tools to complete their own clinical practice audit. We created a checklist of items that residents needed to include in their audit, as well as a template for writing their report. The report template described the format we expected, including the background, methods, subjects, data collection, results, and discussion. The template also specified the contents which should be included, such as the primary and secondary audit questions, the criterion standard against which current clinical practice was being compared, and the data collection methods. Residents were given up to 15 months to develop and complete their project, and to submit their report. They were allotted 3 hours of scheduled time per month to work on their projects. Every 3 months, the residency program director met with the residents to review their progress. During these meetings, the residents gave updates on the number of steps they had completed in their project and described any barriers they encountered toward completing these steps.

Assessed

During curriculum development, educators need to plan how to assess whether the residents have attained the curricular objectives. According to Miller’s pyramid,25 this can include examinations that test the residents’ knowledge and understanding, or observations of their performance. Alternatively, work-based assessment can involve analyzing the residents’ outcomes, process of care, or volume.26

Rather than simply assessing their recall of our instruction, we chose to assess the residents’ skills in performing an actual clinical practice audit. To achieve this, we created a scoring sheet and rating scale to assess the quality of those audits (Table 2). The scoring sheet assessed whether the audits contained all the requisite elements including the use of an appropriate criterion standard, a clear primary question, clear secondary questions, collection of appropriate data, identification of any gaps between actual and ideal practice, and feasible recommendations to ameliorate those gaps. Each item would then receive a rating. For example, lack of an appropriate criterion standard would receive a rating of “Failed to Meet Expectations” for that item. Providing an appropriate criterion standard would receive a rating of “Met Expectations”. Cogently explaining the choice of that criterion standard, and providing sound reasons for not choosing other criterion standards, would receive a rating of “Exceeded Expectations”. The scoring sheet also contained a global rating score to assess the overall quality of the audit. After grading the projects, we provided the residents with feedback.

Table 2 Quality improvement project assessment form

Functional

Educators should design and deliver a curriculum that provides functional knowledge and skills. That is, the residents should be able to take what they have learned and apply it in clinical practice. Unfortunately, this does not always occur.27–29 In the case of QI, residents should attain the competencies to complete their own project, with little wasted time and resources. To accomplish this curricular goal, we did two things.

First, we demonstrated how to identify areas in need of QI. Our department compiled a list of clinical service areas that might benefit from a clinical practice audit. These clinical areas included outpatient clinics, inpatient wards, and diagnostic and therapeutic services such as the bronchoscopy unit, pulmonary function laboratory, and sleep study facility. From this list, we brainstormed ideas with the residents to identify specific topics that could be suitable for an audit. For example, topics included complication rates after bronchoscopy, accuracy of pulmonary function test interpretations compared to established guidelines, and appropriate use of pulmonary rehabilitation and medications on discharge after admission to hospital for an exacerbation of COPD. By having faculty and residents work together in this process, we promoted the hidden curriculum30 that QI was valued. We then asked the residents to choose a topic that interested them or to develop their own. To efficiently use their time and resources, we encouraged them to do retrospective analysis of data that had already been collected.

Second, while residents were conducting their clinical practice audits, we identified and corrected a few practical barriers hindering their progress. For example, our residents found it difficult to access clinical data during office hours while they were on other busy clinical rotations. Therefore, we gave them after-hours access to the data and protected time off their rotations during the day. Some hospital-based topics needed ethics approval before the projects could proceed. We discussed the reasons for this31,32 with our residents and helped them obtain approval from our research ethics board when needed. We initially met with each resident individually every 3 months to review his or her progress, but later found it helpful to add group meetings with all residents every 6 months to discuss common barriers hindering their projects.

Effective

Before curriculum implementation, medical educators should plan how they will determine whether the curriculum delivery is effective.33 In other words, the curriculum should successfully produce the desired outcomes, and these outcomes should serve a valued purpose. Some residency programs need simple measures of effectiveness because they lack sufficient numbers of residents to perform robust quantitative statistical analysis. For our QI curriculum, we chose to evaluate three outcomes.

First, we wanted to ensure that the residents’ projects were useful. We sent the residents’ clinical audit reports to the faculty engaged in the clinical activity being audited and then asked for the faculty’s feedback. Specifically, we asked whether the residents’ reports were clear and whether their recommendations for improvement were feasible. Second, we examined whether the residents retained confidence in their QI knowledge after finishing their residency. Third, we wanted to analyze whether the residents’ clinical practice audits improved clinical care. We retained all clinical practice audit reports. After allowing 4 years for the recommendations to be implemented, we asked new incoming residents to repeat the audits to determine whether improvement had occurred.

Curriculum evaluation

After we introduced our QI curriculum, we evaluated its effectiveness over a 4-year period. During this time, 11 residents completed our residency program. All of them completed a QI project, for a total of seven projects; some residents worked in groups while others worked alone.

Using a five-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neither agree nor disagree, 4 = agree, 5 = strongly agree), we surveyed the nine faculty members who worked in the clinical areas audited by the residents. Seven of them responded to the survey. When asked if the residents’ audit reports were clear, three responded “agree” and four responded “strongly agree”. When asked if the residents’ recommendations for improvement were feasible, one responded “agree” and six responded “strongly agree”.

One year after they finished residency, we surveyed the nine residents who had completed the program in the first 3 years of our 4-year evaluation period. We received eight responses from these former residents. When asked if they felt confident implementing a QI initiative in their own practice, five respondents answered “agree” and three answered “strongly agree”. In addition, four of these former residents indicated that they had already started QI initiatives in their own clinical practices.

It has been 4 years since we implemented our SAFE QI curriculum. We are now asking incoming residents to repeat all the earlier projects done by previous residents. Over the next 4 years, we plan to analyze the data to determine the extent to which the residents’ QI projects have led to improvements in clinical care.

Conclusion

Clinical QI is a vital component of medicine. Medical educators can use our SAFE QI framework to implement a QI curriculum in their own educational programs. We successfully implemented this framework in our small subspecialty residency program. In doing so, we efficiently satisfied our national accreditation requirements for residency training and taught our residents a useful skill for their own clinical practices. Our residents can now use their basic knowledge to collaborate with other health disciplines and participate in system-wide institutional QI initiatives. We are repeating all our QI projects to determine if clinical care has improved. Other small medical or surgical residency programs can also use this framework to achieve these goals.

Further work can be done to generalize the SAFE QI curriculum to larger residency programs with greater numbers of residents and faculty. In so doing, residents in programs with a longer duration of training can experience the entire spectrum of QI from start to finish.

Acknowledgment

The author would like to thank the Respirology Residency Program Committee at the University of Alberta for approving the curriculum.

Disclosure

The author reports no conflicts of interest in this work.

References

1.

Nasca TJ, Philibert I, Brigham T, Flynn TC. The Next GME Accreditation System – rationale and benefits. N Engl J Med. 2012;366(11):1051–1056.

2.

Frank JR, Snell L, Sherbino J. CanMEDS 2015 Physician Competency Framework. 2015. http://canmeds.royalcollege.ca/uploads/en/framework/CanMEDS%202015%20Framework_EN_Reduced.pdf. Accessed January 10, 2017.

3.

Chase SM, Miller WL, Shaw E, Looney A, Crabtree BF. Meeting the challenge of practice quality improvement: a study of seven family medicine residency training practices. Acad Med. 2011;86(12):1583–1589.

4.

Strayer RJ, Shy BD, Shearer PL. A novel program to improve patient safety by integrating peer review into the emergency medicine residency curriculum. J Emerg Med. 2014;47(6):696–701.e2.

5.

Mann KJ, Craig MS, Moses JM. Quality improvement educational practices in pediatric residency programs: survey of pediatric program directors. Acad Pediatr. 2014;14(1):23–28.

6.

Reardon CL, Ogrinc G, Walaszek A. A didactic and experiential quality improvement curriculum for psychiatry residents. J Grad Med Educ. 2011;3(4):562–565.

7.

Callahan KE, Rogers MT, Lovato JF, Fernandez HM. A longitudinal, experiential quality improvement curriculum meeting ACGME competencies for geriatrics fellows: lessons learned. Gerontol Geriatr Educ. 2013;34(4):372–392.

8.

Sepulveda D, Varaklis K. Implementing a multifaceted quality-improvement curriculum in an obstetrics-gynecology resident continuity-clinic setting: a 4-year experience. J Grad Med Educ. 2012;4(2):237–241.

9.

Krajewski K, Siewert B, Yam S, Kressel HY, Kruskal JB. A quality assurance elective for radiology residents. Acad Radiol. 2007;14(2):239–245.

10.

Sellers MM, Hanson K, Schuller M, et al. Development and participant assessment of a practical quality improvement educational initiative for surgical residents. J Am Coll Surg. 2013;216(6):1207–1213.e1.

11.

Kelz RR, Sellers MM, Reinke CE, Medbery RL, Morris J, Ko C. Quality in-training initiative – a solution to the need for education in quality improvement: results from a survey of program directors. J Am Coll Surg. 2013;217(6):1126–1132.e1–5.

12.

Reed DA. Nimble approaches to curriculum evaluation in graduate medical education. J Grad Med Educ. 2011;3(2):264–266.

13.

Tasman A, Riba M. Strategic issues for the successful merger of residency training programs. Hosp Community Psychiatry. 1993;44(10):981–985.

14.

Grossman RI, Berne R. Commentary: less is better: lessons from the New York University-Mount Sinai merger. Acad Med. 2010;85(12):1817–1818.

15.

Morgan A, Barden M. A Beautiful Constraint: How to Transform Your Limitations Into Advantages, and Why It’s Everyone’s Business. Hoboken, NJ: John Wiley & Sons, Inc; 2015.

16.

Karimi R, Arendt CS, Cawley P, Buhler AV, Elbarbry F, Roberts SC. Learning bridge: curricular integration of didactic and experiential education. Am J Pharm Educ. 2010;74(3):48.

17.

Hall Barber K, Schultz K, Scott A, Pollock E, Kotecha J, Martin D. Teaching quality improvement in graduate medical education: an experiential and team-based approach to the acquisition of quality improvement competencies. Acad Med. 2015;90(10):1363–1367.

18.

Kim CS, Lukela MP, Parekh VI, et al. Teaching internal medicine residents quality improvement and patient safety: a lean thinking approach. Am J Med Qual. 2010;25(3):211–217.

19.

Zafar MA, Diers T, Schauer DP, Warm EJ. Connecting resident education to patient outcomes: the evolution of a quality improvement curriculum in an internal medicine residency. Acad Med. 2014;89(10):1341–1347.

20.

Godwin M. Conducting a clinical practice audit. Fourteen steps to better patient care. Can Fam Physician. 2001;47:2331–2333.

21.

Donald AG. Why medical audit? Practitioner. 1980;224(1350):1278–1279.

22.

Baker R. Audit and standards in new general practice. BMJ. 1991;303(6793):32–34.

23.

Seddon M, Buchanan J. Quality improvement in New Zealand healthcare. Part 3: achieving effective care through clinical audit. N Z Med J. 2006;119(1239):U2108.

24.

Jones RW. Learning and teaching in small groups: characteristics, benefits, problems and approaches. Anaesth Intensive Care. 2007;35(4):587–592.

25.

Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63–67.

26.

Norcini JJ. Work based assessment. BMJ. 2003;326(7392):753–755.

27.

Goldacre MJ, Lambert TW, Svirko E. Foundation doctors’ views on whether their medical school prepared them well for work: UK graduates of 2008 and 2009. Postgrad Med J. 2014;90(1060):63–68.

28.

Goldacre MJ, Lambert T, Evans J, Turner G. Preregistration house officers’ views on whether their experience at medical school prepared them well for their jobs: national questionnaire survey. BMJ. 2003;326(7397):1011.

29.

Cave J, Goldacre M, Lambert T, Woolf K, Jones A, Dacre J. Newly qualified doctors’ views about whether their medical school had trained them well: questionnaire surveys. BMC Med Educ. 2007;7:38.

30.

Hafferty FW. Beyond curriculum reform: confronting medicine’s hidden curriculum. Acad Med. 1998;73(4):403–407.

31.

Nerenz DR, Stoltz PK, Jordan J. Quality improvement and the need for IRB review. Qual Manag Health Care. 2003;12(3):159–170.

32.

Weiserbs KF, Lyutic L, Weinberg J. Should quality improvement projects require IRB approval? Acad Med. 2009;84(2):153.

33.

Ried LD. A model for curricular quality assessment and improvement. Am J Pharm Educ. 2011;75(10):196.

