
The Button Project: Using Chart Rounds for Teaching Clinical Ophthalmology with an Electronic Medical Record


Received 2 November 2019

Accepted for publication 24 November 2019

Published 13 December 2019. Volume 2019:10, Pages 1039–1044

DOI https://doi.org/10.2147/AMEP.S237076




Jullia A Rosdahl, Wenlan Zhang, Varsha Manjunath

Department of Ophthalmology, Duke Eye Center, Duke University, Durham, NC, USA

Correspondence: Jullia A Rosdahl
Department of Ophthalmology, Duke Eye Center, Duke University, DUMC 3802, 2351 Erwin Road, Durham, NC 27710, USA
Email [email protected]

Objective: Chart rounds have traditionally been used effectively for clinical teaching in ophthalmology. The introduction of the electronic health record has altered practice patterns, and some evidence suggests that it interferes with resident education. The purpose of this study was to investigate the use of chart rounds in our ophthalmology department and to see whether a simple intervention, an “education button”, could positively impact clinical teaching.
Design: A cross-sectional survey of current practice, followed by pre- and post-intervention surveys to assess the utility of an intervention, an “education button”.
Setting: Department of Ophthalmology at Duke University, a tertiary care academic ophthalmology practice, in Durham, North Carolina.
Participants: Ophthalmology trainees (37), including residents and clinical fellows, and clinical faculty (50) in the department were surveyed anonymously. The overall response rate for the cross-sectional survey was 83% (72/87). The overall response rate for the educational study was 53% for the first time-point and 59% for the second time-point.
Results: In the cross-sectional survey, trainees found chart rounds useful and wanted to increase their frequency. Most faculty reported holding chart rounds regularly, although not having enough time was the most common barrier (cited by 76% of faculty). In the pre- and post-assessment of the “education button” (response rates 53% and 59% at the two time-points), the overall impression was positive and the button was easy to use, but its implementation did not appear to change the quality or frequency of chart rounds, nor the coverage of learning objectives.
Conclusion: While the “education button” could help with communication between the faculty and trainees during a busy clinic session to identify cases for discussion, it did not address the most common barrier identified by faculty members, that of not having enough time.

Keywords: electronic medical record, electronic health record, resident education, fellow education, chart rounds, digital


Introduction

Clinical teaching in the age of the electronic medical record (EMR) can be challenging, particularly in the outpatient setting. Some studies have suggested that electronic charting programs containing counseling prompts and templates can aid in resident education1,2 and may be useful in assessing resident competence.3 However, several studies suggest that electronic health records may interfere with educational time, with negative effects on the teaching-learning interaction and clinical reasoning.4–6

The use of chart rounds (discussion of patient cases by the clinical teaching team while using the chart in the outpatient setting) has been shown to be a useful method for resident education.7,8 Several of the Accreditation Council for Graduate Medical Education (ACGME) competencies9 can be addressed with systematic and consistent use of chart rounding, including patient care, medical knowledge, practice-based learning and improvement, and systems-based practice.

Prior to the widespread adoption of electronic charting, there were multiple ways for a member of the clinical teaching team to temporarily flag a chart for discussion during chart rounds, such as using a sticky note or setting the chart aside; these methods are no longer possible with an electronic chart. We sought to fill this educational gap with a way to flag a chart electronically that was easy to use, required no additional software upgrades, and would leave no non-clinical residue in the electronic health record. First, we used a cross-sectional survey to investigate the use of chart rounds in our department among both faculty members and trainees. Then, we developed a simple tool to flag charts for discussion during chart rounds, the “education button”, and performed pre- and post-intervention surveys to assess its utility in our department.

Materials and Methods

This cross-sectional survey and prospective educational study were reviewed by the Duke University Institutional Review Board and found to be exempt. The work was completed at the Duke University Department of Ophthalmology during June–July 2016 (cross-sectional survey) and October 2016–March 2017 (prospective study).

Cross-Sectional Survey

Anonymous paper surveys were distributed to all ophthalmology residents and clinical fellows (n=37), and clinical faculty (n=50), of our ophthalmology department during faculty meetings and resident didactic sessions. A cover sheet describing the survey with implied consent language was distributed with each survey. The survey for the residents and fellows included 7 questions: year of training; questions pertaining to the definition of chart rounds and their current utility, frequency, and type; and an open-ended question soliciting ideas about how the electronic health record could be used to enhance teaching. The survey for the clinical faculty was similar and included 6 questions about chart rounds. The survey questions were created by the authors with feedback and revision from non-Duke ophthalmologists; the survey was reviewed by the Duke Social Sciences Research Institute consultation service. The data were collected and analyzed using Microsoft Excel. Descriptive analyses of the multiple-choice questions are presented with qualitative comparisons due to the small sample size.
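
To make the analysis concrete, the sketch below shows the kind of descriptive tabulation involved. The analysis was done in Microsoft Excel; the Python code, the sample responses, and the answer options are assumptions for illustration only.

```python
from collections import Counter

# Hypothetical multiple-choice responses to one survey item
# (the options shown are illustrative, not the actual instrument).
responses = [
    "very useful", "useful", "useful", "neutral", "very useful",
    "useful", "not useful", "useful", "very useful", "neutral",
]

# Tabulate each option's count and percentage.
counts = Counter(responses)
n = len(responses)
for option, count in counts.most_common():
    print(f"{option}: {count} ({count / n:.0%})")

# Response rate is simply surveys returned over surveys distributed,
# e.g. the cross-sectional survey: 72 returned of 87 distributed.
print(f"Response rate: {72 / 87:.0%}")  # -> 83%
```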

Intervention Development

The responses to the open-ended question in the cross-sectional survey suggested a number of potential interventions using the EMR to enhance teaching, which were explored in depth with our in-house Information Technology analyst. The “education button” was identified as the most feasible to implement because no technological changes were needed: an existing feature in the Epic software could be used. Departmental leadership supported the use of the button, and the clinical staff was engaged in the development of the project (in particular, in the choice of the button color, to minimize disruption to services that were already using colored buttons for clinic flow). The intervention itself (the “education button”) consisted of using the yellow-colored dot, adjacent to each patient name, on the main schedule page in Epic. Any member of the care team, such as the resident or attending seeing the patient, could select the yellow dot for any patient that they wanted to discuss with the team. Then, during chart rounds, the team could easily see which patients to focus on during the teaching time. Prior to instituting the “education button”, the faculty, trainees, and clinical staff were informed of the project via email and during in-person faculty and staff meetings.
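
Conceptually, the button is a per-patient boolean flag on the day's schedule that any care-team member can set and the whole team can filter on. The sketch below models that idea; it is not Epic code, and all names in it are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ScheduleEntry:
    """One patient on the daily clinic schedule (illustrative model only)."""
    patient_name: str
    flagged_for_teaching: bool = False  # the "education button" (yellow dot)

def flag_for_chart_rounds(entry: ScheduleEntry) -> None:
    """Any care-team member can mark a case for discussion."""
    entry.flagged_for_teaching = True

def cases_for_chart_rounds(schedule: list[ScheduleEntry]) -> list[ScheduleEntry]:
    """At the end of clinic, the team reviews only the flagged cases."""
    return [entry for entry in schedule if entry.flagged_for_teaching]

# Usage: flag an interesting case during clinic, then pull it at chart rounds.
schedule = [ScheduleEntry("Patient A"), ScheduleEntry("Patient B"), ScheduleEntry("Patient C")]
flag_for_chart_rounds(schedule[1])
print([entry.patient_name for entry in cases_for_chart_rounds(schedule)])  # ['Patient B']
```

Because the flag is a transient schedule marker rather than a chart entry, it leaves no non-clinical residue in the medical record itself, one of the design requirements noted above.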

Prospective Educational Study, “The Button Project”

A pre- and post-intervention assessment was done for the educational intervention. The “education button” was instituted department-wide in December 2016. Anonymous paper surveys were distributed to all ophthalmology residents, clinical fellows, and clinical faculty of our ophthalmology department at two time-points: 1) just before the institution of the “education button” and 2) two months after instituting the button, such that the trainees and faculty would have used the button for one complete training rotation (8 weeks). A cover sheet describing the survey with implied consent language was distributed with each survey. Surveys at both time-points included one question on the overall impression of the educational experience compared to the previous rotation and several questions on the frequency, quality, and specifications of the chart rounds during the previous rotation, with 5-point Likert scales, as well as a series of questions specific to the sub-specialty learning objectives. The questions were developed by the authors with feedback and revision from non-Duke ophthalmologists, and the survey was reviewed by the Duke Social Sciences Research Institute consultation service. The learning objectives were based on the documented residency rotation goals and objectives. The surveys at the second time-point also included questions on the use of the education button, with 5-point Likert scales, and an open-ended question soliciting comments on the survey or the button. The data were collected and analyzed using Microsoft Excel; descriptive analyses of the multiple-choice questions are presented with qualitative comparisons due to the small sample size. For the learning objectives, the ratings for each specialty were averaged, with standard deviations calculated.
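
For the learning-objective items, the per-specialty summary reduces to a mean and standard deviation of the Likert ratings in each sub-specialty. A minimal sketch, again in Python rather than the Excel actually used, with invented ratings purely for illustration:

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert ratings grouped by sub-specialty
# (the values are invented; the real data came from the surveys).
ratings = {
    "glaucoma": [4, 5, 3, 4],
    "retina": [3, 4, 4, 5, 4],
    "cornea": [4, 4, 3],
}

# Average the ratings for each specialty and report the spread.
for specialty, scores in ratings.items():
    print(f"{specialty}: mean {mean(scores):.2f}, SD {stdev(scores):.2f}")
```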

Results

Cross-Sectional Survey on Chart Rounds

The overall response rate for the cross-sectional survey was 83% (72/87), with surveys returned from 86% (43 of 50) of clinical faculty and 78% (29 of 37) of trainees.

Most respondents defined chart rounds as reviewing select cases at the end of the clinic (Table 1). Most trainees reported that chart rounds had been either useful or very useful for their clinical education. Most faculty members reported that they hold chart rounds at least sometimes, although 12% reported that they do not do them at all. The most common barrier identified was not having enough time (cited by 76% of faculty). Respondents reported that residents, fellows, and medical students commonly participated, as well as ophthalmic technicians, ophthalmic technician students, office staff, and visitors. Most commonly, the chart rounds focused on rare cases, management challenges, or diagnostic dilemmas. Most trainees reported that they would like to increase the frequency of chart rounds with clinical faculty. Most respondents reported that, if there were an easy way to identify cases relevant for chart rounds through the EMR, they would be likely or very likely to use it.

Table 1 Cross-Sectional Survey on Chart Rounds

The Button Project

The overall response rate for the educational study was 53% at the first time-point (faculty: 57%, 21 of 37; trainees: 48%, 13 of 27) and 59% at the second time-point (faculty: 68%, 21 of 31; trainees: 52%, 17 of 33). Among the faculty respondents, all sub-specialties were represented, with the highest participation from retina and glaucoma faculty. Fellow respondents came from glaucoma, retina, cornea, and pediatric ophthalmology; resident respondents were on the glaucoma, retina, oculoplastics, comprehensive, and pediatric ophthalmology rotations.

The overall impression of the educational experience was positive for both faculty and trainees, both before and after implementation of the education button (Table 2). However, while the faculty's impression became slightly more positive after implementation, the trainees' impression became less positive. There was a mismatch between faculty and trainee estimates of the frequency of chart rounds: about 60% of faculty, but less than 25% of trainees, reported chart rounds in most or all clinic sessions, and this did not appear to change with the implementation of the button. Implementation of the button did not appear to improve or harm the quality of the chart rounds, with most respondents reporting positive or very positive impressions of chart rounds both before and after implementation.

Table 2 Before and After Implementation of the Education Button

With regard to the learning objectives, faculty were asked to rate their coverage of the learning objectives, and implementation of the button did not change their ratings. The trainees were asked to rate their comfort in handling the sub-specialty learning objectives; over the time period assessed, the fellows' ratings appeared to improve whereas the residents' did not.

The use of the education button was also assessed. Most respondents did not find the button difficult to use, with 77% of faculty and 92% of trainees reporting that it was neutral or easy to use; however, 3 clinical faculty reported that it was “very hard” to use. With regard to efficacy, most respondents reported that the effect of the button on their chart rounds was neutral or positive; notably, no one gave a negative or very negative response. When asked whether they would continue to use the button, responses trended positive, with half of the faculty reporting that they were likely or very likely to continue using it, although most respondents overall were neutral.

The open-ended responses revealed some of the barriers to implementing the button. For example, the button was mistaken for a clinic-flow marker rather than an educational one (“lots of techs change back the button color”), and there was confusion about the button itself (“where is the button?”). Also, a number of respondents noted that they were already doing chart rounds in a way that worked. For example: “The button is a reminder tool. It is easy but I have done teaching as we go, so button labels are irrelevant.”

Discussion

While others have proposed ways to teach and evaluate clinical documentation skills using the EMR in the context of the ACGME core educational competencies,10 this study investigated how the EMR could be used to facilitate the familiar and well-established practice of chart rounding to aid in resident and fellow teaching and learning in ophthalmology.

We found that ophthalmology trainees perceived chart rounds positively and that most faculty used chart rounding in their clinical teaching. We also found substantial agreement between faculty and trainees with regard to what chart rounding is, how it is incorporated into the clinic, and the types of cases that are used. In the initial survey, there was strong interest from trainees in increasing the frequency of chart rounds, and interest from both trainees and faculty in an electronic chart-based tool for identifying cases for discussion. However, the most common barrier reported by faculty was not having enough time for chart rounds, which the “education button” did not address.

After the “education button” was implemented department-wide, we did not find an effect on the frequency of chart rounds, the quality of chart rounds, or learning. There was a slight decrease in the trainees' positive perception of the educational experience; this was likely due to the barriers of limited time and competing duties rather than the use of the “education button”. Our results may reflect an ineffectual intervention, but several confounders may also contribute to our findings. The time period studied was quite short (approximately 8 weeks, the duration of one rotation), although a single rotation was chosen so that rotation-specific learning objectives could be assessed; a longer assessment period may have allowed wider adoption of the education button. Despite publicizing the education button (via email and at faculty meetings) and incorporating clinical staff in its development, penetration of its use was incomplete, as evidenced both by the open-ended question in the survey and by personal communications to the authors. One of the survey responses summarizes the effect of the education button as implemented: “It increased the amount of times I chart rounded but I did not always use the buttons – we would just run the list and pick out interesting cases.” Table 3 summarizes several recommendations to consider for future educational interventions.

Table 3 Recommendations for Future Clinical Educational Interventions

Limitations of this work include the response rate, the limited adoption of the “education button” by the clinical team as noted above, and the fact that the intervention did not directly address the biggest barrier identified (lack of time). The response rates for the cross-sectional survey and the button surveys differ because of limitations in survey distribution (primarily travel during the winter months for the button surveys), whereas the cross-sectional survey was distributed during June, when most faculty and trainees were available. While the intervention did not address the lack of time, it required no changes to the EMR, was felt to be clinically feasible, and had support from key faculty and leadership, making it a reasonable place to start. Future interventions will likely need to address the time barrier to be successful.

Conclusions

We found that chart rounding is still commonly used in our department for clinical teaching; trainees find it useful in their clinical education and would like more chart rounds with their faculty. The simple intervention, an “education button” to identify cases for discussion, was noted to be easy to use, but it did not change the quality or frequency of chart rounds, nor did it improve coverage of learning objectives over the 8-week study period. The chief barrier faculty noted to doing chart rounds more often was not having enough time, which was not addressed by this intervention.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Sutter MB, Magee SR. Resident education through electronic medical record counselling prompts. Med Educ. 2010;44(5):513–514. doi:10.1111/med.2010.44.issue-5

2. Shirazian S, Wang R, Moledina D, et al. A pilot trial of a computerized renal template note to improve resident knowledge and documentation of kidney disease. Appl Clin Inform. 2013;4(4):528–540. doi:10.4338/ACI-2013-07-RA-0048

3. Weller J. What can electronic anesthesia records tell us about resident competence? Anesthesiology. 2016;124(2):259–260.

4. Schenarts PJ, Schenarts KD. Educational impact of the electronic medical record. J Surg Educ. 2012;69(1):105–112. doi:10.1016/j.jsurg.2011.10.008

5. Bloom MV, Huntington MK. Faculty, resident, and clinic staff’s evaluation of the effects of EHR implementation. Fam Med. 2010;42(8):562–566.

6. Ko CY, Escarce JJ, Baker L, Sharp J, Guarino C. Predictors of surgery resident satisfaction with teaching by attendings: a national survey. Ann Surg. 2005;241(2):373–380. doi:10.1097/01.sla.0000150257.04889.70

7. Fogarty GB, Hornby C, Ferguson HM, Peters LJ. Quality assurance in a radiation oncology unit: the chart round experience. Australas Radiol. 2001;45(2):189–194. doi:10.1046/j.1440-1673.2001.00901.x

8. Snydman L, Chandler D, Rencic J, Sung YC. Peer observation and feedback of resident teaching. Clin Teach. 2013;10(1):9–14. doi:10.1111/j.1743-498X.2012.00591.x

9. Lee AG. The new competencies and their impact on resident training in ophthalmology. Surv Ophthalmol. 2003;48(6):651–662. doi:10.1016/j.survophthal.2003.08.009

10. Stephens MB, Gimbel RW, Pangaro L. Commentary: the RIME/EMR scheme: an educational approach to clinical documentation in electronic medical records. Acad Med. 2011;86(1):11–14. doi:10.1097/ACM.0b013e3181ff7271
