
Treating and teaching: using publicly available data to explore the relationship between student and patient evaluations of teaching hospitals

Authors: Gauer JL, van den Hoogenhof S, Rosenberg ME

Received 26 October 2018

Accepted for publication 11 April 2019

Published 10 June 2019. Volume 2019:10, Pages 405–409

DOI https://doi.org/10.2147/AMEP.S192304


Editor who approved publication: Dr Md Anwarul Azim Majumder



Jacqueline L Gauer, Suzanne van den Hoogenhof, Mark E Rosenberg

Office of Medical Education, University of Minnesota Medical School, Minneapolis, MN 55455, USA

Introduction: Treating patients and teaching medical students are parallel activities that occur at teaching hospitals. However, the relationship between these activities is poorly understood. There have been multiple calls for assessing the quality of medical education by examining publicly available clinical data, but there is minimal evidence linking these variables.
Method: In this proof-of-principle study, the authors examined publicly available Hospital Consumer Assessment of Healthcare Providers and Systems (H-CAHPS®) data collected during calendar year 2013 to explore the relationship between patient evaluations of their hospital experience and medical student evaluations of the educational experience at that site.
Results: Pearson product–moment correlation coefficients were calculated for multiple variables. Patient ratings of doctor–patient communication correlated with student ratings of organization (R=0.882, p=0.048), educational value (R=0.882, p=0.048), teaching (R=0.963, p=0.008), and evaluation and feedback (R=0.920, p=0.027).
Conclusion: These findings provide preliminary evidence for a relationship between patient experiences and the quality of education at that site. Further studies linking clinical and education outcomes are needed to explore this relationship in more depth. The contributions of specific hospital locations, providers, or clerkships need to be evaluated. Studies examining these relationships have the potential to improve both patient care and medical education.

Keywords: student satisfaction, patient satisfaction, clinical teaching, quality, doctor-patient communication, integrating educational practice data

 

Introduction

One of the ultimate measures of a successful medical school is the quality of care provided by the physicians trained at that school. However, current medical school evaluation efforts typically do not include patient data, due in large part to the logistical difficulty of collecting and analyzing patient-related data.1–6 One possible solution to some of the logistical concerns is to use publicly available data. Governments, non-profits, and other organizations collect a large amount of patient data during the course of their health care-related activities and have begun to publish these data online. Medical education researchers can leverage these data to begin to evaluate medical education using measures such as practice specialty and location, quality of the care experience, and patient health outcomes.

For this study, we used publicly available data to explore the relationship between the medical education program and patient care. Treating patients and educating medical students are parallel activities that occur at teaching hospitals. However, the relationship between these activities is poorly understood. Often, tensions exist between the patient care and educational missions of clinics and hospitals. We hypothesized that a relationship would exist between patient evaluations of their hospital experience and medical student evaluations of the educational experience at that site.

Method

Institutional approval

Ethical approval for this research was granted by the Institutional Review Board at the University of Minnesota.

Participants

The participants for this study were students at the University of Minnesota Medical School who completed course evaluation forms for required clerkships at the five Twin Cities teaching hospitals under examination during calendar year 2013 (CY13). The five hospitals were University of Minnesota Medical Center, Hennepin County Medical Center, Abbott Northwestern Hospital, Regions Hospital, and Methodist Hospital. A total of 1,512 evaluations were included in the study.

Sources of data

Student evaluations

Clerkship evaluation data were collected from records held by the Office of Medical Education in the Medical School at the University of Minnesota. Clerkship evaluations were administered at the conclusion of each clerkship. The range of clerkship evaluation response rates at the sites under study in CY13 was 97–100%. Items included in the course evaluations were: mistreatment, learning environment, organization of course, educational value, teaching, evaluation and feedback, experience as a health care team member, and balance of supervision and autonomy. The specific evaluation items that students were asked to respond to are listed in Table 1. Students rated each item on a Likert-type scale ranging from 1 (Below Expectations) to 3 (Exceeded Expectations). For this analysis, we calculated the average scores for each item for each site for all students submitting evaluations related to that site, regardless of which core clerkship they were evaluating.

Table 1 Clerkship evaluation items. Items included in end-of-rotation clerkship evaluations completed by medical students at the University of Minnesota in Calendar Year 2013
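To make the aggregation step concrete, the sketch below shows one way the per-site mean ratings could be computed. This is a minimal illustration only: the file name and column names are hypothetical, and the study's evaluation records are held by the Office of Medical Education rather than being public.

```python
# Minimal sketch of the site-level aggregation described above.
# File and column names are hypothetical, not those used in the study.
import pandas as pd

# One row per completed end-of-rotation evaluation, with a "site" column
# and one column per evaluation item rated 1-3.
evals = pd.read_csv("clerkship_evaluations_cy13.csv")

# Illustrative subset of the items listed in Table 1.
item_cols = ["organization", "educational_value", "teaching",
             "evaluation_and_feedback"]

# Pool all evaluations submitted for a site, regardless of clerkship,
# and take the mean rating on each item (as reported in Table 2).
site_means = evals.groupby("site")[item_cols].mean()
site_counts = evals.groupby("site").size().rename("n_evaluations")

print(site_means.join(site_counts))
```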

Patient experience of care ratings

Patient experience of care ratings of the hospitals under examination were derived from information publicly available on the Minnesota Community Measurement (MNCM)-hosted website MNHealthScores.org. Similar data are available on the Centers for Medicare & Medicaid Services (CMS) Hospital Compare website (www.medicare.gov/hospitalcompare). These data were collected via the Hospital Consumer Assessment of Healthcare Providers and Systems (H-CAHPS) Survey, a standardized survey instrument for measuring patients’ perceptions of their hospital experience. The survey was developed by CMS, along with the Agency for Healthcare Research and Quality (AHRQ), and includes ratings for ten care domains. Patients receive a survey for every inpatient stay and are asked to rate the frequency of events during their care (never, sometimes, usually, always). The data used in this study were collected from hospital admission dates January 1–December 31, 2013 (CY13). Further information on the data collection methods used by MNCM is available in the downloadable 2015 Health Care Quality Report published publicly on their website.7 The ten care domains included in the version of the H-CAHPS Survey used for these data were: cleanliness of hospital environment, doctor–patient communication, medication explanations provided, nurse–patient communication, pain controlled, overall hospital rating, quietness of hospital environment, receiving help when wanted, recommend hospital, and recovery information provided. The result for each care domain is a composite of several survey questions and is reported by CMS as the “top box” average, which is the average proportion of respondents selecting the most positive response across the questions in that domain.
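For readers unfamiliar with the metric, the following sketch illustrates how a top-box composite could be computed from raw response counts. The response counts are invented for illustration and are not drawn from the H-CAHPS data used in this study.

```python
# Illustrative top-box calculation for one composite care domain.
# Hypothetical response counts for the questions that make up the domain;
# a domain's top-box average is the proportion of respondents giving the
# most positive answer ("Always"), averaged across those questions.
domain_questions = [
    {"Always": 412, "Usually": 96, "Sometimes": 31, "Never": 11},
    {"Always": 389, "Usually": 118, "Sometimes": 35, "Never": 8},
]

top_box_rates = [q["Always"] / sum(q.values()) for q in domain_questions]
domain_top_box = sum(top_box_rates) / len(top_box_rates)
print(f"Domain top-box average: {domain_top_box:.1%}")
```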

Analyses

Using SPSS Statistics v.22 (IBM Corp., Armonk, NY, USA), we calculated Pearson product–moment correlation coefficients for each of the ten care domains on the H-CAHPS Survey with each of the nine items rated in our clerkship evaluations, for the five hospitals included in this study.
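The analysis was run in SPSS; a rough Python equivalent is sketched below for illustration, assuming hypothetical input files containing the per-site top-box averages and the per-site mean clerkship ratings for the five hospitals.

```python
# Rough Python equivalent of the SPSS analysis: Pearson correlations between
# each H-CAHPS care domain and each clerkship evaluation item across the five
# sites. File and column names are hypothetical.
import pandas as pd
from scipy.stats import pearsonr

# Per-site top-box averages for the ten care domains (rows = five hospitals).
hcahps = pd.read_csv("hcahps_top_box_cy13.csv", index_col="site")
# Per-site mean clerkship ratings on each evaluation item (same five hospitals).
student = pd.read_csv("clerkship_site_means_cy13.csv", index_col="site")
student = student.loc[hcahps.index]  # align the sites

rows = []
for domain in hcahps.columns:
    for item in student.columns:
        r, p = pearsonr(hcahps[domain], student[item])
        rows.append({"care_domain": domain, "evaluation_item": item, "r": r, "p": p})

corr = pd.DataFrame(rows)
# With only n=5 sites, significant correlations should be interpreted cautiously.
print(corr[corr["p"] < 0.05].sort_values("p"))
```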

Results

The mean ratings for student evaluations of clerkships for each evaluation component, as well as the count of students who submitted evaluations for that site, can be found in Table 2. Data from the H-CAHPS survey for each care domain for each site can be found in Table 3. Data on individual patients are not available. The majority of correlation coefficients calculated were not significant. However, we did find significant correlations between patient ratings of doctor–patient communication and student ratings of organization of course (R=0.882, p=0.048), educational value (R=0.882, p=0.048), teaching (R=0.963, p=0.008), and evaluation and feedback (R=0.920, p=0.027). We also found a significant correlation between patient ratings of quietness of hospital environment and student ratings of experience as a health care team member (R=0.921, p=0.026).

Table 2 Clerkship evaluations. Number of end-of-rotation clerkship evaluations completed (N), and mean rating on each evaluation item per site, in Calendar Year 2013. Scale is 1 to 3 with 1=Below expectations, 2=Meets expectations, 3=Exceeds expectations, except where indicated in Table 1

Table 3 H-CAHPS Survey ratings of sites. Top-box averages of patient responses on the Hospital Consumer Assessment of Healthcare Providers and Systems (H-CAHPS) Survey in Calendar Year 2013, per site. The top-box average represents the proportion of respondents selecting the highest rating for each item

Discussion

In this proof-of-principle study, we used publicly available data to correlate patient evaluations of a training site with student evaluations of the educational experiences at that site. We found a strong relationship between the patient rating of doctor–patient communication and student ratings of the clerkships at that site, including organization, educational value, teaching, and evaluation and feedback. This study provides associational data; establishing the cause of these associations was beyond the scope of this preliminary study, but it is interesting to speculate that similar skills are required for patient care and medical education, and that these skills underlie many of these associations. For example, effective communication skills are beneficial both for communicating diagnoses and treatment plans to patients8 and for engaging students in the clinical experience.9 Future research could delve into the associations we found to identify their causal roots, and the findings could be used to improve faculty performance in both treating patients and teaching students.

Another area to explore is the role of medical students in the patient experience of care. Previous studies have found that the presence of a medical student at the clinical encounter has minimal effect on patient satisfaction.10,11 Patient satisfaction data are widely and publicly available and, as we have demonstrated, can be linked to education data. Questions about the value of having medical students or other learners at a site, including the number, training level, frequency, and profession of the learners, could potentially be addressed using these data.

Our findings suggest a relationship between patient experiences and the perceived quality of education at that hospital, and have significant implications for the continued evaluation of the quality of medical education. We were able to link patient data from publicly available sources with medical education evaluation data. This approach is consistent with recent calls to use practice data to evaluate the effectiveness of medical education.1–5 Further studies linking clinical and education outcomes are needed to explore these relationships in more depth, with the goal of determining cause and effect. The contributions of specific hospital locations, providers, or clerkships also need to be evaluated, particularly in the context of correlating patient outcomes with specific educational predictors and initiatives.

It is important to acknowledge the limitations of our study. We have demonstrated only associations and correlations, not causation. In addition, the patient experience of care for each site is based on aggregate data over the year of study, and not all patients at the different sites interacted with a medical student. Other aspects of each site may also have contributed to the students’ ratings; these will be important to define in future studies.

Conclusion

In conclusion, we have demonstrated proof-of-principle of linking publicly available patient evaluations of a hospital with student evaluations of the hospital as a clinical teaching site. The demonstrated associations between an important component of patient satisfaction, doctor–patient communication, and student evaluations of the clerkships at that site set the stage for further studies to determine how to improve both patient care and medical education.

Ethical approval

Ethical approval for this research was granted by the Institutional Review Board at the University of Minnesota.

Acknowledgments

The authors wish to thank the Medical Education Outcomes Center (MEOC) at the University of Minnesota for their support and inspiration.

Disclosure

MER is an author and reviewer for UpToDate. The authors report no other conflicts of interest in this work.

References

1. Triola MM, Hawkins RE, Skochelak SE. The time is now: using graduates’ practice data to drive medical education reform. Acad Med. 2018;93(6):826–828. doi:10.1097/ACM.0000000000002176

2. Arora VM. Harnessing the power of big data to improve graduate medical education: big idea or bust? Acad Med. 2018;93(6):833–834. doi:10.1097/ACM.0000000000002209

3. Chahine S, Kulasegaram KM, Wright S, et al. A call to investigate the relationship between education and health outcomes using big data. Acad Med. 2018;93(6):829–832. doi:10.1097/ACM.0000000000002217

4. Weinstein DF. Optimizing GME by measuring its outcomes. N Engl J Med. 2017;377(21):2007–2009. doi:10.1056/NEJMp1711483

5. Ellaway RH, Topps D, Pusic M. Data, big and small: emerging challenges to medical education scholarship. Acad Med. 2019;94(1):31–36. doi:10.1097/ACM.0000000000002465

6. Cook DA, Andriole DA, Durning SJ, Roberts NK, Triola MM. Longitudinal research databases in medical education: facilitating the study of educational outcomes over time and across institutions. Acad Med. 2010;85(8):1340–1346. doi:10.1097/ACM.0b013e3181e5c050

7. MN Community Measurement. 2015 health care quality report. Available from: http://mncm.org/health-care-quality-report/. Accessed July 13, 2016.

8. Bartlett EE, Grayson M, Barker R, Levine DM, Golden A, Libber S. The effects of physician communications skills on patient satisfaction; recall, and adherence. J Chronic Dis. 1984;37(9–10):755–764.

9. Sutkin G, Wagner E, Harris I, Schiffer R. What makes a good clinical teacher in medicine? A review of the literature. Acad Med. 2008;83:452–466. doi:10.1097/ACM.0b013e31816bee61

10. Mol SS, Peelen JH, Kuyvenhoven MM. Patients’ views on student participation in general practice consultations: a comprehensive review. Med Teach. 2011;33(7):e397–400. doi:10.3109/0142159X.2011.599894

11. Vaughn JL, Rickborn LR, Davis JA. Patients’ attitudes toward medical student participation across specialties: a systematic review. Teach Learn Med. 2015;27(3):245–253. doi:10.1080/10401334.2015.1044750
