
Cardiology Fellow Diagnostic Accuracy and Data Interpretation Outcomes: A Review of the Current Literature

Authors Zhitny V, Iftekhar N, Alexander L, Ahsan C

Received 29 June 2020

Accepted for publication 29 September 2020

Published 19 October 2020 Volume 2020:16 Pages 429–435

DOI https://doi.org/10.2147/VHRM.S266510

Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 2

Editor who approved publication: Dr Daniel Duprez



Vladislav Zhitny,1 Noama Iftekhar,2 Luzviminda Alexander,1,3 Chowdhury Ahsan1,3

1School of Medicine, University of Nevada Las Vegas, Las Vegas, NV, USA; 2Stritch School of Medicine, Loyola University Chicago, Maywood, IL, USA; 3Department of Cardiology, University Medical Center of Southern Nevada, Las Vegas, NV, USA

Correspondence: Vladislav Zhitny Email [email protected]

Background: There is a growing need for the cardiovascular workforce: in the United States, one person dies every forty-two seconds from a cardiovascular adverse event. Cardiology fellows, in particular, are in a unique position to mold the new cardiovascular workforce, especially in terms of risk prevention.
Methods: A PRISMA systematic review was conducted, including a comprehensive search of the MEDLINE database (PubMed) from 1927 to 2020, spanning the oldest to the most recent literature on the subject available through PubMed.
Results: Fifty-seven cardiology fellows together interpreted a total of 1719 EKGs, with a correct interpretation rate of approximately 52%. Sixty-four fellows completed a total of 1363 echocardiography interpretations, with an accuracy rate of nearly 75%.
Conclusion: Based on the studies discussed, it is evident that a cardiology fellow, particularly in their early years of training, may be limited due to a lack of experience. With continued EKG and echocardiogram interpretation, as well as other clinical skills practice, fellows can improve their diagnostic accuracy and procedural efficiency.

Keywords: cardiology fellow, cardiology outcomes, data interpretation, patient outcomes

Introduction

As of 2020, there are 1010 first-year cardiology fellowship positions offered by 231 programs in the United States.1 Although the role of physicians in internal medicine has changed and expanded in many ways, many still choose to subspecialize. Cardiology fellowships, traditionally three years in length, follow a three- to four-year internal medicine residency program. Some fellows later pursue further subspecialization after completing the cardiology fellowship, including transplant cardiology, interventional cardiology, and electrophysiology.

There is a growing need for the cardiovascular workforce. In the United States, one person dies every forty-two seconds from a cardiovascular adverse event, making cardiovascular disease the leading cause of death in the United States.2–4 This number is projected to grow with the aging population and rising rates of obesity and other risk factors.3–5 Cardiology fellows, in particular, are in a unique position to mold the new cardiovascular workforce, especially in terms of risk prevention.6,7 A first-year cardiology fellow may not initially be directly involved in complex procedures, but they do serve an important role in streamlining the process from door to bedside.8 This includes assessment of the patient’s history, electrocardiography (EKG), echocardiography, and hemodynamic status. Many programs have a fellow on call during weekends and nights to interpret EKGs, with an attending interpretation following during the weekday. Thus, a cardiology fellow holds an important role in the interpretation and assessment of potentially life-threatening cardiovascular disease.

Despite the importance of the fellow, not only in their future roles but also as part of an efficient healthcare team, there is currently limited literature assessing fellows and the effectiveness of their delivery of care.8 Our paper conducts a systematic review of the literature regarding cardiology fellowships to assess the value of the fellow and the effectiveness of their care.

Methods

A PRISMA systematic review was implemented.9 Our search strategy included a comprehensive search of the MEDLINE database (PubMed) from 1927 to 2020, spanning the oldest to the most recent literature on the subject available through PubMed. A broad keyword search using “Cardiology Fellow” was implemented, yielding 959 articles. All articles considered were published in the English language. The inclusion and exclusion criteria are shown in Figure 1.

Figure 1 Study flow chart.

Articles were selected for inclusion based on relevance following a series of eliminatory screens in accordance with PRISMA methods. The initial selection was based on the relevance of titles in the MEDLINE searches (Figure 1, step 1), and the selected articles were then screened for duplicates (Figure 1, step 2). Following exclusion of duplicates, abstracts were read and screened for relevance (Figure 1, step 3). The final selection step consisted of a full reading of the previously screened publications to further narrow the study sample (Figure 1, step 4). Two researchers conducted the searches and selected the final sample. A consensus was reached on the final sample of individual articles (Table 1).

Table 1 Publications Included in the Systematic Review

After finalizing the article selection, articles were divided by topic of assessment. The two main categories were EKG interpretation accuracy and echocardiography interpretation accuracy. Most articles did not differentiate between years of training, so we elected to combine results from the various years under one categorical heading. Furthermore, certain articles reported percent correct for individual EKGs or echocardiograms assessed; for these, we calculated the mean across all EKGs to represent the overall accuracy (Tables 2 and 3).
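To make the pooling step concrete, the following minimal sketch (in Python) shows one way an interpretation-weighted mean accuracy can be computed across studies; the study counts and accuracies shown are hypothetical placeholders, not the data extracted for Tables 2 and 3.

    # Illustrative sketch of the pooling step: an interpretation-weighted mean
    # accuracy across studies. The counts and accuracies below are hypothetical
    # placeholders, not the extracted data.
    def pooled_accuracy(studies):
        """Overall accuracy weighted by the number of interpretations per study."""
        total_reads = sum(n for n, _ in studies)
        total_correct = sum(n * acc for n, acc in studies)
        return total_correct / total_reads

    # Each entry: (number of interpretations, fraction interpreted correctly)
    ekg_studies = [(1000, 0.55), (500, 0.49), (219, 0.45)]
    print(f"Pooled EKG accuracy: {pooled_accuracy(ekg_studies):.1%}")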

Table 2 Results of EKG Studies Used in Systematic Review

Table 3 Results of Echocardiogram Studies Used in Systematic Review

Results

Of the twelve articles that were finalized, three discussed the EKG interpretation accuracy of fellows and four discussed the accuracy of fellow-read echocardiography. One discussed the murmur assessment performance of cardiology fellows, one examined radiation exposure to patients during fluoroscopy-guided invasive procedures performed by fellows, one focused on cardiac catheterization complications by fellows, one discussed interpretation of radiographic findings, and one evaluated the impact of around-the-clock fellow coverage on door-to-balloon times.

As exhibited in Table 2, there were 57 cardiology fellows who together interpreted a total of 1719 EKGs. Approximately 52% of these EKGs were interpreted correctly. While the Carlson et al study did not report the number of fellows studied, the remaining echocardiography studies included 64 cardiology fellows. As shown in Table 3, fellows completed a total of 1363 echocardiography interpretations, yielding an accuracy rate of nearly 75%.

Discussion

Our paper aimed to characterize the expertise and training of the cardiology fellow through a systematic literature review.

Electrocardiogram (EKG) Interpretation

One area of assessment was electrocardiogram (EKG) interpretation. We chose to examine fellows at all levels of training, including those who elected to complete an electrophysiology subspecialization following the cardiology fellowship. Researchers from the University of Michigan investigated the misdiagnosis rates of atrial fibrillation and atrial flutter.10 Certain characteristics, especially obvious atrial activity in multiple leads, result in the misdiagnosis of atrial fibrillation as flutter. Cardiology fellows, on average, were reported to correctly identify EKGs when the fibrillation waves were smaller. Prominent atrial activity produced comparable scores between cardiology fellows, cardiologists, and internists. Limitations of this data set, however, include the relatively small number of EKGs tested and the low response rate (25%) among cardiology fellows. Thus, due to these limitations, this may not be an accurate assessment of the skill set of a cardiology fellow. However, the high reported rates of accurate diagnosis for the EKG with non-prominent atrial activity (95%) and the atrial flutter EKG (90%), which exceeded the mean scores of internists (63% and 84%, respectively) and were comparable to those of cardiologists (95% and 92%, respectively), suggest that cardiology fellows show high proficiency at EKG reading.

A study conducted by researchers from Switzerland and the Netherlands evaluated the performance of electrophysiology fellows and electrophysiologists against an automated algorithm in EKG interpretation of a ventricular arrhythmia.11 The fellows achieved a level of accuracy similar to that of the electrophysiologists (72% and 73%, respectively). This was lower than the reported accuracy of the automated algorithm (89%). However, the similarity between the electrophysiology fellows and the attending physicians exhibits the value and knowledge of the fellow. Despite the advanced training associated with electrophysiology, we chose to examine this study because of the limited literature available and because these fellows were indeed still cardiology trainees. The mean correct score of these fellows (72%) was similar to the score reported for cardiology fellow trainees in the University of Michigan study (nearly 73%).

Another study, completed in the Czech Republic, examined the accuracy of EKG interpretation by cardiology fellows and other internal medicine fellows.12 Fellows were given a selected sample of 100 EKGs to diagnose. The rate of correct diagnoses was 48.9% for cardiology fellows, compared with 35.9% for non-cardiology fellows. The cardiology fellows had a mean rate of correct and nearly correct diagnoses of 70.1%, which fell in a similar range to the two studies discussed previously. Furthermore, cardiology fellows had a higher rate of correct and nearly correct diagnoses than non-cardiology fellows, whose rate was 55.0%. However, the paper concluded that fellows still lacked proficient skills because of the relatively low rate of purely correct interpretations. It is important to note that these fellows were part of a junior fellowship program in the Czech Republic. Different standards and educational exposure to EKGs, as well as possible differences in length of training, may account for the variation in the reported scores between the studies. The low percentage correct from this study reduced our mean overall score for EKG interpretation by fellows (Table 2).

Echocardiography Interpretation

Another area of assessment was echocardiography interpretation (Table 3). A study of fellows in Toronto demonstrated a similar lack of proficiency, in this case in echocardiography interpretation scores.13 The study reported that fewer than 50% of fellows achieved a passing score of 60% for echocardiography interpretation. The mean reported score was 47% for first-year fellows and 59% for those in the second year and above. The study also demonstrated that a higher number of reported scans correlated with a higher score on the assessment.

Yang et al investigated the diagnostic accuracy of strain imaging in the setting of dobutamine stress echocardiography.14 Although limited by the comparison of only two fellows, the study reported that fellows had lower accuracy and specificity than attending physicians. A study completed in Seattle reported that, across 292 echocardiograms, major discrepancies between the fellow and the attending cardiologist were few (7 of 292).15 Minor discrepancies were more common, at 14.4% (42 of 292). Third-year fellows, however, showed 100% accuracy, with no reported discrepancies, whereas first- and second-year fellows showed 82% accuracy. While there were limitations to this study, including the smaller number of third-year fellows and the use of attending interpretation as the gold standard, it does show that discrepancies between interpretations were limited. Furthermore, these researchers observed that greater experience (ie, the third year) allowed for more accurate readings. Similarly, a study completed at Beth Israel demonstrated a nearly 20% discrepancy rate (4.1% major and 17.4% minor) between cardiologists and first-year cardiology fellows in transthoracic echocardiogram interpretation.16
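As a simple arithmetic check on the Seattle figures cited above, the short Python fragment below recomputes the major and minor discrepancy rates from the reported raw counts (7 major and 42 minor discrepancies out of 292 echocardiograms).

    # Recomputing the discrepancy rates reported for the Seattle study (ref 15).
    total_echos = 292
    major, minor = 7, 42
    print(f"Major discrepancy rate: {major / total_echos:.1%}")  # about 2.4%
    print(f"Minor discrepancy rate: {minor / total_echos:.1%}")  # 14.4%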

Clinical Skills

Other categories assessed included rotational and clinical competencies. A study completed in Italy demonstrated that cardiology fellows performing invasive cardiology procedures as part of their rotational experience produced a statistically significant increase in radiation exposure to patients: fluoroscopy time increased by 38% and the kerma-area product (KAP) for fluoroscopy increased by 45%.17 However, this can be explained by the learning curve of a fellow, particularly one who has not elected to subspecialize in interventional cardiology. The increased procedure times explain the increased exposure and contrast use, and this is, unfortunately, part of the learning process in any field. Experience leads to shorter times and more efficient procedures.

In terms of cardiac examination skills, cardiology fellows demonstrated statistically significantly higher scores (P < 0.001) on a computer-based cardiac examination assessment than medical students, internal medicine residents, family medicine residents, faculty, clinical faculty, and private practitioners.18

The University of Virginia examined care of STEMI patients with on-call fellow coverage of a cardiac care unit from 2009 to 2013.8 After a change was made such that a night-float fellow remained on call until the morning, there was twenty-four-hour in-house coverage by a cardiology fellow. This change resulted in a decrease in door-to-balloon times from 72 minutes to 49 minutes. No differences were observed in the other parameters: symptom onset to arrival, arterial access to first device, in-hospital mortality, or door-to-balloon times during regular hours. There were limitations to the study, including that it was a single-center study, was underpowered owing to a small sample size, and that femoral access was used in the majority (94%) of cases during this period. The researchers suggested that the decrease in door-to-balloon times during off-hours may have been due to the fellows’ role in continuous patient assessment and procedural preparation, including consents and transport.
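For context, the reported change corresponds to an absolute reduction of 23 minutes, or roughly a one-third relative reduction; a minimal Python fragment illustrating the calculation is shown below.

    # Door-to-balloon times reported in the University of Virginia study (ref 8).
    before_min, after_min = 72, 49                        # minutes, before and after 24-hour coverage
    absolute_reduction = before_min - after_min           # 23 minutes
    relative_reduction = absolute_reduction / before_min  # about 32%
    print(f"Absolute reduction: {absolute_reduction} minutes")
    print(f"Relative reduction: {relative_reduction:.0%}")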

Unfortunately, due to limitations in the data collected, we grouped fellows, regardless of training level, into a single category for analysis. Furthermore, most of the studies listed were single-center. There were also variations in the gold standard: most studies used an attending physician’s assessment as the gold standard, while others relied on a predetermined consensus interpretation of the selected EKGs.

A potential way of improving clinical evaluation could be the use of more refined reference standards, such as coronary computed tomography angiography (CTA)-based approaches, including transluminal attenuation gradient (TAG), CT vasodilator-induced stress myocardial perfusion imaging, and fractional flow reserve CT (FFRCT),19 as well as the use of artificial intelligence.20 Techniques such as FFRCT have shown improved diagnostic accuracy for calcified lesions, traditionally regarded as equivocal by other techniques.21–23 Likewise, articles published in The Lancet and Neural Networks have shown comparable results when artificial intelligence was used to estimate fractional flow reserve (FFR) directly from CT angiography images or to detect the electrocardiographic presentation of atrial fibrillation on a standard 12-lead ECG.20,24
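To illustrate the kind of artificial intelligence approach referenced above, the sketch below outlines a small one-dimensional convolutional network for binary classification of a 12-lead ECG (for example, atrial fibrillation versus sinus rhythm). It is a minimal illustration that assumes the PyTorch library and randomly generated signals; the architecture, input length, and labels are hypothetical and do not represent the algorithms used in the cited studies.

    # Minimal, hypothetical sketch of an AI-based ECG classifier (assumes PyTorch).
    # This is NOT the algorithm from the cited studies; the architecture and input
    # length are illustrative placeholders.
    import torch
    import torch.nn as nn

    class EcgClassifier(nn.Module):
        def __init__(self, n_leads=12, n_classes=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(n_leads, 32, kernel_size=7, padding=3),
                nn.BatchNorm1d(32),
                nn.ReLU(),
                nn.MaxPool1d(4),
                nn.Conv1d(32, 64, kernel_size=7, padding=3),
                nn.BatchNorm1d(64),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),   # collapse the time axis to one value per channel
            )
            self.classifier = nn.Linear(64, n_classes)

        def forward(self, x):              # x shape: (batch, leads, samples)
            return self.classifier(self.features(x).squeeze(-1))

    # Example forward pass on random 10-second tracings sampled at 500 Hz.
    model = EcgClassifier()
    ecg_batch = torch.randn(8, 12, 5000)   # 8 tracings, 12 leads, 5000 samples each
    logits = model(ecg_batch)              # shape: (8, 2)
    print(logits.shape)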

Conclusion

Based on the studies discussed, it is evident that a cardiology fellow, particularly in their early years of training, may be limited due to a lack of experience. With continued EKG and echocardiogram interpretation, as well as other clinical skills practice, fellows can improve their diagnostic accuracy and procedural efficiency. However, another contributing factor may be the lack of a standardized gold standard assessment. A future investigation using a consistent gold standard assessment across differing regions, whether artificial intelligence or another standardized reference technique, would be a noteworthy consideration for future investigators.

Disclosure

The authors report no conflicts of interest for this work.

References

1. Home. The match, national resident matching program. Available from: http://www.nrmp.org/. Accessed April 16, 2020.

2. Mozaffarian D, Benjamin EJ, Go AS, et al. Heart disease and stroke statistics—2016 update. Circulation. 2016;133(4). doi:10.1161/cir.0000000000000350.

3. Mensah GA, Brown DW. An overview of cardiovascular disease burden in the United States. Health Aff. 2007;26(1):38–48. doi:10.1377/hlthaff.26.1.38.

4. Akil L, Ahmad HA. Relationships between obesity and cardiovascular diseases in four southern states and Colorado. J Health Care Poor Underserved. 2011;22(4A):61–72. doi:10.1353/hpu.2011.0166

5. Pagidipati NJ, Gaziano TA. Estimating deaths from cardiovascular disease: a review of global methodologies of mortality measurement. Circulation. 2013;127(6):749–756. doi:10.1161/circulationaha.112.128413.

6. Lavie CJ, Milani RV, Ventura HO. Obesity and cardiovascular disease: risk factor, paradox and impact of weight loss. J Am Coll Cardiol. 2009;53(21):1925–1932. doi:10.1016/j.jacc.2008.12.068

7. Chen Y, Freedman ND, Albert PS, et al. Association of cardiovascular disease with premature mortality in the United States. JAMA Cardiol. 2019;4(12):1230. doi:10.1001/jamacardio.2019.3891.

8. Kohan L, Nagarajan V, Millard M, Loguidice M, Fauber N, Keeley E. Impact of around-the-clock in-house cardiology fellow coverage on door-to-balloon time in an academic medical center. Vasc Health Risk Manag. 2017;13(April):139–142. doi:10.2147/vhrm.s132405.

9. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097. doi:10.1371/journal.pmed.1000097.

10. Knight BP, Gregory FM, Adam Strickberger S, Morady F. Electrocardiographic differentiation of atrial flutter from atrial fibrillation by physicians. J Electrocardiol. 1999;32(4):315–319. doi:10.1016/s0022-0736(99)90002-x.

11. Asatryan B, Ebrahimi R, Strebel I, et al. Man vs machine: performance of manual vs automated electrocardiogram analysis for predicting the chamber of origin of idiopathic ventricular arrhythmia. J Cardiovasc Electrophysiol. 2019;31(2):410–416. doi:10.1111/jce.14320.

12. Novotny T, Bond RR, Andrsova I, et al. Data analysis of diagnostic accuracies in 12-lead electrocardiogram interpretation by junior medical fellows. J Electrocardiol. 2015;48(6):988–994. doi:10.1016/j.jelectrocard.2015.08.023.

13. Nair P, Siu SC, Sloggett CE, Biclar L, Sidhu RS, Eric HC. The assessment of technical and interpretative proficiency in echocardiography. J Am Soc Echocardiogr. 2006;19(7):924–931. doi:10.1016/j.echo.2006.01.015.

14. Yang LT, Kado Y, Nagata Y, Otani K, Otsuji Y, Takeuchi M. Strain imaging with a bull’s-eye map for detecting significant coronary stenosis during dobutamine stress echocardiography. J Am Soc Echocardiogr. 2017;30(2):159–167.e1. doi:10.1016/j.echo.2016.10.011.

15. Carlson S, Kearney K, Li S, Fujioka M, Schwaegler B, Kirkpatrick JN. Preliminary interpretations of transthoracic echocardiograms by cardiology fellows. J Am Soc Echocardiogr. 2017;30(12):1234–1238. doi:10.1016/j.echo.2017.07.014.

16. Spahillari A, McCormick I, Yang JX, Quinn GR, Manning WJ. On-call transthoracic echocardiographic interpretation by first year cardiology fellows: comparison with attending cardiologists. BMC Med Educ. 2019;19(1). doi:10.1186/s12909-019-1634-7.

17. Bernardi G, Padovani R, Trianni A, et al. The effect of fellows’ training in invasive cardiology on radiological exposure of patients. Radiat Prot Dosimetry. 2007;128(1):72–76. doi:10.1093/rpd/ncm230.

18. Vukanovic-Criley JM, Criley S, Warde CM, et al. Competency in cardiac examination skills in medical students, trainees, physicians, and faculty. Arch Intern Med. 2006;166(6):610. doi:10.1001/archinte.166.6.610.

19. Sevag Packard R, Karlsberg R. Integrating FFR CT into routine clinical practice. J Am Coll Cardiol. 2016;68(5):446–449. doi:10.1016/j.jacc.2016.05.056

20. Gao Z, Wang X, Sun S, et al. Learning physical properties in complex visual scenes: an intelligent machine for perceiving blood flow dynamics from static CT angiography imaging. Neural Netw. 2020;123:82–93. doi:10.1016/j.neunet.2019.11.017

21. Min J, Koo B, Erglis A, et al. Effect of image quality on diagnostic accuracy of noninvasive fractional flow reserve: results from the prospective multicenter international DISCOVER-FLOW study. J Cardiovasc Comput Tomogr. 2012;6(3):191–199. doi:10.1016/j.jcct.2012.04.010

22. Nørgaard B, Gaur S, Leipsic J, et al. Influence of coronary calcification on the diagnostic performance of CT angiography derived FFR in coronary artery disease. JACC Cardiovasc Imaging. 2015;8(9):1045–1055.

23. Packard R, Li D, Budoff M, Karlsberg R. Fractional flow reserve by computerized tomography and subsequent coronary revascularization. Eur Heart J. 2016;18(2):145–152. doi:10.1093/ehjci/jew148

24. Attia Z, Noseworthy P, Lopez-Jimenez F, et al. An artificial intelligence-enabled ECG algorithm for the identification of patients with atrial fibrillation during sinus rhythm: a retrospective analysis of outcome prediction. Lancet. 2019;394(10201):861–867. doi:10.1016/S0140-6736(19)31721-0
