Advances in Medical Education and Practice

Cognitive Competence and Curriculum Development in Nurse Anesthesia Education: A Pilot Study


Received 8 February 2023

Accepted for publication 13 June 2023

Published 19 June 2023, Volume 2023:14, Pages 627–635

DOI https://doi.org/10.2147/AMEP.S407737


Editor who approved publication: Dr Md Anwarul Azim Majumder



Barry Swerdlow,1 Lisa Osborne-Smith,1,2 Douglas Arditti,1,2 Lisa J Hatfield3

1Nurse Anesthesia Program, Oregon Health & Science University, Portland, OR, USA; 2Department of Anesthesiology and Perioperative Medicine, Oregon Health & Science University, Portland, OR, USA; 3Teaching and Learning Center, Oregon Health & Science University, Portland, OR, USA

Correspondence: Barry Swerdlow, Oregon Health & Science University, SON 521, 3455 SW US Veterans Hospital Road, Portland, OR, 97239, USA, Tel +1 503 494 6468, Fax +1 503 346 8296, Email [email protected]

Background: Advanced practice nursing education in the United States is shifting toward doctoral preparation, most commonly the Doctor of Nursing Practice degree. However, there is limited evidence that this transition improves clinical competence.
Purpose: The aim of this study was to determine whether modifications in a nurse anesthesia curriculum that transitioned from a Master of Nursing to a Doctor of Nursing Practice program were associated with improved cognitive performance using an oral examination.
Design: A prospective, comparative observational study of students from a single, university-based nurse anesthesia program.
Methods: This study was a small-scale investigation (n = 22) that used a quantitative method to compare the performances of consecutive cohorts of Master of Nursing and Doctor of Nursing Practice nurse anesthesia students as rated by oral examinations designed to evaluate critical thinking skills and previously shown to demonstrate internal consistency and reliability.
Results: After completing an expanded curriculum, Doctor of Nursing Practice nurse anesthesia students performed significantly better than Master of Nursing students on oral examination, with improvements in cognitive domains previously identified as areas of underperformance by Master of Nursing students.
Conclusion: Targeted curricular additions in a Doctor of Nursing Practice program correlated with improvements in nurse anesthesia student cognitive competence as measured by oral examination.

Keywords: advanced practice nursing education, Doctor of Nursing Practice, graduate nursing education, nurse anesthesia, cognitive competence

Introduction

Although there has been significant development of master’s-level education of advanced practice registered nurses (APRNs) in many countries in recent years, practice-focused doctoral education of these practitioners, including nurse anesthetists, has become increasingly commonplace in the United States.1,2 These changes occurred largely because of the position of the American Association of Colleges of Nursing (AACN), and the subsequent recommendations of the American Association of Nurse Anesthesiology (AANA) and the Council on Accreditation of Nurse Anesthesia Educational Programs (COA), that a doctoral degree should become the entry level to practice by 2025.3,4 The benefits of doctoral education identified by the AACN included the need for “advanced competencies for increasingly complex clinical” roles and for “enhanced knowledge to improve nursing practice and patient outcomes”.4 One aspect of the Doctor of Nursing Practice (DNP) mandate in APRN training focuses on the development of skills needed to translate evidence-based care into practice and to transform the quality and safety of health care systems.4,5 These primary justifications for practice-focused doctoral education are based on reports from the Institute of Medicine identifying a need to improve health professionals’ performance.4

Notwithstanding ongoing changes in certification options predicated on this rationale, there has been little published evidence that DNP (or equivalent doctoral) preparation improves the quality of clinical care delivered by APRNs.6 For example, recent survey data indicate that graduates of DNP programs outside academia largely provide direct patient care and that employers in this setting view these graduates as equivalent to APRNs with master’s degrees.6 Nevertheless, these employers noted that DNP graduates had better assessment and collaboration skills and a superior appreciation for evidence-based clinical guidelines, with an improved ability to translate such evidence into clinical routines.1,6 The latter observation is consistent with the self-reported competencies of DNP-prepared nurses in practice.7 However, beyond descriptive survey findings, data addressing the value of doctoral APRN education remain limited, and it is unclear whether the clinical competencies of DNP-trained and Master of Nursing (MN)-trained APRNs differ.1

In particular, it is unknown whether nurse anesthesia educational programs redesigned as DNP programs improve the clinical competence of their graduates compared with previous MN or Master of Science in Nursing (MSN) programs. In general, assessment of clinical competence by nurse anesthesia programs (NAPs) and other APRN programs has been challenging.8,9 Although the purpose of the National Certifying Examination administered by the National Board of Certification and Recertification of Nurse Anesthetists is to ensure clinical competence and thereby promote patient safety,10 these examinations utilize select-response questions that assess only knowledge base, the equivalent of the first level of Miller’s Pyramid of Assessment (“knows”).11 To address whether an examinee “knows how”, examinations need to document what students will do when they encounter patient situations. In NAPs, this performance analysis is often fulfilled by clinical instructors’ observations of students during perioperative encounters, but such determinations are commonly related to the accuracy of diagnosis and specific management decisions rather than to the associated reasoning process.11

In contrast, oral examinations provide a means to precisely test abstract reasoning and critical thinking skills.12 Simulation-based examination can provide similar cognitive evaluation, but there are complexities to testing critical thinking in a high-fidelity environment, where the primary goal is to “show how” rather than to demonstrate “know(s) how”.13,14 Recently, an oral examination strategy modeled on the American Board of Anesthesiology Standardized Oral Examination (mock oral board examination, or MOBE)12 was successfully employed for this purpose in a cohort of MN nurse anesthesia students.15 As a result, oral examinations were added as a benchmark in this nurse anesthesia program’s new DNP curriculum.

This institution’s use of oral examinations in both the previous MN and the current DNP curricula offered a unique opportunity to compare outcomes related to the curricular change; the study design took advantage of the natural experiment created by this transition. The present study evaluated MOBE performances in consecutive cohorts of MN and DNP NAP students, with the MOBEs conducted at comparable junctures in the students’ training, and was formulated to assess whether curricular modifications associated with the development of practice-focused doctoral education for NAPs positively impact the cognitive competence of trainees.

Methods

The primary question addressed by this research study was whether targeted changes in the didactic curriculum, implemented as part of the transition from an MN to a DNP program at a single institution, affected the cognitive competence of nurse anesthesia students. The investigation was approved by the Oregon Health & Science University (OHSU) Institutional Review Board. Participation in the investigation was voluntary; all participants were treated with confidentiality; and all participants signed consent forms. Because this study used a small convenience sample (n = 22), its findings should be considered preliminary, and further investigation is warranted to confirm their validity.

The study was conducted in two parts: first with the program’s final MN cohort and then with its initial DNP cohort. The MOBEs employed with the two cohorts were nearly identical, with only minor modifications, some of which related to health and safety considerations stemming from the coronavirus disease 2019 (COVID-19) pandemic. The MN cohort study has been previously reported;15 therefore, the description of methods here is limited, and methodological differences between the MN and DNP MOBE studies are highlighted.

Commonalities Between the MN and DNP MOBEs

All students in both the MN and DNP cohorts agreed to participate in the study. No student in either cohort had taken an oral examination previously. The examinations occurred at a similar point in the graduate curricula, at the end of the anesthesia specialty didactic content for both cohorts. The format for administering the examination was the same for both cohorts: a clinical scenario “stem” provided immediately prior to the examination, followed by a series of questions for 30 minutes related to perioperative management of a hypothetical case based on that “stem”. In both cohorts, each MOBE was used twice (see below). This required development of an additional MOBE for the larger DNP cohort: five of the six DNP MOBEs were identical to the five MN MOBEs. The additional DNP MOBE, written by the same NAP faculty member who wrote the other five MOBEs, contained comparable content. The same examiner conducted the oral examinations for all students in both cohorts and, in both the MN and DNP examinations, this examiner was the only person to directly interact with the student examinees. Both MOBEs were rated by three faculty members (including the examiner). The examiner/rater and one of the two non-examiner raters were identical in the DNP and MN MOBEs.

The examinations in both cohorts were assessed using an identical Scoring Rubric (Table 1). This rubric had three domains (Clinical Analysis, Anesthesia Knowledge, and Communication Skills), and within each domain there were three assessments (“subsets”) labeled A, B, and C. The rating for each subset was needs improvement (1 point), marginal pass (2 points), or pass (3 points), for a total of 27 possible points per MOBE. For raters, depending on the nature of the subset task, needs improvement was defined as performing the task poorly or less than 50% of the time; marginal pass as performing the task adequately or between 50% and 100% of the time; and pass as performing the task well or 100% of the time. Pre-examination rater consensus review of the examinations was performed in the same manner for each of the two examinations, and reconciled scores were shared with students in a post-examination debrief.15

Table 1 Scoring Rubric. Scores of 3 (Pass), 2 (Marginal Pass), or 1 (Needs Improvement) Were Assigned to Each Student in Each Domain Subset. See Text for Details.
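As a concrete illustration of the rubric arithmetic, the short Python sketch below scores a single hypothetical MOBE. The domain names follow Table 1; the Communication subset labels (IIIA to IIIC) and all sample ratings are assumptions for illustration, not study data.

```python
# Minimal sketch of MOBE scoring under the rubric in Table 1.
# The Communication subset labels (IIIA-IIIC) and the sample ratings
# below are hypothetical illustrations, not study data.
RATING_POINTS = {"needs improvement": 1, "marginal pass": 2, "pass": 3}

rubric_ratings = {
    "Clinical Analysis":    {"IA": "pass", "IB": "marginal pass", "IC": "pass"},
    "Anesthesia Knowledge": {"IIA": "pass", "IIB": "pass", "IIC": "marginal pass"},
    "Communication Skills": {"IIIA": "pass", "IIIB": "pass", "IIIC": "pass"},
}

# Sum the three subset ratings within each domain (3-9 points per domain)
domain_scores = {
    domain: sum(RATING_POINTS[rating] for rating in subsets.values())
    for domain, subsets in rubric_ratings.items()
}

total_score = sum(domain_scores.values())  # maximum: 3 domains x 3 subsets x 3 points = 27
mean_subset_score = total_score / 9        # overall mean across the nine subsets

print(domain_scores)
print(f"total = {total_score}/27, mean subset score = {mean_subset_score:.2f}")
```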

Differences Between the MN and DNP MOBEs

Successive NAP cohorts tend to be reasonably homogeneous, consisting of individuals who have made a common professional decision after similar training and at similar stages in their lives. As such, it is not surprising that the MN and DNP MOBE cohorts had similar demographics. Despite these commonalities, there were some differences in exam design, summarized in Table 2. MOBEs were administered at a comparable time in each program (after completion of basic and advanced principles of anesthesia), but the two curricula differed in length, and the DNP curriculum included foundational DNP courses that were not part of the MN program. The MN NAP cohort consisted of 10 students who had completed 14 months of their 27-month course of study, including approximately 880 clinical hours. The DNP NAP cohort consisted of 12 students who had completed 20 months of their 36-month course of study, including approximately 550 clinical hours. The DNP cohort had fewer in-person high-fidelity simulation experiences than the MN cohort because of COVID-19. At the time of their oral examination, the DNP students had completed the foundational DNP courses shared by all university DNP APRN programs (including Ethics, Informatics, Roles, Critical Appraisal of Evidence, Policy and Population Health, Improvement Science, Economics and Finance, and Leadership) and two new courses added to the NAP DNP curriculum: “Selected Topics in Pathophysiology” and “Anesthesia and Co-Existing Diseases”.

Table 2 Features of Master of Nursing and Doctor of Nursing Practice Cohorts and Mock Oral Board Examinations

Because of the larger size of the DNP cohort, their examinations were conducted over two consecutive days and involved six (rather than five) different scenarios. The same MOBE was repeated for consecutive DNP students to limit the opportunity for information sharing, whereas for the MN cohort five examinations were repeated between same-day morning and afternoon sessions. Face masks were worn by all individuals during the DNP MOBEs because of the COVID-19 pandemic. DNP MOBEs were rated by three nurse anesthesia faculty members, whereas the MN MOBEs were rated by two nurse anesthesia faculty members and a faculty member from the OHSU Teaching and Learning Center (TLC).

In addition, MN MOBE performances did not affect course grades and were not used for benchmark purposes, but the examinations were conducted in front of a student audience. For the DNP cohort, examination scores contributed to course grades, and performance on the MOBE served a benchmark function; unlike the MN MOBE, failure of the DNP MOBE thus carried significant academic consequences. The latter fact also made it necessary to conduct the DNP examinations without a student audience. In contrast with the MN MOBEs, the DNP cohort examinations were not recorded, and scores were shared with students immediately after the raters conferred.

Data Analysis

Responses from the DNP MOBE were compared with the corresponding findings from the MN MOBE.15 Specifically, the following scores were compared: (a) MOBE total scores; (b) MOBE domain scores (Clinical Analysis, Anesthesia Knowledge, and Communication); and (c) MOBE domain subset scores. Reconciled scores, determined by group discussion among all three raters immediately upon completion of each examination, served as the best approximations of accurate scores and were used for all comparisons of rubric ratings. IBM SPSS version 29 was used for data analysis. Because the sample size was small, Shapiro–Wilk tests were performed on all variables, and there was evidence of non-normality in some of the subset scores. Therefore, differences were tested with the Kruskal–Wallis test, which assesses whether population distributions are identical without assuming normality.
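Although the analysis was performed in SPSS, the two-step logic described above (a Shapiro–Wilk normality screen followed by a nonparametric comparison) can be sketched in Python with scipy.stats. The score arrays below are hypothetical placeholders, not the study data.

```python
# Sketch of the analysis pipeline: Shapiro-Wilk normality screen followed
# by a Kruskal-Wallis comparison of the two cohorts. SPSS was used in the
# study; scipy.stats is substituted here, and the scores are hypothetical.
from scipy import stats

mn_totals = [20, 21, 21, 22, 20, 21, 22, 21, 20, 23]           # n = 10 (hypothetical)
dnp_totals = [24, 25, 25, 26, 24, 25, 25, 26, 24, 25, 26, 25]  # n = 12 (hypothetical)

# Step 1: screen each variable for departure from normality
for label, scores in (("MN", mn_totals), ("DNP", dnp_totals)):
    w_stat, p_value = stats.shapiro(scores)
    print(f"{label}: Shapiro-Wilk W = {w_stat:.3f}, p = {p_value:.3f}")

# Step 2: given evidence of non-normality, compare the population
# distributions without assuming normality
h_stat, p_value = stats.kruskal(mn_totals, dnp_totals)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")
```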

Results

MN and DNP MOBE scores are presented in Table 3 and Table 4. All students passed the examination (a pass required an overall mean score ≥ 2.0; using reconciled scores, the passing rate was 100% in both cohorts), but DNP students performed substantially better than MN students in the domain of Clinical Analysis and in most of the domain subsets of Anesthesia Knowledge. Student performances in domain subsets IA (postponement of surgery based on sound judgment), IB (formulation of differential diagnoses), IC (troubleshooting intraoperative problems), IIA (defining the specifics of appropriate preoperative evaluations), and IIC (choice of appropriate anesthetic management based on an understanding of pathophysiology) were significantly higher in the DNP cohort than in the MN cohort. In contrast, there was no significant difference in performance in Anesthesia Knowledge domain subset IIB (appropriate choice of intraoperative monitors and anesthesia methodology), and both cohorts performed equally well in the domain of Communication. The median total score was 25.00 (IQR = 1.25) in the DNP cohort and 21.00 (IQR = 1.75) in the MN cohort, a significant difference by the Kruskal–Wallis test (p < 0.001).
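For reference, the descriptive statistics reported above (median and interquartile range of total scores) correspond to the computation sketched below, shown with hypothetical scores rather than the study data.

```python
# Median and interquartile range (IQR) of a cohort's total MOBE scores,
# as reported in the Results. The scores below are hypothetical.
import numpy as np

dnp_totals = np.array([24, 25, 25, 26, 24, 25, 25, 26, 24, 25, 26, 25])

median = np.median(dnp_totals)
q75, q25 = np.percentile(dnp_totals, [75, 25])
print(f"median = {median:.2f}, IQR = {q75 - q25:.2f}")
```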

Table 3 Comparison of Domain Scores and Total Scores of Master of Nursing and Doctor of Nursing Practice Cohorts

Table 4 Comparison of Domain Subset Scores of Master of Nursing and Doctor of Nursing Practice Cohorts. For the Meaning of Each Domain Subset (IA, IB, etc.), See Table 1.

Discussion

Ensuring the competence of graduate trainees remains a paramount goal of all US APRN programs, including NAPs awarding Doctor of Nursing Practice (DNP or DrNP), Doctor of Nurse Anesthesia Practice (DNAP), and Doctor of Management Practice in Nurse Anesthesia (DMPNA) degrees.16 The need for such competence in an increasingly complex health care system, resulting from the “burgeoning growth”4 of science and technology, provided the foundation for the original AACN Position Statement on the Practice Doctorate in Nursing, and it was the perception that additional training would enhance patient outcomes that first defined the potential benefits of such programs.4 Because oral examination assesses domains that correlate with clinical performance,12,17 this form of evaluation may be useful not only as a benchmark prior to student immersion in clinical rotations but also as a test of whether the additional educational opportunities available in practice doctorate education enhance student competence and patient safety. This concept is applicable to all APRN specialties, not just NAPs. As such, oral examination provides a means to test the hypothesis that such programs truly enhance patient-centric nursing practice.

DNP MOBE versus MN MOBE Performance Results

The most noteworthy finding of this study was that DNP students, after completing targeted additions to their curriculum, performed significantly better than recent MN students on their MOBEs in areas testing clinical analysis and anesthesia knowledge. This observation is important because, although both cohorts achieved passing ratings, cognitive competence (like clinical competence in general) represents a continuum, and improved performance beyond a “pass” threshold has tangible value. In contrast with these areas of testing, both cohorts performed equally well in subset IIB (anesthesia knowledge related to appropriate choice of monitors and anesthesia methodology) and in the domain of Communication. The Communication domain was included because of the vital role of communication by anesthesia providers in ensuring perioperative patient safety.18

Because the major distinction between these cohorts was their didactic curricula (the cohorts had similar demographic and professional backgrounds and were separated chronologically by one year in the same institution with identical instructors), outcome differences most likely reflect differences in curricular preparation. Furthermore, relative increases in the mean test scores of DNP students were most notable in the three areas of greatest underperformance by the MN cohort: domain subsets IB (clinical analysis: formulating differential diagnoses), IC (clinical analysis: intraoperative troubleshooting), and IIC (anesthesia knowledge: physiology and pathophysiology) (Table 4). Improvement in these cognitive domains was the expressed focus of the changes implemented in the DNP curriculum, although it is possible that other changes accompanying implementation of a doctoral program contributed to, or were responsible for, the improved scores. These results were significant despite the small study cohorts and suggest that cognitive competence deficits identified by oral examination following completion of a didactic and simulation MN course of study can be effectively addressed by curricular modifications instituted as part of a robust DNP program.

Targeted changes in the curriculum were made possible by the expanded DNP program (36 vs 27 months), which allowed two new courses to be introduced to improve clinical analysis and anesthesia knowledge in specific areas. Time constraints in the MN program did not allow for these courses, and that course of study offered fewer opportunities for repetition of concepts. The Selected Topics in Pathophysiology course was designed to enhance students’ understanding of how disease processes relate to perioperative clinical considerations. During the Anesthesia and Co-Existing Diseases course, students repeatedly applied abstract reasoning to common adverse perioperative events and presented this information in an organized oral format, a skill that requires practice and is critical to professional development.

Teaching students in this manner to employ metacognitive approaches (directing students to think about their own thinking, including recognizing when they do not understand something) can be a powerful tool for learners and may play an important role in preventing errors in formulating differential diagnoses by monitoring and regulating reasoning.19–21 The longer DNP program of study permitted effective curricular expansion and development of the skills necessary for superior performance on the examination, including both critical thinking abilities and the mental processing needed to articulate answers effectively. Such cognitive competence represents an essential component of clinical competence,22 and similar expansion of the course of study in other APRN DNP programs has been suggested as a way to improve safe patient care relative to MN-prepared graduates.1

In interpreting this comparison, it is important to consider that scores for the DNP cohort were linked to course grades, while student performances on the MN MOBE were used for feedback purposes only and carried no institutional consequences. Also, while the MN examinations included a student audience that may have been a significant source of stress, the social stress of an audience is not equivalent to academic consequences. Hence, it is possible that the improved ratings in the DNP cohort relate in part to the additional academic incentive that was part of that examination process.23 However, the most notable improvements in the DNP cohort occurred in the cognitive domains targeted by their modified course of study, suggesting that a significant contribution to their superior oral examination performance came from those curricular modifications.

MOBE as a Benchmark Evaluation in a NAP

Another finding of this study concerns the ability of the MOBE to function as a benchmark evaluation at a critical juncture in nurse anesthesia training, namely just before the transition from classroom teaching to clinical practice. The MOBE in this study was designed to meet the specifications of a good benchmark evaluation: performance indicators that are (a) essential to professional success, (b) both qualitative and quantitative in nature, and (c) reproducible, enabling comparison with new performance after initiatives arising from benchmarking have been implemented.24 The scoring rubric evaluated cognitive domains critical to competent professional conduct: clinical analysis, anesthesia knowledge, and communication skills. Many of the domain subsets related to the generation of precompiled responses and the abstract reasoning associated with perioperative adverse event management, critical elements of the dynamic decision-making essential to safe anesthesia.18 The Scoring Rubric contained a mix of parameters amenable to quantitative scoring (for example, choice of appropriate monitors) and qualitative scoring (for example, communication skills). Furthermore, the performance indicators employed by the MOBE could be reproducibly re-evaluated to enable comparison between different cohorts, as demonstrated by the present comparison of DNP and MN student cohorts.

Benchmark examinations should be both summative and formative:24 they not only provide data on performance but are also designed for quality enhancement. By highlighting areas needing improvement, benchmark examinations help define educational targets and objectives and allow discovery of approaches to ensure future excellence.24 The MOBE used in this study was originally designed for precisely these purposes and successfully identified three areas of underperformance involving critical thinking in MN NAP students.15 As a result, modifications designed to address these areas were implemented in the new DNP NAP curriculum, and the significantly improved MOBE performance of the current DNP cohort likely validates the examination’s formative function.

Limitations

An important limitation of this investigation is the small size of the cohorts. On the other hand, significant differences in performance between the two cohorts were clear despite the small numbers, and the differences occurred most notably in “targeted” cognitive domains, suggesting that our conclusions likely have validity despite this limitation. Confounding variables included (a) minor differences between the MOBE processes (Table 2) and (b) somewhat different learning experiences for the two study cohorts (in addition to the modifications in the DNP curriculum that amounted to 6 months of additional preclinical education). Namely, because of the COVID-19 pandemic, the DNP cohort received its final year of didactic instruction almost entirely online and, at the time of the oral examination, had completed less high-fidelity simulation training and fewer clinical hours (550 versus 880) than the MN cohort. These factors, however, would not be expected to explain the performance improvements of the DNP cohort relative to the MN cohort. Likewise, it is doubtful that minor differences in study methodology significantly biased outcomes.

Conclusion

In this study, the results of oral examinations provided the first clear evidence of a difference in cognitive competence between MN-prepared and DNP-prepared students: the current investigation suggests that curricular modifications associated with a transition between these two programs of study within one institution can result in improved outcomes on oral examinations that assess student reasoning processes and correlate with enhanced clinical performance.12,25,26 Because these data derive from a single program with identical didactic and simulation instructors and back-to-back cohorts (conditions that also restricted cohort sizes, a significant limitation of this study), the improved outcomes are likely due to the associated curricular changes. These findings suggest that DNP educational processes may better prepare trainees who not only “know” data but also “know how” to apply such data in clinical practice, and they therefore support the ultimate goals of APRN programs transitioning to doctoral practice models.

Additional questions merit exploration. Do the observed improvements in cognitive performance on nurse anesthesia DNP MOBEs persist once students enter the full-time clinical phase of their training and thereafter, and do MN-trained and DNP-trained certified registered nurse anesthetists (CRNAs) differ in clinical performance and patient outcomes? Furthermore, while survey findings and self-reported data suggest that DNP-prepared APRNs have an improved ability to translate evidence-based guidelines into clinical practice,1,6,7 studies of whether such translational innovation is more common among DNP-prepared than MN-prepared CRNAs represent a useful related line of inquiry. Lastly, the utility of MOBEs as evaluative techniques deserves investigation in other APRN specialties, where oral examinations can potentially provide both formative and summative analyses in a manner similar to NAPs.

Disclosure

The authors report no conflicts of interest in this work.

References

1. McCauley LA, Broome ME, Frazier L, et al. Doctor of Nursing Practice (DNP) degree in the United States: reflecting, readjusting, and getting back on track. Nurs Outlook. 2020;68(4):494–503. doi:10.1016/j.outlook.2020.03.008

2. Rosa W, Fitzgerald M, Davis S, et al. Leveraging nurse practitioner capacities to achieve global health for all: COVID-19 and beyond. Int Nurs Rev. 2020;67(4):554–559.

3. Council on Accreditation of Nurse Anesthesia Educational Programs. Position statements. Position statement on doctoral education for nurse anesthetists; 2021. Available from: https://www.coacrna.org/about-coa/position-statements/. Accessed February 6, 2023.

4. American Association of Colleges of Nursing. AACN position statement on the practice doctorate in nursing; 2004. Available from: https://www.aacnnursing.org/DNP/Position-Statement. Accessed February 6, 2023.

5. Starnes-Ott K, Arnaud M, Rooney L, Lewis M. Using complex adaptive theory to guide the transition to DNP nurse anesthesia education. J Prof Nurs. 2020;36(3):123–127. doi:10.1016/j.profnurs.2019.10.004

6. Beeber AS, Palmer C, Waldrop J, Lynn MR, Jones CB. The role of Doctor of Nursing Practice-prepared nurses in practice settings. Nurs Outlook. 2019;67(4):354–364. doi:10.1016/j.outlook.2019.02.006

7. Kesten KS, Moran K, Beebe SL, et al. Drivers for seeking the Doctor of Nursing Practice degree and competencies acquired as reported by nurses in practice. J Am Assoc Nurse Pract. 2021;34(1):70–78. doi:10.1097/JXX.0000000000000593

8. Clifford T. Competency assessment. J Perianesth Nurs. 2020;35(2):222–223.

9. Kesten KS, Brown HF, Meeker MC. Assessment of APRN student competency using simulation: a pilot study. Nurs Educ Perspect. 2015;36(5):332–334. doi:10.5480/15-1649

10. National Board of Certification and Recertification of Nurse Anesthetists. NBCRNA certification; 2020. Available from: https://www.nbcrna.com/initial-certification. Accessed February 6, 2023.

11. Miller G. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9):S63–S67. doi:10.1097/00001888-199009000-00045

12. Sun H, Warner DO, Patterson AJ, et al. The American Board of Anesthesiology’s standardized oral examination for initial board certification. Anesth Analg. 2019;129(5):1394–1400. doi:10.1213/ANE.0000000000004263

13. Miller C, Serkan T, Schwengel D, Isaac G, Schiavi A. Development of a simulated objective structured clinical exam for the APPLIED Certification Exam in Anesthesiology: a two-year experience informed by feedback from exam candidates. J Educ Perioper Med. 2019;21(4):E633.

14. Witheridge A, Ferns G, Scott-Smith W. Revisiting Miller’s pyramid in medical education: the gap between traditional assessment and diagnostic reasoning. Int J Med Educ. 2019;10:191–192. doi:10.5116/ijme.5d9b.0c37

15. Swerdlow B, Osborne-Smith L, Hatfield LJ, Korin TL, Jacobs SK. Mock oral board examination in nurse anesthesia education. J Nurs Educ. 2021;60(4):229–234. doi:10.3928/01484834-20210322-09

16. Hawkins R, Nezat G. Doctoral education: which degree to pursue? AANA J. 2009;77(2):92–96.

17. Wang T, Sun H, Zhou Y, et al. Construct validation of the American Board of Anesthesiology’s APPLIED examination for initial certification. Anesth Analg. 2021;133(1):226–232. doi:10.1213/ANE.0000000000005364

18. Gaba DM, Fish KJ, Howard SK, Burden AR. Principles of anesthesia crisis resource management. In: Gaba DM, Fish KJ, Howard SK, Burden AR, editors. Crisis Management in Anesthesiology. 2nd ed. Philadelphia: Saunders; 2015:25–53.

19. Swerdlow B, Osborne-Smith L. A cognitive template for management of perioperative adverse events. AANA J. 2023;91(2):137–143.

20. Weidman J, Baker K. The cognitive science of learning: concepts and strategies for the educator and learner. Anesth Analg. 2015;121(6):1586–1599. doi:10.1213/ANE.0000000000000890

21. Scordo KA. Differential diagnosis: correctly putting the pieces of the puzzle together. AACN Adv Crit Care. 2014;25(3):230–236. doi:10.1097/NCI.0000000000000035

22. Scott IA, Hubbard RE, Crock C, Campbell T, Perera M. Developing critical thinking skills for delivering optimal care. Intern Med J. 2021;51(4):488–493. doi:10.1111/imj.15272

23. Ba-Ali S, Jemec GBE, Sander B, Toft PB, Homoe P, Lund-Andersen H. The effect of two grading systems on the performance of medical students during oral examinations. Dan Med J. 2017;64(3):A5328.

24. Meade PH. A guide to benchmarking; 2007. Available from: https://planning.curtin.edu.au/local/docs/Guide_to_Benchmarking_Oct2007.pdf. Accessed February 6, 2023.

25. Baker K, Sun H, Harman A, Poon KT, Rathmell JP. Clinical performance scores are independently associated with the American Board of Anesthesiology Certification Examination scores. Anesth Analg. 2016;122(6):1992–1999. doi:10.1213/ANE.0000000000001288

26. Zhou Y, Sun H, Culley DJ, Young A, Harman AE, Warner DO. Effectiveness of written and oral specialty certification examinations to predict actions against the medical licenses of anesthesiologists. Anesthesiology. 2017;126(6):1171–1179. doi:10.1097/ALN.0000000000001623
