
Aviation’s Normal Operations Safety Audit: a safety management and educational tool for health care? Results of a small-scale trial


Received 6 January 2017

Accepted for publication 21 April 2017

Published 8 August 2017. Volume 2017:10, Pages 147–165

DOI https://doi.org/10.2147/RMHP.S131763




Simon A Bennett

Civil Safety and Security Unit, School of Business, University of Leicester, Leicester, UK

Background: The National Health Service (NHS) faces a contingent liability for medical error claims of over £26 billion.
Objectives: To evaluate the safety management and educational benefits of adapting aviation’s Normal Operations Safety Audit (NOSA) to health care.
Methods: In an in vivo trial, a NOSA was performed by medical students at an English NHS Trust. After receiving training from the author, the students spent 6 days gathering data under his supervision.
Results: The data revealed a threat-rich environment, where errors – some consequential – were made (359 threats and 86 errors were recorded over 2 weeks). The students claimed that the exercise improved their observational, investigative, communication, teamworking and other nontechnical skills.
Conclusion: NOSA is potentially an effective safety management and educational tool for health care. It is suggested that 1) the UK General Medical Council mandate that all medical students perform a NOSA in fulfillment of their degree; 2) participating NHS Trusts be encouraged to act on students’ findings; and 3) the UK Department of Health adopt NOSA as a cornerstone risk assessment and management tool.

Keywords: aviation, safety audit, health care, management benefits, educational benefits

Introduction

The Institute of Medicine1 has argued for a better understanding of the systemic causes of medical error. The Department of Health2 has argued for a wider appreciation of the value of the systems approach in preventing medical error. The National Patient Safety Agency promotes systems-thinking:

The best way of […] reducing error rates is to target the underlying systems failures rather than take action against individual members of staff […]. A much wider appreciation of the value of the systems approach in preventing, analyzing and learning from patient safety incidents [is required].3

Despite these exhortations, medicine’s safety praxis has been little influenced by the systems approach:

While being widely championed in patient safety, where factors related to individuals, technology and the wider organization are afforded equal consideration […] there is […] evidence that the systems approach […] is still underexploited and could be taken much further.4

The National Health Service’s (NHS’s) failure to embrace the systems approach has occurred against the backdrop of a growing contingent liability for medical error:

With potential legal claims, mostly for clinical negligence, now totalling more than £26 billion, the NHS is facing unsustainable liabilities […]. There is no sign of any improvement in reducing the incidence of harm being caused to patients […].5

Against this backdrop of limited progress in the application of systems-thinking and a growing contingent liability, a proactive, systems-thinking-inspired risk management tool – the Normal Operations Safety Audit (NOSA) – was trialed at an English NHS Trust.

Organized by a university medical school, this research had three objectives:

  1. to evaluate the safety benefits of conducting a NOSA in various clinical settings;
  2. to assess whether fifth-year medical students could conduct a NOSA;
  3. to assess the educational benefits for students of conducting an audit.

The project was developed and supervised by the paper’s author (referred to as “Convenor”). The author is certificated to work on the flight-deck. He has spent 1470 h on the jump seat, these hours being accumulated as follows: 232 sectors (a sector being an airport-to-airport flight) on the A319; 66 sectors on the A320; 62 sectors on the A321; 82 sectors on the B737; 181 sectors on the B757 and 7 sectors on the A300. The author has flown gliders and has performed a landing in a 737-300 simulator.

Regarding the research described in this paper, ethical permission was granted by the University of Leicester, England’s University Ethics Sub-Committee for Medicine and Biological Sciences. The agreed project title was “Exploring the value of holistic observations of clinical practice: developing a student observational learning tool”. The observers were full-time medical students who volunteered. Because the audit formed an integral part of the observers’ medical degree, the ethics sub-committee did not require the study to obtain observers’ or observees’ consent.

Systems-thinking – its meaning and application in aviation

Systems-thinking draws on ethnography, participant observation, action research, oral history and mass observation. To paraphrase Waterson and Catchpole,4 systems-thinking is not so much about applying the “right” type of knowledge to a problem, but about applying the right approach. Systems-thinking is a frame of reference with a simple premise – that human error can be induced. For example, a badly designed display may cause a pilot to misread an instrument.6

Systems-thinking in aviation

Aviation has pioneered the systems-thinking approach to risk management and accident investigation. Watershed moments include Moshansky’s7 1992 analysis of the 1989 Dryden accident and Haddon-Cave’s8 2009 analysis of the 2006 Nimrod loss. Complex systems – prone to dynamic events such as emergence and practical drift and subject to social, economic and political pressures – are difficult to manage.9–14 Lagadec15 and Perrow16 associate complexity with vulnerability. Understanding how systems work in reality is the sine qua non of successful system management.

The 1972 Florida Tri-Star disaster17 (101 dead) and the 1977 Tenerife disaster18 (583 dead) convinced the industry that it needed:

  1. a better understanding of routine flight operations and
  2. improved teamworking, both on and around aircraft.

Human factors tools were developed. First, crew resource management improved teamwork and resource utilization.19,20 Second, NOSA documented the reality of flight operations.

NOSA

Recognizing the mutability of the system as designed, NOSA documents the system as found. Systems-thinking tools, such as NOSA, assume system behavior to be an emergent property of complex, hard-to-discern interactions between human and nonhuman components (e.g., personnel, equipment, resourcing, rules, regulations, personal ambition, corporate aspirations and the law). Systems-thinking challenges the false certainties of reductionism.14

Executed by trained observers familiar with flight operations, a NOSA reveals the lived reality – the verité – of flight-deck labor. Observers’ freedom to roam and probe reflects NOSA’s grounding in actor-network theory, specifically Latour’s21 exhortation that researchers must “follow the actors”.

A NOSA is sensitive to phenomena such as practical drift and emergence, where “simple entities, because of their interaction […] can produce far more complex behaviors as a collective […] ”.22 A NOSA describes:

  1. the threat environment (e.g., substandard air traffic control);
  2. the number and type of errors made by flight crew (e.g., intentional noncompliance with a rule);
  3. coping mechanisms:

    [R]outine threats to the safety […] of the system are constantly being managed by the system before they lead to serious outcomes […] this information is often not captured […] by the organization. [NOSA] provides a means by which this can be achieved;23

  4. good practice (e.g., safety innovations introduced by personnel).24

NOSA meets Hollnagel’s25 Safety-II standard. Specifically:

  1. safety management should be proactive;
  2. safety initiatives should be tailored through topographic research;
  3. because of their local knowledge, workers should be at the center of risk management processes.

The NOSA methodology is promoted by the International Civil Aviation Organization.

Methodology

An example of action research, the project tested claims that a NOSA can – by producing a topographic account – help managers understand the lived reality of a labor process. A literature search conducted in early 2016 by the medical school found no references to NOSA in the medical safety literature.

Potential impacts – pedagogic

With reference to theories of immersive/experiential learning26 and action learning,27,28 the project offered students the opportunity to:

  1. conduct in vivo research into a complex, politically charged and difficult-to-solve problem (patient harm);
  2. work within a bespoke problem-solving team;
  3. use a research instrument that demands solid nontechnical skills of the user;
  4. contribute findings that might inform policy and action.

Potentially the project would improve students’ teamworking, observational, communication and problem-solving skills.

Potential impacts – organizational

Denscombe29 observed, “Early on, action research was […] seen as research specifically geared to changing matters […] this has remained a core feature […] ”. Lewin30 characterized action research as a “spiral of steps […] composed of a circle of planning, action and fact-finding about the result of the action”.

Drawing on Lewin’s30 and Denscombe’s29 formulation, the Convenor intended the data to provoke change within the research setting (the Trust). To this end, he:

  1. organized a feedback session for participating medical practitioners and academics;
  2. wrote a journal paper;
  3. sought funding for a larger project.

Research instrument

Unlike an aircraft flight-deck, a medical facility (e.g., an accident and emergency department) is a permeable workspace open to actors with varied roles (e.g., doctors, nurses, ambulance crew, porters, cleaners, police officers). Consequently, the standard University of Texas Human Factors Research Project NOSA Threat and Error Management Worksheet31 was simplified to create the more functional Threat and Error Assessment and Management Worksheet (TEAM-W; Figure 1).

Figure 1 Blank Threat and Error Assessment and Management Worksheet.

The TEAM-W coding system (Table 1) was developed by a clinician with a working knowledge of NOSA. It is reproduced in the “Quantitative analysis” section.

Table 1 Threat and error codes

Notes: *The Convenor judged that threat and error subcodes scored 10 or more times merited case studies.
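
To illustrate the case-study rule noted under Table 1, the following minimal Python sketch tallies TEAM-W codes and flags any code scored ten or more times. The entries and the subcodes 101 and 205 are invented for illustration; only the main codes 100 and 200 and the never-event code 1201 appear in the paper.

    from collections import Counter

    # Hypothetical TEAM-W entries as (code, narrative) pairs. Codes 101 and
    # 205 are invented subcodes; 1201 (never event) appears in the study.
    observations = (
        [(101, "interruption during drug round")] * 12
        + [(205, "equipment unavailable")] * 4
        + [(1201, "wrong-route administration of medication")]
    )

    # Tally how often each code was scored across all worksheets.
    tally = Counter(code for code, _ in observations)

    # Per the note to Table 1, subcodes scored 10 or more times merit
    # case studies.
    CASE_STUDY_THRESHOLD = 10
    case_studies = [code for code, n in tally.items() if n >= CASE_STUDY_THRESHOLD]

    print(tally)         # Counter({101: 12, 205: 4, 1201: 1})
    print(case_studies)  # [101]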

Reflections on the methodology

Reflection32 revealed potential pitfalls:

  1. The data could be skewed by the Hawthorne effect: persons subject to observation may modify their behavior.33
  2. The data could be skewed by experimenter bias: observees’ identification with the observer may cause them to modify their behavior.
  3. The data could be skewed by observer bias: observers’ preconceptions may influence the choice of scenario and interpretation of same.
  4. The data could be skewed by TEAM-W’s coding structure: to a degree, coding systems focus researchers’ attention.
  5. Observer cognitive overload could cause data to be misinterpreted or lost: information overload and prioritization errors can result in observer task saturation, reducing situation awareness.34 Medical settings can appear chaotic.
  6. Knowledge deficit could reduce accuracy: student-observers might lack the knowledge and experience required to make accurate observations.
  7. TEAM-W could be dismissed as derivative: “[R]esearch is never conducted without reference to other studies”.35

Research team

The researchers were university medical school fifth-year students due to progress to foundation training in 2017. In the United Kingdom, medical graduates who elect to work in the National Health Service spend up to nine years as a Junior Doctor. Two years are spent as a Junior Doctor foundation trainee, then either three years as a GP trainee, or seven years as a Hospital Speciality trainee.37

Students volunteered to join the team in fulfillment of one of the student selected components (SSCs) of their degree (the General Medical Council mandates that SSCs constitute a minimum of 10% of course time; SSCs allow students to demonstrate mandatory competencies). The project commenced with a day’s introduction to TEAM-W. The training mixed a video presentation (the award-winning educational video “Recognizing risk and improving patient safety – Mildred’s Story”,38 re-presented through a NOSA lens) with PowerPoint presentations and a question-and-answer session.

The 11 volunteers (7 females, 4 males) were divided into five mixed-gender groups (four groups of 2 students and one group of 3). They were given a timetable of appointments with clinicians and told that they would return for an interim wash-up (debrief) at the end of Week 1 and a final wash-up at the end of Week 2. They were told that:

  1. each group would make a case study-based PowerPoint presentation in the final wash-up;
  2. each student would complete a Competence Log Book (to be signed off by the Convenor), as shown in Figure S1 (only the first two pages are shown);
  3. each student would complete their own TEAM-Ws, to be given to the Convenor at the final wash-up in either electronic or hard-copy form.

During his visits with the various teams, the Convenor made notes, some of which he later transcribed onto TEAM-Ws.

Fieldwork

Ethical permission was granted by the NHS about a month before the project commenced. Although the NHS permit did not require that participants’ informed consent be secured, on request, observers discussed the purpose of their observations/questions with the observees/interviewees. The five groups rotated through a number of NHS clinical settings over a period of 2 weeks during summer. Settings included a mental health facility, a fracture clinic (acute), a urology ward and general and vascular surgery theaters. The Student Roster is reproduced in Table S1.

At liberty to talk to anyone (e.g., staff, patients, relatives) and observe any intervention or procedure, the students were able to describe:

  1. subjects’ actions and
  2. systemic influences on behavior.

To facilitate completion of TEAM-Ws, the students were given clipboards. Appropriate ID was displayed. The days could be long and busy, with few breaks. The Convenor rotated between medical settings, offering support and making his own TEAM-W notes (although not incorporated into the final data set, one observation is reproduced in Table S2 and Figure S2). There were few issues. Most staff members were receptive (despite occasionally not having been told about the research). The major difficulty was locating staff and students in sprawling facilities. A representative sample of threats and errors recorded ten or more times by the students is presented in Table 2. The narratives are the students’ own words.

Table 2 Qualitative analysis (case studies)

Abbreviations: F1, Foundation Year 1 doctor; PPE, personal protective equipment; NHS, (UK) National Health Service; HCA, health care assistant.

Presentations and sign-off

On the final day of the SSC, each group made a presentation. Although invitations were sent to clinicians involved in the research, none attended. The Convenor judged the presentations to be of exceptional quality. The case studies revealed numerous issues. Students reflected on NOSA’s suitability to health care. Students’ reflections are listed in Box S1. The Convenor judged that all students had passed the SSC. Competence Log Books were signed-off.

Student and convenor insights into the methodology

These are as described in Box S1. Generally, the students encountered few difficulties, although the staff could occasionally be suspicious of the study. Students attributed the majority of adverse reactions to staff not being told in advance. Without exception, students claimed that the study had improved their awareness of patient safety issues. They also claimed it had improved their situation awareness, observation and communication skills and self-confidence.

Although students considered the coding system a good first attempt, all stated that it required further development. Use of a paper-based recording system (TEAM-W) made record-keeping and data analysis laborious, time-consuming and demanding. All felt that consideration should be given to developing an electronic TEAM-W for a modern digital platform, such as an iPad. In the opinion of the Convenor, this would significantly reduce the time required to mine the data for trends and patterns.
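
As a sketch of what a record in such an electronic TEAM-W might look like, the following Python data class captures one observation in a structured, queryable form. The field names and example values are assumptions, not the instrument’s actual fields.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class TeamWRecord:
        """One electronic TEAM-W entry, as a tablet app might capture it.

        All field names are hypothetical; the trial's TEAM-W was paper-based.
        """
        observer_id: str                # anonymized observer identifier
        setting: str                    # e.g., "urology ward", "fracture clinic"
        timestamp: datetime             # when the observation was made
        code: int                       # Table 1 threat/error code
        narrative: str                  # observer's free-text description
        managed: Optional[bool] = None  # did the system catch the threat/error?

    # Example entry (values invented for illustration).
    record = TeamWRecord(
        observer_id="A1",
        setting="urology ward",
        timestamp=datetime(2016, 7, 4, 14, 30),
        code=100,
        narrative="Hands not gelled by doctors throughout the ward round",
        managed=False,
    )

Structured records of this kind could be filtered by setting, code or date, reducing the effort of mining the data for trends and patterns.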

Ideally, group assessments of common settings would have been cross-checked for consistency. Several factors made cross-checking difficult. Most notably, the Convenor’s budget included no monies for the development of an electronic TEAM-W (which, as discussed, could have been hosted on a portable device such as an iPad); the hard-copy TEAM-W significantly complicated data analysis, including cross-checking; and the large volume of data collected from multiple sites and scenarios made cross-checking impossible within the time available. No funds were allocated for the post-trial creation of an electronic database. Monies are being sought for a second field trial, supported by an iPad-hosted TEAM-W developed in consultation with the School of Medicine and NHS.
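
A digitized data set would also make the cross-checking described above largely mechanical. The sketch below, using invented tallies, compares two groups’ code frequencies for the same setting and flags divergences for reconciliation; the tolerance value is arbitrary.

    from collections import Counter

    # Invented per-group tallies of Table 1 codes for one shared setting.
    group_a = Counter({100: 12, 200: 5, 1201: 1})
    group_b = Counter({100: 9, 200: 6})

    # Flag codes whose counts diverge by more than a chosen tolerance,
    # as a starting point for a consistency discussion at a wash-up.
    TOLERANCE = 2
    discrepancies = {
        code: (group_a[code], group_b[code])
        for code in set(group_a) | set(group_b)
        if abs(group_a[code] - group_b[code]) > TOLERANCE
    }

    print(discrepancies)  # {100: (12, 9)}; only code 100 exceeds the tolerance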

Data

The student observers (henceforth referred to as observers) generated a large volume of data. Subjected to quantitative (frequency) and qualitative (case study) analysis, the data showed health care to be a threat-rich environment in which errors (some consequential) were commonplace.

Quantitative analysis

Where no suitable subcode could be found, the observers scored against the main code. For example, regarding threats that were judged human in origin, 5 were scored against “100 Human in origin”. Regarding threats that were judged technological in origin, 20 were scored against “200 Technological in origin” (Table 1).
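
The scoring rule just described can be expressed as a simple fallback: score against the most specific subcode that fits, otherwise against the main code. A minimal sketch follows, assuming a hypothetical keyword-to-subcode table; only main codes 100 (“Human in origin”) and 200 (“Technological in origin”) are taken from the text above.

    # Hypothetical subcode lookup; the keywords and subcodes are invented.
    SUBCODES = {
        "interruption": 101,   # invented subcode under 100
        "device fault": 201,   # invented subcode under 200
    }
    MAIN_CODES = {"human": 100, "technological": 200}

    def score(origin: str, description: str) -> int:
        """Return the most specific applicable TEAM-W code."""
        for keyword, subcode in SUBCODES.items():
            if keyword in description.lower():
                return subcode
        # No suitable subcode found: fall back to the main code.
        return MAIN_CODES[origin]

    print(score("human", "Interruption during handover"))   # 101
    print(score("technological", "Monitor screen frozen"))  # 200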

Analysis

Pedagogic impact

The School of Medicine summarized student feedback in a pie chart (Figure 2). Although operationalizing NOSA as an educational tool in live settings was not unproblematic (see observers’ comments in Box S1), observers’ feedback on the educational benefits of the SSC was overwhelmingly positive. Regarding “new skills gained”, typical Competence Log Book comments were:

  • improved observational skills. One observer wrote: “I have developed my skills in observing a scenario from more of an objective view, whereas previously I watched just to gain clinical knowledge”
  • improved listening skills;
  • improved communication skills. One observer wrote: “I had to explain the project and make sure people did not feel threatened by us”;
  • ability to empathize;
  • better at maintaining situational awareness;
  • ability to critique medical practice;
  • tenacity (desire to “get to the bottom of things”);
  • better at uncovering the truth of a situation;
  • better at record-keeping;
  • better at putting staff at their ease;
  • methodical skepticism;
  • objectivity;
  • ability to “stand apart” (disinterestedness);
  • capacity for reflection;
  • diplomacy;
  • confidence. One observer wrote: “[I have learned] how to question NHS staff [...] about the system [...] without fearing intimidation, or being criticised for [asking questions]”;
  • restraint (the importance for safety of not blaming).

Figure 2 Student feedback pie chart.

Abbreviation: SSC, student selected component.

Insights into the human factors aspects of health care provision

The following claims should be considered against the potential biases described in the “Reflections on the methodology” section:

  1. The elements of health care provision observed during the study presented a threat-rich environment.
  2. Errors, some consequential, were made by medical professionals.
  3. Systems theory posits that the origins of error are complex. Frequently, they are the product of individual and organizational failings: “[H]uman mistakes […] rarely have a single underlying contributory factor. Error is the product of design, procedures, training and/or the environment”.20 This study confirms the systems theory view of error. Some errors resulted from willful neglect, for example, the persistent failure to gel (sanitize) hands. One observer wrote: “[C]onsultant washed hands total of 5 times for approx 30 patients – juniors didn’t wash hands at all”; “Hands not gelled by doctors throughout the ward round”; “HCA touched bin and then touched patient without washing hands […] HCA did not sanitize hands during time on ward and touched 3 further patients”.

    Others, such as the guideline-flouting night-time transfer from ICU of a patient, were induced by circumstance (a bed shortage).

  4. Regarding basic safety procedures such as hand sanitization, there appeared to be a subculture among consultants and doctors of ignoring advice (e.g., signage reminding staff to gel). Referencing Hatch’s42 work, Bennett and Stewart43 observed: “Organizational culture is seldom monolithic. Organizations often consist of numerous sub-cultures, constituted in part through workers’ shared interests, beliefs, skills and profession”. Subcultures produce “inconsistencies”.44 Several questions arise. For example:
    • Why did some consultants and doctors ignore hygiene guidelines?
    • To what degree does consultants’ behavior influence doctors’ behavior (e.g., in regard to hygiene)? Armstrong44 noted how an organization’s culture is shaped by senior management. Consultants’ behavior could be normative. Janis45 notes a proclivity for “concurrence-seeking behavior” in tight-knit groups. He claims the members of such groups are subject to “conformity pressures” and that group behavior may exhibit “derangement”.
    • Is it reasonable to conclude that health care fosters subcultures that harbor deviant behaviors?
  5. NHS England46 defines never events as

    [S]erious incidents that are wholly preventable […]. Each Never Event […] has the potential to cause serious patient harm or death. However, serious harm or death is not required to have happened […] for [an] incident to be categorized as a Never Event.

Observers recorded five never events (code 1201). As the nomenclature implies, a never event has the potential to harm or kill. Never events include:

  • wrong-site surgery;
  • retained foreign object postprocedure;
  • wrong-route administration of medication;
  • scalding of patients.47

The data should be considered against a background of public concern about patient safety.

Conclusion

Regarding the project’s first objective (see Introduction section), the safety benefits of conducting a NOSA in health care include:

  • identification of good and bad practice;
  • identification of the reasons for work-arounds (expedients);
  • documentation, through the production of thick description, of the lived reality of medical labor;
  • provision of information in support of informed policymaking by the government and trusts.

Regarding the project’s second objective, under the mentorship of the Convenor and clinicians (Table S1), the fifth-year students completed a large number of TEAM-Ws to a high standard. The forthright nature of the comments evidenced a lack of inhibition (suggesting confidence in the methodology and a desire to contribute). The Convenor’s promise that data would be anonymized helped secure observers’ and observees’ commitment. Anonymous reporting within a just culture encourages flight crew commitment to NOSA.23 Regarding the project’s third objective, a School of Medicine survey confirmed the project’s educational benefits.

Of course, the above claims should be considered against the possible research biases discussed in the “Reflections on the methodology” section. To reprise one potential source of bias, it is always possible that observees “performed” for the observers. Given the difficulty of quantifying the Hawthorne effect, data, inferences and conclusions should be tested. It is also possible that observers cloaked failings by indulging their own research interests (e.g., by focusing on a narrow range of clinical care issues). As with knowledge generated in the natural sciences, knowledge generated in the social sciences is potentially refutable.

The results of the study suggest three policy developments.

First, the UK General Medical Council should mandate that all medical students perform a NOSA in fulfillment of their degree. As discussed, the 11 students claimed to have benefited in various ways from their participation in the trial. Three said they would like to be involved in further patient safety research.

Second, participating NHS Trusts should be encouraged to act on students’ NOSA findings. NOSA-derived insights should be considered a useful supplement to insights derived from established patient safety systems (such as in-house confidential error reporting systems).

Third, in light of the continuing high level of avoidable deaths in NHS hospitals, the UK Department of Health should adopt NOSA as a cornerstone risk assessment and management tool. It is hypothesized that, other things being equal, groups produce better analyses than individuals working alone. Shaw48 observes:

This effect can be accounted for by the increased number of judgments in the group […] the wider range of knowledge in the group […] and the influence of the more confident (and more accurate) individuals in the group […].

The NOSA tool should be used by mixed teams. NOSA teams should possess a range of expertise and experience. A team might consist of a radiologist, junior doctor, nurse, paramedic and pharmacist.

NOSAs should be notified to all staff in good time, comprehensively planned and adequately resourced. NHS Trusts must ensure that funds are available to implement suggested remediations. Failure to implement remediations will undermine confidence in NOSA, reduce safety audit buy-in and, possibly, undermine staff commitment to the organization’s goals. Trust is a fragile resource. Taking a long time to build, it can be destroyed in an instant.49

Acknowledgments

The Convenor would like to thank the senior academics and clinicians who funded and supported this trial. Special thanks go to the senior clinician who helped develop the TEAM-W coding system and to the 11 fifth-year medical students who trialed TEAM-W. These talented and dedicated young people are a credit to their university and the NHS.

Disclosure

The author reports no conflicts of interest in this work.

References

1. Institute of Medicine. To Err is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000.
2. Department of Health. An Organization with a Memory. London: The Stationery Office; 2000.
3. National Patient Safety Agency. Seven Steps to Patient Safety. London: The National Patient Safety Agency; 2004.
4. Waterson P, Catchpole K. Human factors in healthcare: welcome progress, but still scratching the surface. BMJ Qual Saf. 2016;25(7):480–484.
5. Stittle J. In critical condition. icsa The Governance Institute. Available from: https://www.icsa.org.uk/knowledge/governance-and-compliance/analysis/news-analysis-in-critical-condition. Accessed October 1, 2016.
6. Bennett SA. Disasters and mishaps: the merits of taking a global view. In: Masys A, editor. Disaster Forensics: Understanding Root Cause and Complex Causality. Cham: Springer; 2016:151–174.
7. Moshansky VP. Commission of Inquiry into the Air Ontario Accident at Dryden, Ontario: Final Report (Volumes 1–4). Ottawa: Minister of Supply and Services; 1992.
8. Haddon-Cave C. The Nimrod Review. An Independent Review into the Broader Issues Surrounding the Loss of the RAF Nimrod MR2 Aircraft XV230 in Afghanistan in 2006. HC 1025. London: Her Majesty’s Stationery Office; 2009.
9. Weir DTH. Risk and disaster: the role of communications breakdown in plane crashes and business failure. In: Hood C, Jones DKC, editors. Accident and Design. London: UCL Press; 1996:114–126.
10. Snook S. Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq. Princeton, NJ: Princeton University Press; 2000.
11. Hollnagel E. Barriers and Accident Prevention. Aldershot: Ashgate Publishing Ltd; 2004.
12. Dekker S. Resilience engineering: chronicling the emergence of confused consensus. In: Hollnagel E, Woods DD, Leveson N, editors. Resilience Engineering: Concepts and Precepts. Aldershot: Ashgate Publishing Ltd; 2006:77–92.
13. Reason J. A Life in Error. Aldershot: Ashgate Publishing Ltd; 2013.
14. Shorrock S, Leonhardt J, Licu T, Peters C. Systems Thinking for Safety: Ten Principles. Brussels: Eurocontrol; 2014.
15. Lagadec P. Ounce of prevention worth a pound in cure. Management Consultancy; 1993.
16. Perrow C. Normal Accidents. New York, NY: Basic Books; 1984.
17. Brookes A. Flights to Disaster. Shepperton: Ian Allan; 1996.
18. Weick KE. The vulnerable system: an analysis of the Tenerife air disaster. J Manage. 1990;16(3):571–593.
19. Bennett SA. Human factors for maintenance engineers and others – a prerequisite for success. In: Blockley R, Shyy W, editors. Encyclopedia of Aerospace Engineering. Chichester: Wiley; 2010:4703–4710.
20. Harris D. Improving aircraft safety. The Psychologist. 2014;27(2):90–94.
21. Latour B. Reassembling the Social: An Introduction to Actor-Network Theory. Oxford: Oxford University Press; 2005.
22. Woods DD, Dekker S, Cook R, Johannsen L, Sarter N. Behind Human Error. Aldershot: Ashgate Publishing Ltd; 2010.
23. Eurocontrol. Normal Operations Safety Survey (NOSS) [webpage on the Internet]. Available from: http://www.eurocontrol.int/articles/normaloperationssafetysurveynoss. Accessed October 15, 2016.
24. International Civil Aviation Organization. Line Operations Safety Audit. Montreal: International Civil Aviation Organization; 2002.
25. Hollnagel E. Safety-I and Safety-II: The Past and Future of Safety Management. Aldershot: Ashgate Publishing Ltd; 2014.
26. Kolb D. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice Hall; 1984.
27. Revans R. Action Learning: New Techniques for Management. London: Blond & Briggs Ltd; 1980.
28. Leonard HS, Marquardt MJ. The evidence for the effectiveness of action learning. Action Learning: Res Pract. 2010;7(2):121–136.
29. Denscombe M. The Good Research Guide: For Small-Scale Social Research Projects. London: McGraw-Hill Education; 2014.
30. Lewin K. Action research and minority problems. J Soc Issues. 1946;2(4):34–46.
31. International Civil Aviation Organization. Line Operations Safety Audit. Montreal: International Civil Aviation Organization; 2002.
32. Schön D. The Reflective Practitioner: How Professionals Think in Action. New York, NY: Basic Books; 1983.
33. Taylor P, Richardson J, Yeo A, Marsh I, Trobe K, Pilkington A. Sociology in Focus. Ormskirk: Causeway Press; 1995.
34. Gordon S, Mendenhall P, O’Connor BB. Beyond the Checklist: What Else Health Care Can Learn from Aviation Teamwork and Safety. Ithaca: ILR Press; 2013.
35. Gilbert N, Stoneman P. Researching Social Life. 4th ed. London: Sage; 2016.
36. Triggle N. Junior doctors’ strike: all-out stoppage ‘a bleak day’. Available from: http://www.bbc.com/news/health-36134103. Accessed October 10, 2016.
37. Sculthorpe T, Pickles K. Jeremy Hunt pleads with junior doctors to cancel the strike. The Daily Mail. 2016. Available from: http://www.dailymail.co.uk/news/article-3440125. Accessed July 1, 2017.
38. Allsop P, Overton S, Stewart N, Stewart P. Recognizing Risk and Improving Patient Safety – Mildred’s Story. Leicester: University of Leicester Audio Visual Services; 2010.
39. Dunhill L. Trusts heading for £670 million deficit, says regulator. Health Services Journal. 2016. Available from: https://www.hsj.co.uk/topics/finance-and-efficiency/trusts-heading-for-670m-end-of-year-deficit-says-regulator/7013453.article?blocktitle=News-(grid)&contentID=20682. Accessed July 14, 2017.
40. Vaughan D. The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA. Chicago: University of Chicago Press; 1997.
41. Rasmussen J. Risk management in a dynamic society: a modelling problem. Safety Sci. 1997;27(2–3):183–213.
42. Hatch MJ. Organization Theory: Modern, Symbolic and Postmodern Perspectives. Oxford: Oxford University Press; 1997.
43. Bennett SA, Stewart N. Employees’ experience of, and attitudes towards, team working at a National Health Service (NHS) District General Hospital. Int J Risk Manag. 2007;9(3):145–166.
44. Armstrong M. A Handbook of Personnel Management Practice. London: Kogan Page; 1996.
45. Janis IL. Victims of Groupthink. Boston: Houghton Mifflin Company; 1972.
46. NHS England. Revised Never Events Policy and Framework. 2015. Available from: https://improvement.nhs.uk/uploads/documents/never-evnts-pol-framwrk.pdf. Accessed July 14, 2017.
47. NHS England. Never Events List 2015/16. 2015. Available from: https://www.england.nhs.uk/wp-content/uploads/2015/03/never-evnts-list-15-16.pdf. Accessed July 14, 2017.
48. Shaw ME. Group Dynamics: The Psychology of Small Group Behavior. New York, NY: McGraw-Hill Book Company; 1971.
49. Lerangis P. The Sword Thief. New York, NY: Scholastic Corporation; 2009.

Supplementary materials

Figure S1 Competence Log Book.

Table S1 Student roster.

Note: The students were divided into 5 groups: A, B, C, D, E.

Abbreviations: HCA, health-care assistant; ESWL, extracorporeal shock-wave lithotripsy; MDT, multi-disciplinary team; TRUS, trans-rectal ultrasound.

Table S2 Convenor-generated observational data: afternoon urology clinic in a large city hospital.

Note: No scheduled breaks for staff (although some were able to take an informal break). Timings for early part of clinic.

Figure S2 The clinic’s layout.

Box S1
