Flight time: what practitioners can learn from pilots
Lessons from the evolution of human factors in aviation can be applied in a healthcare setting
“Would you behave differently if you shared the fate of your patient arising from any error you made at work?” Professor Peter Brennan, consultant maxillofacial surgeon at Portsmouth Hospitals NHS Trust, posed the question recently, pointing out that this is the position pilots find themselves in as a matter of course.
“If they make a mistake, everyone dies,” he said. “If I make a mistake, I walk away from the experience.”
Professor Brennan (@BrennanSurgeon on Twitter) said that his eyes were opened to the phenomenon of ‘Human Factors’ in aviation – optimising the relationship between people and systems in order to improve safety and performance – around nine years ago. The friend of a patient he was treating happened to be a British Airways 747 Training Captain. “I got chatting to him,” recalled Brennan, “and he came to theatre and really opened my eyes to better team working, effective communication, reducing hierarchy, and workload management.”
Professor Brennan was speaking last November at a conference on Human Factors, organised by the Royal College of Physicians and Surgeons of Glasgow. He highlighted the importance of apparently simple things such as being properly hydrated and fed, and interpersonal relationships, as well as more challenging concepts such as “flattening hierarchies”, and how to ensure the adoption of a “no-blame culture”.
Taking a lead from other high-risk organisations, including aviation and air traffic services, the conference examined human factors and their relevance to errors in practice. In the run-up to the conference, Dr Richard Hull, the college’s Honorary Secretary, who co-organised the conference with Professor Brennan, outlined to Ireland’s Dental the thinking behind hosting the event.
“‘Never events’ are simply that; they should never occur. An audit of never events in Wessex showed that human factors were implicated in more than 80 per cent of cases reported. Since the Kegworth air disaster 30 years ago, in which human factors contributed to the deaths of 47 people, airlines and other high-risk organisations have embraced the relevance of human factors.
“Since Kegworth, there has not been a single death due to human error on a UK registered commercial airline in more than three billion passenger journeys. While the NHS environment is very different, we have much to learn to promote safe working, in a no-blame culture, to ultimately give better, safer, healthcare for our patients.”
The aim of the conference was to help people working in healthcare, including dental professionals, to recognise the relevance of human factors in their day-to-day practice and performance. It was important, said Dr Hull, for people to understand the specific features of errors and the scale of the problem. He added: “Errors are everyone’s problems and we need to do the maximum to prevent them.”
Medical errors are common and largely preventable, the conference heard. In the UK, one in 10 hospital admissions involves some form of human error – ranging from relatively minor incidents to so-called ‘never events’ and deaths, the latter estimated at up to 5,000 patients per year. Analysis of never events has found that Human Factors are responsible for the majority of these mistakes.
Professor Brennan underlined the view that healthcare cannot be compared exactly with aviation, “but we can use the many Human Factors that aviation and other high-risk organisations know so well; enhancing team working, effective communication, workload management, reducing hierarchy and professionalism among others.
“If our work on Human Factors prevents serious error for just one patient, then we have succeeded. We are gaining recognition internationally and helping to promote our speciality as a leader in this area.” He said that most errors start at the organisational level and end with the unsafe act itself. “Most of my work has been looking at the preconditions; if you can block those conditions, you can almost certainly prevent the error from occurring.” He added: “High-risk organisations – aviation, rail, nuclear energy, national air traffic services – they recognise the importance of human factors. The only way to embed Human Factors across healthcare is that top-down, bottom-up approach, so that we meet in the middle. There’s a wealth of evidence to show that senior management is core; not just in practice, but also the regulator, and the Colleges.”
Professor Brennan showed a slide of a man he had operated on; the right side of his head had been penetrated by the blade of an angle saw he had been using to cut tiles in a shipyard. The preconditions, said Professor Brennan, were that the man was new to the job, he was unsupervised, and he had been set a time limit to complete the task. The unsafe act – the error – was that he pressed down too hard. The blade sheared off, went through the visor he was wearing and sliced into his face below his eye socket.
“A simple mistake that should never have happened,” recalled Professor Brennan. “It was a seven-hour operation, involving bone grafts to rebuild the orbit.” The outcome was positive, he said. “His vision was fine. I got that result because every two hours, I walked away for a 10 or 15-minute break. And actually, you finish quicker than if you work for seven or eight hours, because your performance falls with time.”
Captain Niall Downey (@nialldowney on Twitter), a pilot with Aer Lingus, described himself as a “recovering thoracic surgeon”, after having switched careers in the nineties, from medicine to aviation. At the beginning of his presentation, he asked delegates: “Has anyone here ever made a mistake?” A delegate answered: “Every day.” Captain Downey responded: “So, we’re in the right room. In aviation, we assume we are going to make mistakes, and our whole mindset and system is based around that.” Looking back to his time in cardiac surgery, it was different: “We weren’t allowed to make mistakes. If you did make mistakes, you weren’t allowed to talk about it; I think there is a better way.”
To underline the urgency of his message, Captain Downey reviewed studies of deaths caused by human error in healthcare systems, some of which put the figure much higher than 5,000 per year. Extrapolated, he said, while showing a slide of the passenger cabin of a 174-seat Airbus A320: “Each one of those seats is a funeral in the healthcare system due to human error. Every 10 days, we crash one of those. It doesn’t get covered by the BBC and we don’t have to tell the CAA (Civil Aviation Authority). That’s your environment. We changed our environment over the last 40 years.” Charting accidents, incidents and the number of deaths in aviation from 1920, he said there was a steady climb to 1977, and then a descent to a point now where there are fewer than 1,000 deaths per year in commercial aviation worldwide, out of around four billion passenger movements. It was in 1977 that two Boeing 747 passenger jets collided on the runway at Tenerife airport, killing 583 people.
“That was a watershed moment in aviation,” said Captain Downey. “We decided as an industry, we needed to do things differently. It began as ‘cockpit resource management’, became ‘crew resource management’, and has evolved over the past 40 years into full-blown Human Factors.”
Captain Downey said that there should not be a focus on a ‘no-blame culture’. He said: “We don’t have a no-blame culture. If I make a bollocks of something tomorrow, I will be blamed, I will be held responsible. But if I report it, I won’t be sacked for it.” There is a ‘Just Culture’, which, he said, means “honest human mistakes, not deliberate error or gross negligence, but it means we can make mistakes and admit to them.” In contrast, he said, in healthcare there existed a “name, shame, and reclaim” culture. In aviation, he said, when an error is admitted they look at the system to uncover the ‘tripwire’ that led to the error and “we then try to engineer the tripwire out of the system and replace it with a safety net.”
Aviation looks at crew resource management: communication, leadership, situational awareness, workload management. “Just Culture. Systems. Crew Resource Management. That’s our basic three-stage system,” said Captain Downey, “and that’s the system that we are trying to get across to you guys. You can’t just transplant it in, but the underlying DNA is good. We can genetically engineer it for your environment.” That process of “genetically engineering” aviation’s three-stage system for a healthcare setting is something which Captain Graham Shaw, a senior First Officer Training Pilot for British Airways, and Captain Chris Holden, a flight instructor with British Airways, have undertaken within the NHS. Captain Holden looked back to the early days of NASA, when it was found that high-performing individuals did not work well together in teams. Tackling that problem has evolved today into what is termed an ‘integrated competency-based structure’, where there is no separation between technical and non-technical competencies.
“It is one skill set,” said Captain Holden. “You can use the competencies on a personal level, see your own strengths, and apply them to a team. There are technical skills – clinical knowledge and procedural conduct – and social skills – professionalism, communication, leadership, and teamwork. They should be evidence-based and observable. You can also track data. It’s about creating a bespoke version of competence for your own healthcare environment, but in principle they are broadly similar to any high-performing team.”
Captain Shaw said that the process of embedding this system in healthcare can face barriers. Systemic barriers include regulation, a lack of ring-fenced funding to support training, a perceived lack of relevance and a lack of an open culture. Individual barriers include a lack of clarity on how to implement and a lack of training.
“That’s where we come in; to help people recognise great behaviours, get teams to work together so that those behaviours spread throughout the organisation,” said Captain Shaw. “We can’t fix all those [barriers] while on the day job, but we can give ourselves the skills and knowledge to understand problems, to build an effective and empowered team, with everyone in the room working together to support each other, to use human factors as a final layer of defence when other protection layers in the system fail.” Captain Shaw stressed: “Protect the patient, look after each other. It’s the fundamental point.”
Personal factors that threaten safety
1. Lack of communication
Failure to transmit, receive, or provide enough information to complete a task. Never assume anything. Only 30 per cent of verbal communication is received and understood by either side in a conversation. Others usually remember the first and last part of what you say. Improve your communication:
- Say the most important things in the beginning and repeat them at the end
- Use checklists.
2. Complacency
Overconfidence from repeated experience performing a task. Avoid the tendency to see what you expect to see:
- Expect to find errors
- Don’t sign it if you didn’t do it
- Use checklists
- Learn from the mistakes of others.
3. Lack of knowledge
Shortage of training, information, and/or the ability to successfully perform. Don’t guess, know:
- Use current manuals
- Ask when you don’t know
- Participate in training.
4. Distraction
Anything that draws your attention away from the task at hand. Distractions are the number one cause of forgetting things, including what has or has not been done in a task.
Get back in the groove after a distraction:
- Use checklists
- Go back three steps when restarting the work.
5. Lack of teamwork
Failure to work together to complete a shared goal. Build solid teamwork:
- Discuss how a task should be done
- Make sure everyone understands and agrees
- Trust your teammates.
6. Fatigue
Physical or mental exhaustion threatening work performance. Eliminate fatigue-related performance issues:
- Watch for symptoms of fatigue in yourself and others
- Have others check your work.
7. Lack of resources
Not having enough people, equipment, documentation, time, parts, etc., to complete a task. Improve supply and support.
8. Pressure
Real or perceived forces demanding high-level job performance. Reduce the burden of physical or mental distress:
- Communicate concerns
- Ask for extra help
- Put safety first.
9. Lack of assertiveness
Failure to speak up or document concerns about instructions, orders, or the actions of others. Express your feelings, opinions, beliefs, and needs in a positive, productive manner:
- Express concerns but offer positive solutions
- Resolve one issue before addressing another.
10. Stress
A physical, chemical, or emotional factor that causes physical or mental tension. Manage stress before it affects your work:
- Take a rational approach to problem-solving
- Take a short break when needed
- Discuss the problem with someone who can help.
11. Lack of awareness
Failure to recognise a situation, understand what it is, and predict the possible results. See the whole picture:
- Make sure there are no conflicts with an existing procedure
- Fully understand the steps needed to complete a task.
12. Norms
Expected, yet unwritten, rules of behaviour. Help maintain a positive environment with your good attitude and work habits:
- Existing norms don’t make procedures right
- Follow good safety procedures
- Identify and eliminate negative norms.