Chapter 1

Come Fly With Me!

I am thirty miles south of London's Gatwick Airport, the world's busiest single-runway airport, when one of the seven flight control computers fails in my Airbus A320. The plane politely 'bings' and flashes an unthreatening amber light to alert me to this fact. I co-ordinate with my co-pilot to ensure the safety of the flight, and that the plane is performing as expected under the circumstances. We check whether there are any relevant checklists to perform. There aren't. Reassured, I push a button on the ECAM control panel to acknowledge via our computer interface that I'm aware of the failure, and then push it again to acknowledge that I am aware of the status of our aircraft systems.

And that's pretty much it! This could have been a big problem, but thankfully the flight control computer's error, whatever it was, has minimal impact. Our $100 million aeroplane is designed around the concept of redundancy. We expect that things will go wrong, so we have back-ups for more or less everything. If a computer or a system fails, its back-up takes over with little or no fuss.

Our A320 is a Fly-By-Wire (FBW) aeroplane, which means that our controls (side-stick, rudder pedals, thrust levers and so on) are not physically connected to the flight control surfaces but linked to them via multiple levels of computing power. This allows some pretty nifty programming to smooth out my inputs, making me look better than I actually am. It also provides protections to stop me exceeding the limits of the aeroplane, for instance by banking or pitching beyond pre-determined limits, flying too fast or too slow and so on. And it saves weight by removing quite a few of the cables, pulleys and levers which were previously needed to link us to the control surfaces, which means we save on fuel, making the plane more economical.
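For readers who think in code, the back-up principle can be sketched in a few lines. This is purely illustrative, assuming a simple 'first healthy unit takes over' rule; real avionics failover logic is vastly more involved, and the names here are invented.

```python
# Purely illustrative sketch of redundancy, assuming a simple
# "first healthy unit in priority order takes over" rule.
# Nothing here reflects real avionics code; names are invented.

class Unit:
    def __init__(self, name):
        self.name = name
        self.healthy = True

def active_unit(units):
    """Return the first healthy unit in priority order, or None if all have failed."""
    return next((u for u in units if u.healthy), None)

primary, backup = Unit("computer 1"), Unit("computer 2")
print(active_unit([primary, backup]).name)   # computer 1 is in charge
primary.healthy = False                      # computer 1 drops out...
print(active_unit([primary, backup]).name)   # ...its back-up takes over
```

The point of the sketch is only that the handover requires no intervention from the crew: as long as one healthy unit remains in the list, something is flying the plane.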
There are seven flight control computers: two Elevator and Aileron Computers (ELAC 1 & 2), three Spoiler and Elevator Computers (SEC 1, 2 & 3) and two Flight Augmentation Computers (FAC 1 & 2). In a reassuringly paranoid mindset, each computer in a set is supplied by a different vendor, runs software from a different supplier, and each processor is even programmed in a different computer language, all to minimise the chance of everything failing at once.

These sophisticated bits of kit mean that the aeroplane operates in what Airbus calls Normal Law most of the time. This is designed to approximate what a conventional aeroplane feels like to fly, although fewer and fewer of us are getting the opportunity to fly one of those, as many pilots start their careers in a modern FBW aircraft and never revert to anything else. We take off in Ground Mode, which gently transitions into Flight Mode a few seconds after getting airborne. On final approach, it moves into Flare Mode as we near touchdown, again to make it feel more like a conventional aeroplane, so that it behaves as our brains expect machinery to behave. As noted, it protects me from over-speeding, under-speeding, stalling, excessive g-loads, and excessive pitch and bank angles. It adjusts how it interprets my input according to our speed, altitude and so on. Overall, it's an incredible bit of kit, although it's not fool-proof.

The failure of our ELAC 1 has minimal impact on our day, except that we have lost a layer of redundancy. This becomes of more interest to us when we hear a second bing shortly afterwards to let us know that our second computer, ELAC 2, has come out in sympathy with its friend. We go through the same procedure again and assess that we are still in good shape, except that the plane has now degraded into what is called Alternate Law. This is similar to Normal Law but with fewer protections: we retain our g-load protection but lose our bank angle and pitch protections.
Our low- and high-speed protections are not as comprehensive, but we have some support. This is a slightly bigger deal; it means we have to become more alert, but the aeroplane still flies normally.

Unfortunately, our day then gets progressively worse. The plane informs us with increasing levels of urgency (continuous high-pitched chimes and red flashing lights) that further flight control computers have dropped out, leaving us in Direct Law, which essentially turns the plane into a normal, conventional aeroplane, with all our protections lost and no autopilot to help me fly. As a final insult, we lose all electrical power, which drops us into the lowest available flight mode, Mechanical Back-up. This leaves us with only two fairly crude connections to our flight controls: the trim wheel in the centre pedestal, which moves the elevator on the tail-plane and gives us some control to point the plane up or down, and the rudder pedals, which connect us to the rudder on the tail of the aeroplane and give us some left/right turn control. This is designed to enable us to fly roughly straight and level, buying enough time to get at least one computer re-booted and regain enough control to land the plane safely. Our engines are still working, but without our Autothrust mode (our cruise control, if you like). These three inputs are all we have left.

It's a bad day at the office, but it could have been a lot worse. Aviation's attitude to error has provided us with many layers of protection to allow us to navigate our way safely back from multiple failures. In this book I will explore these layers and show how, with minimal modification, they can be used to achieve a similar goal in both our professional and personal lives, regardless of what field we work in or our circumstances.

***

I've always been fascinated by mistakes. In the late 1970s my brother gave me a book, The Book of Heroic Failures by Stephen Pile, the President of the Not Terribly Good Club of Great Britain.
It opened my eyes to such glorious errors as the prisoners in Saltillo Prison in Mexico, who spent five months digging a tunnel in an audacious escape plan only to find, upon surfacing, that it led into the nearby courtroom where many of them had been sentenced; all seventy-five were swiftly returned to prison. Or the equally impressive error Mrs Beatrice Park made by mistaking the accelerator for the clutch during her fifth attempt at her driving test in 1969, which resulted in her and her examiner sitting on the roof of the car in the middle of the River Wey in Guildford, waiting to be rescued. The examiner had to be sent home, and when Mrs Park enquired whether she had passed she was told, 'We cannot say until we have seen the examiner's report.'

It also exposed truly brilliant errors like that of Decca Records (as well as Pye, Columbia and HMV, in fairness to Decca), who turned down The Beatles with the now legendary quote, 'We don't like their sound, groups of guitars are on the way out'. This great tradition has been carried on by the twelve publishing houses who turned down J.K. Rowling's book about a young wizard named Harry Potter in the 1990s. Error is not simply a historical curiosity. It's alive and well.

My interest in error dwindled as I progressed through my education, as the emphasis was on the need to avoid it. This reached its zenith as I trained as a cardio-thoracic surgeon in Belfast and Dublin, where the idea of error was simply anathema. The underlying message seemed to be: 'Don't make a mistake. If you do make a mistake, don't admit to it, and don't make the same mistake again.' I think this attitude is fairly ubiquitous around the world. But my view on error was challenged when I left healthcare to retrain as an airline pilot in 1999. In aviation, error is regarded as inevitable, and is therefore treated as integral to our whole Safety Management System.
When I went through our Command Training and Check process in 2010, an arduous series of simulator sessions and real flights taking around two months, I gradually realised that the position of the captain, the person in command of the flight, was not all about technical aircraft knowledge (although obviously a certain level is essential) but more about the anticipation and management of error. My interest in error, and the broader field of Human Factors and ergonomics which encompasses it, was reborn. I realised belatedly that many of the ideas I'd suggested whilst a surgical trainee were in fact the same ones which aviation had embraced as the bedrock of its entire safety philosophy, and that healthcare could benefit hugely from the implementation of a similar approach.

During the following year, 2011, I established Frameworkhealth Ltd, which focused on transferring the aviation approach to error management into healthcare, with the aim of reducing avoidable harm from adverse events. What follows is an exploration of what I have learnt over the last decade from experts on the subject of error, and how we can use this learning in many areas, including healthcare, transport and maybe even how we engage with our social and political leaders.

This book will explore how we have become quite exposed by twenty-first-century advances. We have progressed much faster than evolution can cater for, resulting in a brain whose structure and function increase the likelihood of error. In the past this was of little consequence, and may even have been a good trade-off in...