This post is about aviation. But really it’s about institutional self-correction. I intend it as Part 1, a setup for a Part 2 post to come about media and politics.
[DAY-AFTER UPDATE: As noted in some of the reader comments below, the summary does not cover the full range of institutions, organizations, businesses, agencies, labor unions, universities, etc. that have been part of the safety evolution in American aviation. I should have been clearer that I was mainly using the aviation achievement as a set-up to talk about the larger process of a “learning culture.” Ten years ago I published a book, China Airborne, that went into detail on these “system” issues and achievements—as part of asking whether the Chinese system would be able to match them. Thanks for the useful prod from commenters.]
The connecting theme is how to learn from mistakes — as individuals, as companies and organizations, as a larger culture. Today I’ll discuss what happens when individuals and institutions do learn. Next, what happens when they don’t.
Summary version: Modern aviation is so incredibly safe because aviation has been so thorough and unsparing about facing and learning from its errors.
Other institutions, notably the press, have a lot to learn from this example. If you’re interested in life-or-death in the skies, you may find the three videos below worth watching. Even if you’re not, please accept this set-up as part of an exploration of what other institutions might do.
One of many miracles we don’t talk about enough: safe travel through the skies.
An under-appreciated miracle of modern society is how safe and reliable developed-country airlines have become. On a statistical basis, being aboard a North American or Western European airliner is about the safest thing you can do with your time, compared even with taking a walk or sitting in a chair.[1]
A big-picture illustration: Over the past 13-plus years, U.S. airlines have conducted well over ten billion “passenger journeys” — one person making one trip. And in those years, a total of two people, of the ten billion, have died in U.S. airline accidents. For comparison: on average two people in the U.S. die of gunshot wounds every 25 minutes around the clock. And two more die in car crashes every half hour. (Around 45,000 Americans died last year of gunshots, and around 42,000 in car crashes.)
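As a quick check on that back-of-envelope comparison, here is a minimal sketch in Python that converts the approximate annual totals quoted above into the “two people every so many minutes” framing. The figures are simply the ones cited in the paragraph, not independent data.

```python
# Rough check of the "two people every N minutes" framing, using the
# approximate annual totals cited above: about 45,000 U.S. gunshot
# deaths and about 42,000 car-crash deaths last year.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def minutes_per_two_deaths(annual_deaths: int) -> float:
    """Average minutes elapsed, around the clock, per two deaths."""
    return 2 * MINUTES_PER_YEAR / annual_deaths

print(f"Gunshots:    two deaths every {minutes_per_two_deaths(45_000):.0f} minutes")
print(f"Car crashes: two deaths every {minutes_per_two_deaths(42_000):.0f} minutes")
# Prints roughly 23 minutes for gunshots and 25 minutes for car crashes,
# consistent with "every 25 minutes" and "every half hour" as round figures.
```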
How could the aviation system possibly have managed this? Airplanes weighing close to one million pounds hurtle into the sky, carrying hundreds of passengers who are separated by sheets of aluminum and plastic from air so cold and thin it would kill them quickly on exposure. Passengers gaze out at engines each up to 1/10th as powerful as those that sent Apollo 13 toward the moon. At the end of the journey the pilots bring the plane down on a precise strip of pavement—perhaps 60 seconds after the plane ahead of them in the queue, 60 seconds before the next one. And we take it all for granted—grumbling about the crowds and the hassle and the pretzels and the leg room, but safe.
The origins of this ongoing safety revolution are well chronicled; I spent several chapters on them in my book China Airborne. My point for now involves the aviation world’s relentless, unsparing, de-personalized, and highly systematized insistence on learning from whatever makes the system fail.
—On an informal level, this involves aviation magazines, newsletters, websites, and seminars—90% of which have titles like “What went wrong?” or “Breaking the accident chain.” It may sound counter-intuitive, but if you love flying and being in the air, much of your avocational reading will be articles in the “Anatomy of Disaster” category.[2]
—On the formalized level, we have the National Transportation Safety Board, or NTSB. Whenever there’s news of a transport tragedy, we hear that NTSB investigators are on their way to the scene. (Update: plus of course the FAA, NASA, the National Weather Service, air traffic controllers, state and local aviation authorities, flight schools and flight examiners, unions representing pilots, other flight crews, maintenance experts, and on through a very long list.) Nearly two and a half years ago, I wrote a long story in the Atlantic that was a thought experiment about how the NTSB would assess the U.S. reaction to the Covid epidemic.
To get more specific, let me cite three videos as examples of how the aviation world tries to learn when things go wrong. None of these involves an airline crash, precisely because in the U.S. those have become so rare. But they show the cast of mind behind systematic learning from failure.
1. How you begin, when you still don’t know.
The video below is from an ongoing series by the Air Safety Institute, part of the Aircraft Owners and Pilots Association, or AOPA.[3] The narrator is Richard McSpadden, a highly experienced pilot. Among other credentials, he has been flight leader of the U.S. Air Force’s Thunderbirds air-show team.
The video came out just a few days after a news-making crash last month at Dallas Executive Airport, south of downtown Dallas, when one World War II-vintage aircraft rammed into another, in full view of the horrified crowd. Both planes plunged to the ground just beyond the audience, killing all six people aboard the two aircraft. (We’ve landed at this very nice airport a couple of times, including on a recent family visit to Dallas.)
If you start watching McSpadden’s video, I think you’ll find it interesting. I’m mentioning it now because of its careful distinction, soon after the accident, among the known, the probable, and the yet-unknowable. He explains hypotheses that might explain what happened; he suggests the direction of likely inquiries; but he is careful to observe the line between fact and speculation. And to avoid any “hot takes” or predictions.
I found it worth seeing the whole presentation—but then, I would. If you skip to around time 8:50, you will see McSpadden illustrating the path of the colliding aircraft with what William Langewiesche once told me was the trademark hand-language of anyone who has flown a plane.
But here is the main point: people in aviation will certainly learn from this episode. Because of careful analysis of this airshow disaster in 2022, it’s less likely that anything of the sort will ever happen again. And the learning-from-error is happening in a systematic way.
Imagine that occurring in other institutions.
2. What you say, once you know.
The video below is also from the Air Safety Institute. It is a dramatized recreation of a small-plane crash in 2019. It came out after the formal NTSB investigation had been completed, which can take a couple of years, and after the ASI people had time to illustrate and dramatize the circumstances.
On paper, this crash was the most stereotyped and predictable kind of small-airplane tragedy. A pilot who was flying “VFR”—under Visual Flight Rules, which require being able to see where you are going and to stay out of clouds—entered a storm system and its clouds. He became disoriented, lost control of the plane, and died. This is essentially the JFK Jr. plane-crash story.
In the flying world, learning of disasters like this is similar to reading that a carful of high schoolers had crashed at 3am after the prom, with six-packs of beer in the front seat. It’s more heartbreaking because it’s so familiar.
The difference is that this incident involved a pilot with more than 10,000 flying hours—quite a lot. You can hear him in the video first talking to an air traffic controller around time 1:25. At around time 2:45 of the video, you’ll hear a description of what specifically happened when he entered the clouds. And then, starting at around time 5:45, the video goes into more explicit “what can we learn here?” lessons.
Again, if you’re among the tiny fraction of Americans who fly airplanes, you’ll find this interesting. But I hope anyone will find it instructive about the way organizations with serious responsibilities take those responsibilities seriously. Including, or especially, when they fail.
No one should be above criticism. No matter their rank or experience or connections. The stakes are too high. And this principle should apply more broadly than to aviators.
3. Making sure individuals know, and take responsibility.
I’ll mention just one more video. It’s a creepy, Twilight Zone-style production put out by Australian air-safety officials long ago.
The video is 178 seconds long, and its title is “178 Seconds to Live.” Aviation studies had shown that 178 seconds was the average time it took a non-instrument-rated pilot to lose control of the plane, and crash, after entering the clouds. You’ll get the idea if you watch even a little.
What’s the point here? It is that institutions that have become safe and responsible have gone to extraordinary lengths to build both top-down and bottom-up awareness of what is at stake.
I will stop here to sum up the premise: We can learn from examples of success as well as failure. An institution with major economic, social, civic, and other effects has found ways to make itself far better at what it does.
Next up: applying that example in other realms.
[1] I gave some of the numbers in a New York Times piece last week. More details on these statistics:
The most recent mass-fatality crash of a U.S. airliner was in February, 2009, almost 14 years ago. That was when a Colgan commuter plane crashed on approach to the Buffalo airport during an ice storm, killing all 49 people aboard plus one on the ground. Since then, a total of two people have been killed in two unrelated airline incidents. The year-by-year statistics are here: https://www.airlines.org/dataset/safety-record-of-u-s-air-carriers/.
To clarify “passenger journeys”: If one flight carries 100 passengers, that represents 100 passenger journeys. If a single person takes 50 round-trip flights per year, that also comes to 100 passenger journeys. This is how there could have been more than 10 billion U.S. passenger journeys since 2009.
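To make that counting convention concrete, here is a minimal sketch in Python, using only the illustrative figures from this footnote and the totals cited earlier; the function name is mine, for illustration.

```python
# Passenger-journey counting, per the convention described above:
# each person on each flight counts as one journey.

def passenger_journeys(passengers_per_flight: int, flights: int) -> int:
    """One passenger on one flight = one passenger journey."""
    return passengers_per_flight * flights

# The two illustrations from the footnote:
print(passenger_journeys(100, 1))     # one flight with 100 passengers -> 100
print(passenger_journeys(1, 50 * 2))  # one person, 50 round trips (100 legs) -> 100

# The headline figure above: two deaths across more than ten billion
# journeys since 2009 works out to one fatality per five billion journeys.
deaths, journeys = 2, 10_000_000_000
print(f"{journeys / deaths:,.0f} journeys per fatality")  # 5,000,000,000
```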
Also to clarify a partial overlap in the figures: Some apply to U.S. airlines (Delta, American, Southwest, and so on), and some apply to operations to and from U.S. airports (which include non-U.S. airlines, from Lufthansa to JAL). The safety figures are for U.S. airlines, and thus exclude the three people who died nearly 10 years ago when an Asiana airliner, from Korea, had a violent landing at San Francisco airport. The flight volume figures include all operations to and from U.S. airports, most but not all of which are by U.S. carriers.
“Developed-country” airlines refers to scheduled-airline operations in most of the Americas, Western Europe, Oceania, and northeast Asia. Much of my book China Airborne, from ten years ago, concerns how mainland China’s air operations went from the “very dangerous” category to “very safe.” Cooperation with teams from the U.S.—representing Boeing, the FAA, the NTSB and other agencies, United and other airline representatives, senior pilots, etc—was a fundamental part of the change.
The gunshot death totals include deaths from suicide (usually more than half of the total), as well as from murder, accidents, and other causes.
[2] The classic must-read volume on learning to fly is Stick and Rudder, by Wolfgang Langewiesche, father of my friend and colleague William. It was published in the 1940s but remains the standard text. An important runner-up is The Killing Zone, by Paul Craig, on why the statistically most dangerous stage of flying comes when pilots have between 50 and 350 hours of flying experience—enough to feel comfortable, not enough to know what they don’t know.
To provide more context: I have more than 2,000 hours of “pilot in command” flight time in our single-engine Cirrus aircraft. But everything about flight safety involves “frequency and recency.” For various reasons I have not flown the plane since August. So I will go through re-training flights and an “Instrument Proficiency Check” flight before doing anything on my own.
[3] The AOPA is a member-service group but also a lobbying and advocacy organization. Because of its relentlessness about the interests of small-plane aviation, for instance its opposition to user fees of any kind, I have referred to it as “the NRA of the skies.” I disagree with some of its political views, as its officials are aware; I also highly value and respect its safety-oriented work and other projects, and I am a longtime member.
Impressive thinking. Great article.
My own experience as an aviator of 57 years, and in leading and following various corporate and nonprofit entities, suggests that MANY places could benefit from the aviation mindset: “What went wrong?” It is far more common to see (and I try to avoid) the accusatory and much less useful “Whose fault was it?”
Al Ueltschi famously founded FlightSafety’s aviator simulator training. Less well known are the various Marine Safety and Nuclear Safety simulator training programs. Similarly, medicine is learning from aviation. Read “Josie’s Story.” The author is a good friend’s daughter, and I was invited to hear her speak to the UVA medical teams. Her message: “stop blaming people and start fixing broken systems that allow failure to creep in.” Just as in aviation, there is almost always an “accident chain” of events preceding a tragedy.
I can’t think of an industry that wouldn’t benefit from adopting “What Went Wrong (Let’s Fix It).”
This is all correct, as far as it goes, but it’s incomplete. While the NTSB is mentioned, the FAA is not discussed, and the post gives the impression that the aviation community has organized this tremendous improvement in safety mostly on its own. The government’s role in achieving it is largely ignored.