Wednesday, January 22, 2014

How that 747 landed at the wrong airport in November

Almost exactly two months ago (November 21, 2013), I posted one of the first stories on the entire Internet giving my best speculation as to how a cargo-laden 747 landed at the wrong airport in Kansas. There were practically no details available at the time, so the only stories were that it happened, not how. While the Internet cried out, "Stupid pilots!", I laid out my reasons why a highly experienced, decidedly non-stupid crew could easily make a mistake like that.

The thrust of my speculation at the time was that it was probably due in large part to confirmation bias, where we (yes, you do it, I do it, everyone does it to a greater or lesser extent) give weight to information that supports our idea and discard information that doesn't. A couple of weeks ago, Aviation International News posted a story with more details about exactly what happened that night, and it turns out that confirmation bias did play a large role in the incident, amplified by some equipment issues that led the pilots to not fully trust the computer's report of their position.

As with almost all incidents in which humans are involved, the pilots did some things well and could have done some things better. And again, as with most incidents, there was no single huge mistake that led to what happened. It's nice to think of incidents and accidents as having one ultimate cause, since it makes for easy media stories and makes things simple to understand, but in real life that's almost never the case. Instead, incidents are almost always more like the famous proverb about a horseshoe nail leading to the loss of an entire kingdom.

In the "good" column we can chalk up a few things. First, the pilot was flying an instrument approach even though the weather was fine. Using all the equipment available to help you out is always a good idea, and following something relatively simple like the RNAV/GPS 19L approach helps reduce the cockpit workload by having the plan for getting to the airport already figured out ahead of time and then letting the autopilot fly that plan.

Second, the pilot reported that "previous VFR approaches to McConnell had often put him at a higher altitude than expected and that difficulties in picking out McConnell’s runway prompted him to make an instrument approach." This means that he is already thinking ahead of the aircraft, considering what to expect, and planning for it, all of which is a good thing. There is a time-honored aviation adage which goes, "Never let an airplane take you somewhere your brain hasn't already been."

Third, the problems they encountered earlier in the flight with the FO's (First Officer, often called the "co-pilot" outside of aviation circles) primary flight display led them to take its indications with a grain of salt. This is a good thing, because it means that instead of blindly trusting the computer and being highly paid passengers with the two best seats on the plane, they were using their judgment and flying the aircraft with the assistance of the autopilot, not the insistence of it.

Although the second and third items are good things in most cases, in this rare instance they ended up contributing to the incident. Both would have been perfectly fine had there not been an airport with a similar runway configuration right along the flight path of their instrument approach, and we never would have heard of this crew.

Unfortunately, in this case, they increased the power of confirmation bias. The pilot had ended up higher than expected on previous approaches, and the avionics had had some issues earlier in the flight, so when he saw a runway oriented in the direction he was heading, one that the airplane was also high for, it was a perfectly human reaction to disengage the autopilot and head for it. That perfectly human thing is exactly what the pilot did.

The other pilot did not say anything as the flying pilot made his approach. It's easy to Monday-morning quarterback and ask why he didn't speak up, especially since crews in the United States tend to have a much flatter hierarchy than Asian crews or even U.S. crews from the old days. While a few overbearing captains are still out there, most are long gone, as many of the CRM (Crew Resource Management) principles that are standard nowadays were figured out through hard experience and written in blood. This cockpit was probably very close to equal: I would be surprised if either pilot had fewer than 10,000 flight hours, meaning even the "junior" member was no novice. (I have no figures for the actual amounts, so this is a pure SWAG on my part.)

With that in mind, why didn't he say something? For the same reason you didn't when you were in a similar situation. Yes, I'm sure you can think of a time when someone was doing something that you weren't sure about, but you kept quiet because you thought to yourself, "They must know something that I don't."

There are a couple of things in the "not so good" category: briefing the other airports in the vicinity and briefing the runway lighting systems. Just as the second and third good items above would normally be fine, these two things would normally not have been embarrassing to have overlooked. Then again, the whole reason for briefing them is precisely to prevent something like this.

However, psychology still comes into play here. Even if they did brief the other airports in the vicinity and the lighting systems, it's still quite possible that it would have been just a rote recitation. After all, this wasn't the first time they had been here, and after doing something over and over, complacency sets in. The words come out of the mouth but don't enter the brain.

I pre-emptively addressed the ATC "failure" in the original post, but it's no surprise to see it here. I already pointed out in the original post that the tower controller probably wasn't watching the screen like a hawk. After all, it's not ATC's job to fly the plane to the correct airport. ATC's job is to keep planes from banging into each other, not to keep them from banging into the wrong patch of concrete.

I doubt Atlas Air's new requirement that pilots remain on an IAP (instrument approach procedure) until the final approach fix would have done anything at all to prevent this from occurring. After all, the pilots didn't trust the avionics to tell them where the final approach fix was any more than they trusted it to tell them where the initial approach fix was. If they had trusted the computers, they wouldn't have abandoned the approach in the first place. It's a new requirement that sounds nice but doesn't address the cause.

In the end, I'm not saying that this was inevitable. In fact, there are more ways this could have not happened than ways that it could have; it took several factors all coming into alignment to make this take place. What I am saying is that these pilots were not idiots; they were two humans doing what humans do best: being imperfect. There is no procedural or technological solution that will ever conquer basic psychology, although that certainly won't keep people from trying. There are lessons to be learned from this, and we can all be happy that, unlike many aviation lessons, they are not ones written in blood.

1 comment:

  1. Excellent application of "confirmation bias." "IT" is everywhere people are, e.g., religion, politics, history, law, investing, police, and...now aviation! Confirmation bias is why science is so difficult for so many; it forces us to consider evidence that disconfirms our beliefs. Of course one cannot overlook the possibility of "hindsight bias" on the part of the author :)
