Perception vs. Reality: The Psychology Behind Korean Air 801

Korean Air 801 wreckage. U.S. Navy/Petty Officer 3rd Class Michael A. Meyers

In the age of modern aviation psychology, we learn a great deal by analyzing earlier aircraft accidents from new perspectives. Though pilots are trained in crew resource management, human factors, and maintaining situational awareness, there will always be some element of the man-machine interface that fails. And for every one of those failures, there is an even larger body of theory waiting to be researched and put into practice.

The human mind is so complex that no human-designed interface will ever perform perfectly. The popular focus of aviation accident prevention has long been on crew resource management and human and system errors, but there is also a need to discuss the psychology of perception as it relates to human error and risk management.

The crash of Korean Air Flight 801 is an excellent example of human error due to misperception. In 1997, an experienced crew flew a Boeing 747-300 into high terrain while conducting what they perceived as a typical approach. The NTSB determined that the probable cause was the captain's failure to adequately brief and execute the approach, along with the failure of the pilots not flying to monitor the approach effectively. While the NTSB's conclusions are clear and valid, they stop short of the real reason for the crash - the reason behind the reason, if you will: the pilots made a grave error in perception.

The crew members' skewed mental model led to a breakdown in situational awareness. Had that breakdown been identified and assessed, the tragedy might have been averted.

Perception & Experience

Perception is based on experience, knowledge, memories, emotional state, and values.

What we perceive at any given moment in time is not necessarily the same as reality, although it is our reality at that particular moment. The flight crew’s actions during Korean Air Flight 801 demonstrate this, but I think we're all guilty of it. Take me, for example.    

The term "situational awareness" is one that I will always have in the back of my mind, thanks to a brilliant flight instructor that I had during my initial instrument flight training. I can still hear the instructor’s voice saying things like, “Where are you? How far off centerline are you exactly? Where exactly is the runway in relation to you? Where is that other aircraft that just made his radio call?"

At first, I struggled to stay ahead of the airplane and would easily lose situational awareness. But toward the end of my instrument training, I always knew where I was. At any given time, I could answer my instructor's queries with specifics.

When I was recommended for my instrument check ride, I was trained and ready. I was proficient. I had practiced approaches until they were perfect each time, and I could fly them with failed instruments, with an electrical failure or with an engine inoperative - and sometimes a combination of all three.

When the check ride came, I did everything according to plan. Step by step, I accomplished checklists, flew the departure procedure as I always did, and flew to the airport on my flight plan with no errors. Everything went as expected. After three approaches, and at the peak of my workload, my examiner gave me an unusual scenario and instructed me to fly it. The scenario was so different from what I expected that I had a hard time wrapping my head around it. Accepting the new instructions, I went on with flying the airplane. But I was already behind. I tried to keep up, and if it weren't for my wrong turn, I'm convinced things might have worked out. But I turned the wrong way. I turned about 30 degrees in the wrong direction before I realized that the turn didn't make sense. But it was too late.

I was on the unprotected side of a hold by then. My examiner shook his head and looked at me like I had already killed him.

What happened? My perception of the situation was based on what I knew, which was that for that particular approach, the turn in the hold was to the right. When I was given different hold instructions, I repeated them back correctly and envisioned how the hold would go. But my perception of how the hold should go was based on what I'd done before, not on the reality of the situation. And as soon as one piece of the puzzle no longer fit, as soon as the workload increased, I became unaware of the reality. I had gotten so comfortable with the procedures I knew and the typical approach profiles that I expected the same thing, and the same outcome, every time. I expected the controllers to give me certain clearances. I expected my instructor to say certain things at certain times. I even expected the examiner to simulate an engine failure during my check ride. But a small difference in the typical approach I flew dismantled my mental model, and unbeknownst to me, my perception conflicted with reality - at least at first.

This fallacy of the mind occurs in the mental models of experienced and inexperienced pilots alike, but it is perhaps even more common - or just harder to detect - as we gain experience and accumulate memories in the airplane. After flying the same procedures time and time again at the same airport, it's not difficult to see how a pilot might lose some aspect of his situational awareness due to a distorted mental model - or distorted perception. Perhaps this can help explain what happened to Korean Air 801.

While we're on the topic of perception, I'd be remiss if I failed to mention confirmation bias, the mental fallacy in which a person tends to process only the information that agrees with his or her current perception while ignoring other valuable information. A pilot experiencing confirmation bias fails to see evidence that contradicts his mental model and gravitates toward information that is consistent with his perception of what should be occurring, rather than what is actually occurring. One theory that explains this holds that, in a high-workload environment, pilots maintain failed mental models even after receiving data that contradicts them, in order to remain in control of the situation. The pilot's perception is that it is better to remain in control of the airplane and lose some situational awareness than to take the time to gather data and adjust the mental model. According to Wickens' model, our brains rely on much more than short-term memories and sensory information to form perceptions - they draw on a mix of short- and long-term memories, values, and our current emotional state.

The pilots of Korean Air 801 fell victim to confirmation bias when they ignored valuable information and continued the approach into Guam - an action that would kill 228 people.

Korean Air 801

Korean Air Flight 801 departed Seoul, Korea, bound for Guam International Airport. At 0142 local time on August 6, 1997, it crashed into high terrain while flying a nonprecision approach to runway 6L. The airplane had two pilots, one flight engineer, 14 flight attendants, and 237 passengers on board. Only 26 people survived.

At 0111 local time, the captain briefed a nonprecision approach for runway 6L, which included mention of the unusable glideslope. Upon joining the localizer at 0139 local time, the aircraft descended through 2,800 feet with the flaps extended 10 degrees. The first officer was heard saying “glideslope (unintelligible)…localizer captured…(unintelligible words)…glideslope did.” Right after this, the controller cleared the flight for the ILS runway 6L approach and reminded the crew that the glideslope was unusable. The first officer responded with “cleared ILS runway 6L” without acknowledging the unusable glideslope. At this point, we can begin to hypothesize that the crew's shared mental model was unreliable. The NTSB accident report states the following:

According to the CVR, about 0139:55 the flight engineer asked, “is the glideslope working? glideslope? yeh?” One second later, the captain responded, “yes, yes, it’s working.” About 0139:58, an unidentified voice in the cockpit stated, “check the glideslope if working?” This statement was followed 1 second later by an unidentified voice in the cockpit asking, “why is it working?” About 0140:00, the first officer responded, “not useable.”

A few seconds later, the altitude alert system chimed. All crew members ignored it. The descent continued through the established “step-down” altitude of 2,000 feet. No mention was made of the airplane's distance from the runway, which is odd because that distance should have been a deciding factor in the descent altitudes chosen when flying the localizer-only approach. This is the second indication of a breakdown in situational awareness among all crew members.

Another altitude warning was heard on the cockpit voice recorder. This, too, went unheeded by the flight crew. While the captain initially called for the altitude to be set to 1,440 feet (the lowest altitude allowed for an approach with an inoperative glideslope), the airplane continued its descent while the flight crew prepared for landing. While they were accomplishing the landing checklist, another altitude alert sounded. The airplane descended through 1,440 feet, and a voice was heard saying “isn't glideslope working?” The flight continued its descent as if it were on the glideslope. Again, no mention of step-down altitudes was heard, indicating the crew may have been relying on the glideslope as they would in a normal ILS procedure.

At 0142 local time, the Ground Proximity Warning System alerted the pilots with callouts: “minimums, minimums, sink rate, two hundred,” at which point a go-around was commanded. Unfortunately, it was too late. The plane impacted terrain.

The NTSB states that the probable causes of this accident were the captain's failure to adequately brief and execute the nonprecision approach and the failure of the pilots not flying to monitor the approach effectively. Contributing factors included fatigue and inadequate flight crew training.

Perception and Reality

The NTSB clearly identifies these causal factors, but there is also a relatively hidden issue: the failure of the entire flight crew to recognize that their shared perception was badly inaccurate. The crew's situational awareness was deficient. One explanation is that the flight crew, having flown this route before and having been trained solely on precision instrument approaches, fully expected this approach to follow the same model as the approaches they had flown before. Even though the flight crew was told (and acknowledged) that the glideslope was unusable, they failed to execute the appropriate localizer-only approach. Though the captain had briefed the approach and mentioned the failed glideslope, the crew's actions tell us that their mental model was very different from their verbalized understanding of the approach.

Adding to the crew's failed mental model was the fact that the glideslope appeared to center as if it were, in fact, usable - a phenomenon known as a false glideslope. Instead of confirming that the glideslope was unusable, the crew “saw what they wanted to see” on the glideslope indicator. The indication happened to line up with their perception of what a working glideslope would do, and the crew assumed it was working without considering the possibility (and the reality) of a false glideslope. They accepted the inaccurate glideslope information because it matched their perception of how things should be and how they had been in the past.

The flight crew was presented with many indications that something wasn't right, yet they dismissed each one on the assumption that their mental model was correct. Had they been flying the correct approach procedures, the altitude alerts would have stood out as unusual. Instead, the sound of the altitude warning system didn't seem to get the flight crew's attention, since their perception was that they were “on glideslope.” Neither the distance to the runway nor the distance to the VOR was mentioned by any crew member. Had they been using the correct procedures for a localizer-only approach, those distances would have been a vital part of the approach briefing and procedure. This tells us that the crew had a skewed mental model and a definite loss of situational awareness.

Ultimately, the fate of Korean Air Flight 801 rests on the fact that the flight crew's perceptions did not match reality, and nobody recognized it. The pilots flew an ILS approach into Guam while relying on an unusable glideslope because of a psychological trick that our minds play on us. Each of us could fall victim to such a fallacy if we aren't careful to recognize when our perception might differ from reality.

Sources:

  • Besnard, D., Greathead, D., & Baxter, G. (2004). When mental models go wrong: co-occurrences. International Journal of Human-Computer Studies.
  • FAA Human Factors Awareness Course. FAA Human Factors: www.hf.faa.gov/Webtraining/Cognition/CogFinal009.htm
  • Moray, N. (1987). Intelligent aids, mental models and the theory of machines. International Journal of Man-Machine Studies, 619-629.
  • NTSB. (2000). Controlled Flight into Terrain, Korean Air Flight 801, Boeing 747-300, HL7468, Nimitz Hill, Guam, August 6, 1997.
  • Shappell, S., & Wiegmann, D. (2000). The Human Factors Analysis and Classification System - HFACS.