This is a long overdue post, and a lot has happened since the last one was published in September, and I don’t just mean developments in the aerospace community. I was kept busy throughout the semester with my senior design class (many all-nighters), but most importantly, I graduated from the University of Michigan in December with my Aerospace Engineering degree. I took some time for myself to travel over the last month (I highly recommend visiting Southeast Asia), and beginning this month I will be working full-time as an Analyst with ICF SH&E in Ann Arbor.
Anyway, this post is a continuation of Oil, Training, & Globalization Part 1, where I briefly hypothesized about the impacts of a dramatic fall in oil prices on the industry, particularly commercial aviation. I ended it with a few follow-up bullet points, and I will cover one of those in today’s post: how cockpit automation is impacting flight safety and what it means for pilot training.

Commercial aviation flight decks have become increasingly automated as more information and capabilities are available to pilots
Since the beginning of the jet age in the 1950s, flight deck technology has progressed to improve efficiency, operations, and safety. Analog instruments have largely been replaced by their digital LCD counterparts and primarily serve as backup instruments on new production aircraft today. New capabilities have also been introduced to the flight deck, such as TCAS (Traffic Collision Avoidance System), ACARS (Aircraft Communications Addressing & Reporting System), satellite telephone connections directly to an airline operations center, weather radar, moving maps, EICAS (Engine Indicating & Crew Alerting System), electronic flight bags, and much more. These improvements have appeared across all types of aircraft, from commercial Boeing 747s to privately owned Cessna 172s. The implementation of these technologies has given today’s pilots greater situational awareness and more resources, which has contributed to record-setting safety statistics.
While cockpit technologies have undoubtedly improved safety, there is one question to consider in light of all these developments: are increasing levels of automation eroding pilots’ basic flying skills? There is some evidence to suggest this may be the case, especially in the recent cases of Air France 447 and Asiana 214. The implication is that, while automation has certainly increased situational awareness (aka “the big picture”), it also poses a human factors problem, that is, a problem in the interaction between pilots and their automated flight deck surroundings.
Let’s start with Air France 447. To recap, the Airbus A330 was flying overnight from Rio de Janeiro to Paris in June 2009 when it encountered thunderstorm activity over the Atlantic Ocean. Ice accumulated on the pitot tubes, the probes whose ram-air pressure is compared against static pressure to calculate airspeed, rendering the airspeed indications useless. As a result, the autopilot received erratic airspeed readings and, not knowing what to believe, automatically disconnected. At this point the aircraft was still in a perfectly flyable condition, but for reasons unknown, the co-pilot at the controls deviated from the cruise altitude, entered a steady climb, and eventually put the aircraft into a high-altitude stall from which it would never recover. Captain Sullenberger, the pilot responsible for the safe landing of US Airways Flight 1549 in the Hudson River, demonstrates the sequence of events below and notes that it is difficult to explain the First Officer’s actions.
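For the curious, the pitot-static relationship itself is not complicated: airspeed is derived from the difference between total (pitot) and static pressure. The minimal sketch below uses the incompressible Bernoulli approximation, which real air data computers refine with compressibility and position-error corrections; the numbers are purely illustrative, not taken from the A330.

```python
import math

def indicated_airspeed(pitot_pressure_pa: float, static_pressure_pa: float) -> float:
    """Approximate airspeed (m/s) from the pitot-static pressure differential.

    Uses the incompressible Bernoulli relation q = 0.5 * rho * V^2, with rho fixed
    at the sea-level standard value the instrument is calibrated to. Real air data
    computers add compressibility and position-error corrections on top of this.
    """
    RHO_SEA_LEVEL = 1.225                          # kg/m^3, ISA sea-level density
    q = pitot_pressure_pa - static_pressure_pa     # dynamic pressure
    if q <= 0:
        return 0.0                                 # a blocked or iced probe yields nonsense
    return math.sqrt(2.0 * q / RHO_SEA_LEVEL)

# Illustrative numbers: a ~3,000 Pa differential corresponds to roughly 70 m/s (~136 kt)
print(round(indicated_airspeed(101_325 + 3_000, 101_325), 1))
```

The fragility is obvious from the formula: once ice corrupts the pressure inputs, the computed airspeed becomes garbage, and everything downstream that trusts it (including the autopilot) has to give up.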
During the entire three-minute fall from 35,000 ft to the Atlantic Ocean, there was not one mention by either pilot that the aircraft was stalling, despite the clear audio and visual cues. Stall recovery is a relatively simple maneuver, and pilots are taught to identify the warning signs of a stall and how to recover from the first day of flight training. Had the two pilots correctly identified the stall condition, the recovery would have been straightforward: push the nose down slightly to regain airspeed and restore lift over the wings. Did the Airbus’ fly-by-wire system build a false sense of confidence that the aircraft could never stall? Did the pilots realize that stall protection was disabled and that the aircraft was operating under the degraded “alternate law”? In the final report, the French accident investigation agency, the Bureau d’Enquetes et d’Analyses (BEA), noted that the crew never manually re-trimmed the aircraft after it entered the initial climb, which left the horizontal stabilizer in a 13-degree nose-up position for the remainder of the flight and exacerbated the stall. The agency also noted that the pilots may have over-relied on the system to correct itself automatically, and that manual trimming is rarely used in flight on Airbus aircraft.
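The aerodynamics behind the recovery are equally basic. Lift follows L = ½ρV²SC_L, and a wing stalls when the angle of attack pushes the lift coefficient past its maximum; lowering the nose trades altitude for airspeed and brings the angle of attack back below the critical value. The sketch below estimates the speed at which maximum lift can no longer support the aircraft’s weight, using round, assumed numbers only loosely in the range of a widebody like the A330, not actual type data.

```python
import math

def stall_speed(weight_n: float, wing_area_m2: float, air_density: float, cl_max: float) -> float:
    """True airspeed (m/s) below which maximum lift cannot support the weight.

    From L = 0.5 * rho * V^2 * S * CL: setting L = W and CL = CL_max gives
    V_stall = sqrt(2 * W / (rho * S * CL_max)).
    """
    return math.sqrt(2.0 * weight_n / (air_density * wing_area_m2 * cl_max))

# Assumed, round numbers -- not A330 type data.
WEIGHT = 200_000 * 9.81   # ~200 t aircraft, in newtons
WING_AREA = 360.0         # m^2
CL_MAX_CLEAN = 1.4        # clean-wing maximum lift coefficient (assumed)

for label, rho in [("sea level", 1.225), ("35,000 ft", 0.38)]:
    v = stall_speed(WEIGHT, WING_AREA, rho, CL_MAX_CLEAN)
    print(f"{label}: stall speed ~ {v:5.1f} m/s ({v * 1.944:5.0f} kt TAS)")
```

The point of the comparison is that in the thin air at cruise altitude the margin between cruising speed and stalling speed is much narrower than near the ground, which is part of why a high-altitude upset demands prompt and correct control inputs.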

The Captain of Asiana 214 flying the 777 seemed to have difficulties maintaining a safe speed during a visual approach to SFO
Another case to consider is that of Asiana 214, a Boeing 777 which crashed on landing at San Francisco International Airport at the end of a flight from Seoul, South Korea (I was actually in San Francisco when this happened). The conditions that day could not have been better – clear skies, light winds, and unlimited visibility. Under such conditions, all arrivals were performing visual approaches, in which the landing is flown by outside visual reference rather than by relying on the flight instruments to guide the aircraft to the runway. The pilot flying was a Captain who was finishing up his transition to the 777. During the approach, the airspeed steadily decayed below the target approach speed and was dangerously low as the aircraft neared the runway threshold. By the time a go-around was called to abort the landing, it was too late, and the aircraft struck the seawall just short of the runway. Three passengers were killed in the accident.
The recovered Cockpit Voice Recorder and Flight Data Recorder indicate that the accident was a case of “mode confusion”, a phenomenon in which the pilot becomes confused about what exactly the autopilot is doing and what must still be done manually. At some point the auto-throttles stopped managing the speed, but the Captain missed the cue and did not realize the speed was now under manual control, leaving the throttles in the idle position. As the speed decayed and the nose pitched up, the aircraft approached a dangerous stall condition at low altitude before ultimately hitting the seawall at the approach end of the runway. This unfortunate accident again represents a case where a very experienced flight crew fell behind the aircraft and became confused about what it was doing and why. In a profession where quick decision-making is paramount, every extra second spent deciphering the automated systems can make the difference between a survivable and a deadly outcome.
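To make “mode confusion” a little more concrete, here is a deliberately simplified toy model of the mismatch: the autoflight system sits in a mode that leaves the thrust levers wherever they were last set, while the pilot’s mental model says “the auto-throttle is protecting my speed.” This is an illustrative sketch only, not the actual 777 autoflight logic, and all the numbers are made up.

```python
from enum import Enum, auto

class AutothrottleMode(Enum):
    SPEED = auto()   # actively adjusts thrust to hold the target speed
    HOLD = auto()    # leaves the thrust levers wherever the pilot set them

def step(speed_kt, target_kt, mode, levers):
    """Toy one-second speed update: SPEED mode trims toward the target, HOLD does not."""
    if mode is AutothrottleMode.SPEED:
        return speed_kt + (0.5 if speed_kt < target_kt else -0.5)
    return speed_kt - (1.0 if levers == "idle" else 0.0)   # idle thrust -> steady decay

speed, target = 150.0, 137.0
mode, levers = AutothrottleMode.HOLD, "idle"    # actual system state
pilot_belief = AutothrottleMode.SPEED           # the pilot's mental model

for t in range(31):
    if t % 5 == 0:
        print(f"t={t:2d}s  speed={speed:5.1f} kt  system={mode.name}  pilot assumes={pilot_belief.name}")
    speed = step(speed, target, mode, levers)
```

The danger is not that either piece of logic is wrong in isolation; it is that the system’s actual mode and the pilot’s assumed mode quietly diverge, and nothing forces them back into agreement until the speed decay becomes impossible to miss.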
Perhaps even more alarming was that, in a post-accident interview, the Captain flying the aircraft stated that he found it “very stressful” to perform a visual landing without cues from the Instrument Landing System (ILS) to guide him to the runway. This admission strongly suggests an over-reliance on flight deck automation, as the visual approach is part of the basic airmanship learned from the first flight of a pilot’s career. Visual approaches are flown in the majority of landings, unless the weather is poor, and should be a well-practiced routine. Even worse, discomfort with visual approaches may be fairly widespread, especially among international pilots. See the reconstruction of the accident in the video below, compared against a normal approach profile (with the air traffic control recording from the accident overlaid).
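None of the math behind a visual approach is exotic, either. A standard glidepath is about 3 degrees, and the required descent rate is just trigonometry on the groundspeed; the familiar “groundspeed times five” rule of thumb falls straight out of it. A quick back-of-the-envelope check, using assumed final-approach groundspeeds:

```python
import math

def descent_rate_fpm(groundspeed_kt: float, glidepath_deg: float = 3.0) -> float:
    """Vertical speed (ft/min) needed to hold a constant glidepath at a given groundspeed."""
    groundspeed_fpm = groundspeed_kt * 6076.0 / 60.0      # knots -> feet per minute
    return groundspeed_fpm * math.tan(math.radians(glidepath_deg))

for gs in (120, 140, 160):   # typical final-approach groundspeeds (assumed)
    exact = descent_rate_fpm(gs)
    rule_of_thumb = gs * 5   # the classic mental shortcut
    print(f"{gs} kt: {exact:4.0f} ft/min (rule of thumb: ~{rule_of_thumb} ft/min)")
```

The underlying task is simple arithmetic plus a stable scan; what the Asiana crew appears to have struggled with was not the arithmetic but monitoring the energy state and hand-flying to it without electronic guidance.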
So what is going on here? In this author’s humble opinion, there is a growing chasm between increasing levels of cockpit automation and training standards. Pilot training standards and methodologies, for the most part, have not kept pace with the new technologies appearing in flight decks. While these additional capabilities offer pilots more resources than ever before, they also represent new swaths of information and data thrown at pilots during a given flight, and it can be quite overwhelming. In other words, pilots have not been trained to handle these new sources and volumes of information quickly and effectively when things go south. For the non-pilots out there, imagine driving your car on an icy road with ten babies screaming at the top of their lungs and shaking their rattles. You can’t think clearly, and all you are trying to do is keep your car straight and avoid sliding off the road. Some also make the metaphor of drinking from a fire hose. That is what a highly automated cockpit environment can feel like when things start to go wrong. Now add in fatigue after a 14-hour transpacific flight, and you can imagine how the problem is amplified.
Research on these “data-rich” flight decks shows some areas of concern. For example, a University of Michigan study, Effects of Modern Primary Flight Display Clutter: Evidence from Performance and Eye Tracking Data by Prof. Nadine Sarter (I actually got to participate in this study), examined the effects of clutter (i.e. squeezing more information onto a screen) on flight deck displays and how it affected pilot response times to warnings shown on the displays. Not surprisingly, increased clutter led to longer response times and missed warnings, especially during high-workload periods. Moreover, the study showed that pilots fail to recognize the negative impact of increased clutter and are poor judges of what a “decluttered” display might be. The conclusion states: “Even if pilots were allowed to ‘declutter’ the display at this point, the fact that the mean rating of clutter for the high-clutter display was still only around 5/10 suggests that pilots may underestimate the degree of the phenomenon. They may therefore not realize the need to take corrective action. Along the same lines, the majority of participants in this study stated that, in all clutter conditions, the amount of information on the PFD was just right. This raises questions about entrusting pilots with judging and adjusting PFD clutter.”
I’m not by any means trying to sensationalize this phenomenon, imply that pilots are incompetent, or suggest that planes will begin dropping out of the sky. Air travel will continue to be the safest form of travel, despite these challenges. However, one must still consider the implications of increased cockpit automation and what it means for pilot training. First, operators will begin to emphasize manual flying outside the simulator as much as possible in order to reinforce basic flying skills and keep pilots in touch with the aircraft, although automation will continue to be used to reduce pilot task load during high-workload phases of flight. At the same time, pilots must be taught to think almost like engineers in order to cope with the large amounts of data thrown at them in bad situations, which is very challenging since time is often of the essence. In the longer term, this could mean an adjusted FAA-approved training syllabus for automated environments. Lastly, you can bet that aircraft and systems manufacturers (I’m looking at you, Honeywell, Rockwell Collins, and other avionics suppliers) will be re-examining their automation levels and weighing them against their human factors impact.