The selected accident report concerns Air France flight AF447, which crashed into the Atlantic Ocean on 1 June 2009, killing all 228 passengers and crew on board. The aircraft was en route from Rio de Janeiro to Paris.
Review of Factual Information
AF447 was three and a half hours into a night flight over the Atlantic. Transient icing of the speed sensors on the Airbus A330 caused inconsistent airspeed readings, which in turn led the flight computer to disconnect the autopilot and withdraw flight envelope protection, as it was programmed to do when faced with unreliable data.
A string of messages appeared on a screen in front of the pilots, giving crucial information on the status of the aircraft. All that was required was for one pilot to maintain the flight path manually while the other diagnosed the problem.
The pilot flying, Pierre-Cédric Bonin, attempted to correct a slight roll that occurred as the autopilot disconnected but over-corrected, causing the plane to roll sharply left and right several times as he moved his side stick from side to side. He also pulled back on the stick, causing the plane to climb steeply until it stalled and began to descend rapidly, almost in free-fall.
Neither pilot recognized that the aircraft had stalled, despite multiple cues. In the confusion, they misinterpreted the situation as the plane flying too fast, and actually reduced thrust and moved to apply the speed brakes, the opposite of the correct stall-recovery response.
The pilots made simultaneous and contradictory inputs, without realizing that they were doing so. By the time the crew worked out what was going on, there was insufficient altitude left to recover, and AF447 crashed into the ocean, with the loss of all 228 passengers and crew.
Analysis and Evaluation of the Accident
The AF447 tragedy starkly reveals the interplay between sophisticated technology and its human counterparts. This began with the abrupt and unexpected handover of control to the pilots, one of whom, unused to hand flying at altitude, made a challenging situation much worse.
A simulation exercise after the accident demonstrated that with no pilot inputs, AF447 would have remained at its cruise altitude following the autopilot disconnection.
The possibility that an aircraft could be in a stall without the crew realizing it was also apparently beyond what the aircraft's system designers imagined. Features designed to help the pilots under normal circumstances now added to their problems.
For example, to avoid the distractions of false alarms, the stall warning was designed to shut off when the forward airspeed fell below a certain speed, which it did as AF447 made its rapid descent.
However, when the pilots twice made the correct recovery action (pitching the nose down), the forward airspeed increased, causing the stall alarm to reactivate. All of this contributed to the pilots' difficulty in grasping the nature of their plight. Seconds before impact, Bonin can be heard saying, "This can't be true."
Automation and Aviation Safety
The same technology that allows systems to be efficient and largely error-free also creates systemic vulnerabilities that result in occasional catastrophes, a phenomenon termed "the paradox of almost totally safe systems" (Martins & Soares, 2012). This paradox has implications for technology deployment in many organizations, not only safety-critical ones.
One is the importance of managing handovers from machines to humans, something which went so wrong in AF447. As automation has increased in complexity and sophistication, so have the conditions under which such handovers are likely to occur.
A second question is how we can capitalize on the benefits offered by technology while maintaining the cognitive capabilities necessary to handle exceptional situations.
Is it reasonable to expect startled and possibly out-of-practice humans to be able to instantaneously diagnose and respond to problems that are complex enough to fool the technology? This issue will only become more pertinent as automation further pervades our lives, for example as autonomous vehicles are introduced to our roads.
Commercial aviation offers a fascinating window into automation, because the benefits, as well as the occasional risks, are so visible and dramatic. But everyone has their equivalent of autopilot, and the main idea extends to other environments: when automation keeps people completely safe almost all of the time, they are more likely to struggle to reengage when it abruptly withdraws its services.
Automation also leads to a subtle erosion of cognitive abilities, one that may only manifest itself in extreme and unusual situations (Wiener, 2014).
Prevention of the Accident
The pilots of AF447 failed to properly diagnose the severity of the problem in part because the iced-over Pitot tubes were sending inaccurate airspeed data to the cockpit; the accident could have been prevented if the pilots had been adequately trained in high-altitude manual flying and stall recognition.
The accident might also have been prevented by an enhanced autopilot system, one that hands control back to the crew more gradually when sensor data becomes unreliable.
Following the AF447 disaster, the FAA urged airlines to encourage more hand-flying to prevent the erosion of basic piloting skills, and this points to one avenue that others might follow.
References
Martins, E., & Soares, M. (2012). Automation under suspicion – case flight AF-447 Air France. Work, 41(Supplement 1), 222–224.
Wiener, E. L. (2014). Cockpit automation. In Human factors in aviation (pp. 433–461).