
Are the claims of “psycho automation” in regard to Qantas flight QF72 justified?

In media stories last week, including this one in The West Australian, former airline Captain Kevin Sullivan broke his long silence on what happened on Qantas Airways flight QF72 in October 2008.

Travelling from Singapore to Perth, the Airbus A330-300 aircraft suddenly lost altitude over the north-west of Western Australia, flinging unrestrained passengers and crew around the cabin. Captain Sullivan called a mayday and made an emergency landing at the remote Learmonth Royal Australian Air Force (RAAF) base. At least 110 of the 303 passengers and nine of the 12 crew members were injured, suffering fractures, lacerations and spinal injuries; 12 people were seriously injured, and another 39 required hospital medical treatment.

Last week’s media stories make for harrowing reading, describing both the frantic actions of the crew to attempt to regain control of the aircraft and the traumatic experiences of the injured and their loved ones:

Booooom. A crashing sound tears through the cabin. In a split second, the galley floor disappears beneath Maiava’s feet, momentarily giving him a sense of floating in space. Blood rushes to his head as he, the off-duty captain and his wife are propelled into the ceiling, knocking them out.

In the cockpit, Sullivan instinctively grabs the control stick the moment he feels the plane’s nose pitch down violently at 12.42pm (Western Australia time). The former US Navy fighter pilot pulls back on the stick to thwart the jet’s rapid descent, bracing himself against an instrument panel shade. Nothing happens.

The media stories call into question the role of automation in the incident, carrying strongly emotive statements, for example the headline and sub-headline in this article in The Canberra Times:

The untold story of QF72: What happens when 'psycho' automation leaves pilots powerless?

These sensational “psycho automation” headlines recall the “Ghost in the Machine” episode of the popular science fiction television series The X-Files, in which a malevolent artificial intelligence starts killing to protect itself. Such headlines play on the deep-seated fear that many people have of automation and artificial intelligence (AI).

Are the fears of “psycho automation” justified in this case, or are the articles just scaremongering?

The 2011 Australian Transport Safety Bureau (ATSB) report into the incident certainly finds that the aircraft’s automated systems were responsible, concluding that:

While the aircraft was in cruise at 37,000 ft, one of the aircraft’s three air data inertial reference units (ADIRUs) started outputting intermittent, incorrect values (spikes) on all flight parameters to other aircraft systems. Two minutes later, in response to spikes in angle of attack (AOA) data, the aircraft’s flight control primary computers (FCPCs) commanded the aircraft to pitch down.

and,

Although the FCPC algorithm for processing AOA data was generally very effective, it could not manage a scenario where there were multiple spikes in AOA from one ADIRU that were 1.2 seconds apart.
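The ATSB's description points to a filter that rejects a sudden jump in the data and “memorises” the last good value for a fixed window. As a much-simplified sketch only (this is not the actual Airbus FCPC algorithm; the 5-degree threshold and the sample data below are invented for illustration), the following Python shows how such a filter can be defeated by a second spike timed to arrive just as the 1.2-second window expires:

```python
# Much-simplified sketch of a "reject and hold" spike filter with a
# 1.2-second memorisation window, illustrating the class of weakness
# the ATSB describes. This is NOT the actual Airbus FCPC algorithm;
# the 5-degree threshold and the sample data are invented.

HOLD_S = 1.2        # memorisation period (per the ATSB report)
SPIKE_DEG = 5.0     # jump treated as spurious -- assumed threshold

def filtered_aoa(stream):
    """stream: iterable of (time_s, aoa_deg) samples from one source.
    Yields the (time, value) pairs a downstream consumer would use."""
    last_good = None
    hold_until = None
    for t, aoa in stream:
        spike = last_good is not None and abs(aoa - last_good) > SPIKE_DEG
        if spike and hold_until is None:
            hold_until = t + HOLD_S      # reject first spike, start hold
            yield t, last_good
        elif hold_until is not None and t < hold_until:
            yield t, last_good           # still inside window: hold value
        else:
            hold_until = None            # window expired: trust input again,
            last_good = aoa              # even if it is a second spike
            yield t, aoa

# Two 50-degree spikes about 1.2 s apart: the first is filtered out, but
# the second arrives just after the window expires and passes through.
samples = [(0.0, 2.1), (0.5, 2.2), (1.0, 50.9), (1.5, 2.2),
           (2.2, 50.9), (2.7, 2.1)]
for t, aoa in filtered_aoa(samples):
    print(f"t={t:.1f}s  AOA used: {aoa:.1f} deg")
```

In this toy version the second spurious value even becomes the filter's new reference, so the next genuine reading is itself rejected; a reminder of how hard it is to filter bad data without cross-checking against an independent source.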

While the article in The Canberra Times accurately reports these findings, it unfortunately then selectively quotes other report findings to reinforce the “ghost in the machine” notion.

For example, in The Canberra Times:

While finding a “failure mode” affected the air-data unit, investigators cannot pinpoint the exact mechanism that triggered the stream of incorrect data. They reason that the failure mode was “probably initiated by a single, rare type of trigger event”. The investigation pored over potential triggers such as a software bug or hardware fault but found them all unlikely.

and,

The inability to pinpoint the trigger leaves a crucial question unanswered. The air-data unit was taking good information in and pumping out extreme data. “They don’t know why it did that. And there is no result,” Sullivan says.

By contrast, the ATSB report does discuss another potential trigger:

The other potential triggering event was a single event effect (SEE) resulting from a high-energy atmospheric particle striking one of the integrated circuits within the CPU module. There was insufficient evidence available to determine if an SEE was involved, but the investigation identified SEE as an ongoing risk for airborne equipment.

SEE in avionics systems has been identified as an important issue:

Atmospheric radiation is an issue for avionics designers today, with every indication of becoming a greater issue in the future. Atmospheric radiation causes single event effects (SEE) in electronics, resulting in various system failure conditions, including hazardous misleading information.
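Because an SEE is typically a transient bit flip in a memory cell or register, one standard defence (described here generically, not as the A330 ADIRU's design) is redundancy with majority voting. A minimal Python sketch, purely illustrative since real avionics implement such voting in hardware:

```python
# Minimal sketch of one standard SEE mitigation: triple modular
# redundancy (TMR). Three copies of a value are kept; a single
# radiation-induced bit flip in one copy is outvoted bit by bit.
# Illustrative only -- real avionics do this in hardware.

def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote of three redundant words."""
    return (a & b) | (a & c) | (b & c)

value = 0b0110_1010
copies = [value, value, value]
copies[1] ^= 1 << 4           # simulate an SEE flipping bit 4 in one copy

recovered = tmr_vote(*copies)
assert recovered == value      # the corrupted copy is outvoted
print(f"corrupted: {copies[1]:08b}, recovered: {recovered:08b}")
```

A single flipped bit in any one copy is always outvoted; the scheme only fails if two copies are corrupted at the same bit position at the same time.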

The ATSB report also includes the following statements not referenced at all in The Canberra Times article:

The occurrence was the only known example where this design limitation [where the FCPC algorithm could not manage a scenario where there were multiple spikes in AOA] led to a pitch-down command in over 28 million flight hours on A330/A340 aircraft.

and,

There were only three known occasions of the failure mode in over 128 million hours of unit operation.

These statements show that, while there was certainly a serious problem that needed to be addressed, the chances of such an automated systems failure are extremely small. Aircraft manufacturer Airbus has also taken steps to address the ATSB report findings, with the aim of preventing further such incidents.

Rather than playing to people’s fears with an emotive “psycho automation” pitch, The Canberra Times could have made a much more productive contribution to the aircraft safety debate by looking at the incident in the context of an important issue: the fundamental design philosophy differences between Airbus and Boeing.

Writing in The Avionics Handbook, Kathy H. Abbott of the United States Federal Aviation Administration summarises a key difference between Airbus and Boeing [1]:

One of the significant differences between the design philosophies of the two manufacturers is in the area of envelope protection. Airbus’ philosophy has led to the implementation of what has been described as “hard” limits, where the pilot can provide whatever control inputs he or she desires, but the airplane will not exceed the flight envelope. In contrast, Boeing has “soft” limits, where the pilot will meet increasing resistance to control inputs that will take the airplane beyond the normal flight envelope, but can do so if he or she chooses. In either case, it is important for the pilot to understand what the design philosophy is for the airplane being flown.

The Airbus flight envelope creates safe limits beyond which the aircrew can’t go, and at first glance this seems an unambiguously good idea. However, the problem in the flight QF72 incident was that the automation failure created an erroneous and dangerous flight envelope that the aircrew then had to fight against to try to regain control of the aircraft. If flight QF72 had been a Boeing aircraft experiencing a similar systems failure, the aircrew would have been able to override the aircraft systems and immediately regain control.
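To make the hard/soft distinction concrete, here is a hedged sketch of the two philosophies. The limit value, the force threshold and the attenuation factor are all invented for illustration; neither manufacturer’s real control laws look like this:

```python
# Hedged sketch contrasting the two envelope-protection philosophies
# described above. All numbers are invented for illustration; neither
# manufacturer's actual control laws work like this.

MAX_PITCH_CMD = 15.0   # hypothetical envelope limit (degrees)

def hard_limit(pilot_cmd: float) -> float:
    """Airbus-style 'hard' limit: the command is clamped; the pilot
    cannot take the aircraft beyond the envelope."""
    return max(-MAX_PITCH_CMD, min(MAX_PITCH_CMD, pilot_cmd))

def soft_limit(pilot_cmd: float, pilot_force: float) -> float:
    """Boeing-style 'soft' limit: the pilot feels increasing resistance
    beyond the envelope but can push through with enough force."""
    if abs(pilot_cmd) <= MAX_PITCH_CMD:
        return pilot_cmd
    # Beyond the envelope, the excess is attenuated unless the pilot
    # applies deliberate extra force (threshold invented for the sketch).
    excess = abs(pilot_cmd) - MAX_PITCH_CMD
    scale = 1.0 if pilot_force > 50.0 else 0.3
    sign = 1.0 if pilot_cmd > 0 else -1.0
    return sign * (MAX_PITCH_CMD + excess * scale)

print(hard_limit(25.0))          # 15.0 -- envelope is absolute
print(soft_limit(25.0, 20.0))    # 18.0 -- resisted but not forbidden
print(soft_limit(25.0, 80.0))    # 25.0 -- pilot overrides the limit
```

Under the “hard” scheme, a faulty computer that mis-draws the envelope leaves the pilot with no way to command past it, which is essentially the situation the QF72 crew faced.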

Is the flight QF72 incident good grounds for Airbus to revisit or even abandon its flight envelope approach? In its article, The Canberra Times could have sought a wide range of expert opinion on this issue and potentially drawn conclusions or recommendations. There are already aircrew who favour Boeing because the pilot has the ability to go beyond the flight envelope when needed. For example, airline pilot blogger Captain Lim states: “I have flown the Boeing as well as the Airbus A300s and I prefer the philosophy of Boeing that gives the pilot ultimate control.”

The article in The Canberra Times already had excellent lead-ins to such commentary. For example, this statement in regard to the flight envelope:

The flight control computers – the brains of the plane – are supposed to keep the plane within an “operating envelope”: maximum altitude, maximum and minimum G-force, speed and so on. Yet against the pilots’ will, the computers are making commands that are imperilling all on board.

Sadly, another opportunity for serious analysis lost to emotive sensationalism.

Reference:

  1. Abbott, K. H. (2001). Human factors engineering and flight deck design. In The Avionics Handbook. CRC Press.


