The Challenge of Regulation in an Age of Advanced Automation

How much can a regulator understand of the autonomous safety systems that transcend traditional levels of automation, and what level of assurance can be provided?

The cockpit of a Royal Air Force A400M Atlas. MOD Crown Copyright.

The fatal crash in March 2019 of an Ethiopian Airlines Boeing 737 Max 8, with the loss of all on board, which followed the crash of a similar aircraft type belonging to Lion Air in Indonesia in October 2018, has once again raised the question of automation in aviation.

Arguably, it has been a recurring issue throughout the history of powered flight. However, the pilot/computer interface has entered a new era, and the latest digital technology has created a new degree of complexity. Operators face the challenge of understanding autonomous safety systems that transcend the traditional levels of automation.

It is a challenge for manufacturers, maintainers and pilots and, as highlighted by the two B737 Max accidents, it is becoming particularly acute for Regulators. How much does, and can, a Regulator understand of such advanced systems? And, therefore, what level of assurance can be provided?

Glass cockpits and flight management systems have been around for many years. However, they have evolved from merely presenting analogue information in a ‘clean’ and easily assimilated way to processing that information: manipulating it and relaying instructions directly to the aircraft’s control surfaces and engines.

Pilots and technicians are no longer being asked if they wish to effect a change; increasingly, they are being told. According to the US Air Line Pilots Association, one of the most common phrases heard on a modern flight deck is “What is it doing now?”

This development is becoming increasingly common in all modern aircraft, be it the B737 Max, the A400M or the F-35 Lightning. But to understand the evolution of automation we need to go back to first principles. The main role of a pilot is to manoeuvre the aircraft from A to B safely, efficiently and effectively. There are four component tasks for pilots to perform:

  • to operate, manage and monitor the engines and control systems
  • to avoid inadvertent encounters with other aircraft, unfriendly terrain or objects on the ground
  • to navigate efficiently to the destination / objective
  • to communicate with operators (the airline company in civil aviation and the supported formation in military aviation) and air traffic control

Manual flying, especially on long sorties or in high-workload or high-threat environments, is a fatigue-inducing activity. Automation was therefore introduced as a tool to help pilots fulfil their role and generate additional capacity. Early forms of automation, such as autopilots and auto-throttles, made it much easier to fly smoothly, efficiently and with precision. This did not alter the pilot’s role in securing the aircraft’s safe journey but, in recent years, the tasks required of a pilot have changed considerably.

With advanced technology it is possible to automate the aeroplane completely and to have a pilotless aircraft. This has not happened in civil aviation, primarily for socio-political reasons: passengers do not (yet) regard it as acceptable to place their lives entirely in the hands of an autonomous or remotely controlled machine. Up to this point, there has always been an acceptance of the requirement for a human to be physically present in the cockpit, at the pilot/computer interface, and critically with the ability to manually override the automated systems.

However, as the B737 accidents have highlighted, automation is now so fundamental to the operation of an aircraft that even a human in the cockpit cannot always override the automated system.

Automation has evolved through three stages:

  • control automation, which assists the pilot in control tasks or substitutes automatic control for the pilot’s manual manipulation
  • information automation, namely the displays and avionics devoted to navigation, environmental surveillance and digital communications
  • management automation, which encompasses everything that aids the pilot in managing the mission. It guides the aircraft, performs the necessary flight functions and furnishes the pilot with information on both the state of the aircraft and progress toward the sortie’s goal

Cockpit automation has had many positive effects on flight safety. It has increased technical safety and effectiveness and decreased costs. However, it can also have negative effects. In a highly automated aircraft, after long hours in the cockpit, pilots can lose situational awareness through hyper-vigilance: a narrowing of focus onto the mission management systems.

Computers and decision-support systems were put into the cockpit to decrease human error, but users can come to perceive these systems as wiser and superior. Pilots in need of rapid decision-making can place undue trust in the automation without considering all the data. This is known as automation bias.

It is also possible to categorise errors by their source. An automation error can occur through the failure of a system or a function. Incorrect programming of the system by the user, or a false input to the system, can also cause an automation error, as can inadequate training and unsuitable operating procedures. Significantly, errors made during the design and programming of the software itself can cause automation errors.

The Boeing philosophy is that automation is there to assist, not replace, the pilot: the pilot is the final authority in flight operations. New technology should be used only when the outcome is clear, provides an efficiency advantage and has no adverse effect on the human-machine interface. It will be interesting to see whether the reports into the B737 Max accidents indicate that Boeing adhered to this philosophy, even in an aircraft which, due to its design age, is still mostly flown by conventional means.

It is important to remember that this type of incident is not new. It was evident 25 years ago, when an Airbus A310 operated by Aeroflot crashed into a mountain range in Kemerovo Oblast, Russia in 1994, killing all on board. The human factors contribution to the accident was the captain’s error in letting his children sit at the controls during the flight.

However, it was the pre-programmed actions of the automated systems, and the pilots’ misunderstanding of what those systems were trying to do, that caused the accident. One of the children applied enough force to the control column for 30 seconds to contradict the autopilot input. This caused the flight computer to switch the aircraft’s ailerons to manual control while maintaining control over the other flight systems. A ‘silent’ indicator light came on to alert the pilots to this partial disengagement. The autopilot, which no longer controlled the ailerons, used its remaining controls to compensate, pitching the nose up and increasing thrust. As a result, the aircraft began to stall.

The autopilot, unable to cope, disengaged completely. A second, larger indicator light came on to alert the pilots to the complete disengagement, and this time they did notice it. At the same time, the autopilot’s display screen went blank. To initiate a recovery from the stall, an automatic system lowered the nose and put the aircraft into a nosedive. Despite the struggles of both pilots to save the aircraft, it crashed.

It was later concluded that, had the pilots simply let go of the control column and re-engaged the autopilot, it would have automatically taken action to prevent the stall, thus avoiding the accident. The pilots, who had previously flown Russian-designed aircraft with audible warning signals, apparently failed to notice the initial warning. But it was the difficulty of understanding the automated actions programmed into the system, and how to respond to them correctly, that was considered their single largest challenge.

The proliferation of control modes in the newest systems is a particular cause for concern, especially when aircrew are monitoring the performance of the management automation system. To remain engaged, some pilots have wanted to remove part of the automation and utilise the remaining features, but are unable to do so because ‘all or nothing’ is the only option. This is likely to be a computer programming issue, because programmers cannot know every intricacy of aircraft operation in the real world.

In the ‘all’ scenario, this issue is amplified by the pilot’s over-reliance upon automation, reflected in a reliance on the system to make the correct response automatically during abnormal operations.

Human Factors experts report that highly skilled personnel do not always do a good job of monitoring for events that have a very low probability of occurring. Those who maximise the use of technologically possible automation often miss this critical point. Automated systems are fallible, and pilots are the last line of defence capable of controlling, or in some cases preventing, a system failure. Pilots must be able to monitor the system effectively and understand how it plans to accomplish its task; an automated system can only be monitored effectively if it is predictable. Pilots must be trained in the normal operation of each automated unit, as well as its behaviour during any failure modes, so that they can make manual corrections or stop an automation failure from propagating.

Therefore, the computers and the human operators must monitor each other’s performance. For example, the computers should be able to send warning signals when a human operator has made an error and, equally, humans need to recognise and understand when the automation is making incorrect decisions. Simplifying the system will allow unintentional errors to be detected and the appropriate corrective action to be taken. But we may have reached the point where automation is so advanced that the pilot can no longer take full responsibility for all aspects of flight safety.

According to Clint Balog, a former test pilot who researches human performance, cognition and error at Embry-Riddle Aeronautical University, the industry is aware, and has been for decades, that the introduction of automation is a double-edged sword. He acknowledges that it has brought tremendous benefits to aviation safety, but there are also potential downsides. As the technology grows ever more complex, it is possible that we are reaching the point where all the realisable benefits have been achieved, and are entering an era in which that very same technology creates risk rather than reducing it.

The preliminary report from Indonesian investigators indicates that the Lion Air B737 Max 8 crashed because a faulty sensor erroneously reported that the aircraft was stalling. The false report triggered the automated system known as the Manoeuvring Characteristics Augmentation System (MCAS), which tried to point the aircraft’s nose down so that it could gain enough speed to fly safely. A warning light that would have alerted the crew to the faulty sensor before take-off was an optional extra, not fitted to Lion Air’s B737 Max 8 aircraft.

So not only are the technical complexities increasing, but alerting the pilot to a failure was not regarded as a ‘core’ safety function. This is a very disconcerting trend.

According to Balog, there are two intractable problems in existing aviation technology: “automation transparency” and “automation complacency”. With automation transparency, airlines and manufacturers struggle to educate pilots and engineers properly on the latest automated systems, a problem that worsens as the technology becomes more advanced. When they eventually catch up, engineers tend to accept the self-diagnoses of automated systems, and pilots are found to overuse them. This leads to complacency and a degradation of manual flying skills.

As the F-35 Lightning Force Commander recently observed,

as a pilot you sit there with an ‘iPad’ in front of you; that is how you access the aircraft’s capabilities.

But how much understanding is there of the app behind the display screen?

Human Factors remains the largest cause of aviation accidents, but ordinarily such factors are limited to one individual affecting the safety of one aircraft; the risk is largely contained to a single flight. There is now an added dimension: in the latest airliners and 5th generation combat aircraft, one software error can be uploaded to an entire fleet of aircraft in an instant.

The challenge for the Regulator is twofold: to understand the technical complexity of the latest automated systems; and then to assure that the aircraft is both safe to operate in all flight profiles and can be operated safely. Embedding Regulators with designers and manufacturers, as in the case of the Federal Aviation Administration (FAA) and Boeing, may be one option, but it runs the risk that the Regulator becomes too close and loses objectivity. At the same time, remaining largely remote from industry means there might simply be too great a gap in technological understanding. There are no easy answers; rather, there is significant scope for new thinking.

Acknowledgements:

ORLADY, H. & ORLADY, L. (1999). Human Factors in Multi-Crew Flight Operations. Brookfield: Ashgate Publishing Ltd.

AKAY, D., ERASLAN, E. & YOLDAŞ, C. (2007). The Impact of Automation and FMS in Flight Safety: Results of a Survey and an Experimental Study.

HAWKINS, A. J. (March 2019). Automation is a Double-Edged Sword. TheVerge.com.

Published 8 August 2019