Human Factors Approaches and Models in LOC-I Accident Analysis and Prevention: Flight Crew Resource Management Techniques as a Risk Mitigation Tool

Ivan Sikora, Benjamin L. Hari, Moritz Hanusch

University of West London, London W5 5RF, United Kingdom

Post Graduate MSc Air Safety Management, University of London, London EC1V 0HB, United Kingdom

Corresponding Author Email: ivan.sikora@uwl.ac.uk

Pages: 301-310 | DOI: https://doi.org/10.18280/ijsse.100301

Received: 30 April 2020 | Revised: 2 June 2020 | Accepted: 12 June 2020 | Available online: 30 June 2020

© 2020 IIETA. This article is published by IIETA and is licensed under the CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/).

Abstract: 

Increased cockpit automation on modern jet aircraft aims to reduce the risk of Undesired Aircraft State (UAS) instances such as Loss of Control In-Flight (LOC-I). Although LOC-I globally accounts for only 9% of all analysed accidents, IATA has reported that it was responsible for 58% of all accident fatalities in 2017. The focus of this paper is to answer whether Threat and Error Management (TEM) and Crew Resource Management (CRM) techniques are an efficient risk management tool when facing a LOC-I threat. Three LOC-I final aircraft accident reports were analysed to understand the structure of Human Factors (HF) during these flights. Methods from the HF field such as the Generic Error Modelling System (GEMS) and the Skill-, Rule- and Knowledge-based (SRK) error approach provided invaluable insights for identifying potential findings. A holistic investigation of cognitive structures in flight path management helped to visualise latent conditions and cognitively demanding tasks during LOC-I in routine operations. Given the limited number of cases considered, this paper should be read as an overview of LOC-I accident analysis. It shows that leadership and teamwork, as essential aspects of CRM training, can serve as key strategies to mitigate HF problems and LOC-I risks.

Keywords: 

automation, human factors, manual flying skills, crew resource management, threat and error management

1. Introduction

In order to answer the research question of this paper, namely whether Threat and Error Management (TEM) and Crew Resource Management (CRM) techniques are an efficient risk management tool on the flight deck when facing a LOC-I threat, we sought potential human factors findings in three LOC-I final aircraft accident reports [1-3]. These reports were chosen based on discussions amongst the three authors under the lead of Dr Ivan Sikora. Mr Moritz Hanusch and Mr Benjamin Hari are both active airline pilots for a major European airline, additionally acting as CRM instructor and Human Factors Specialist, respectively.

Following intensive brainstorming within our team and discussions with subject matter experts from the aviation industry, we chose the Generic Error Modelling System (GEMS), which extends the Skill-, Rule- and Knowledge-based (SRK) error approach defined by Rasmussen [4]. As these approaches appeared most consistent with the findings in the mentioned LOC-I accident reports, they were deemed suitable to form the baseline of our analysis.

In the three analysed LOC-I accidents, the aircraft involved were certified according to the EASA Certification Specification for Large Aeroplanes (CS-25) and met the applicable aerodynamic stability requirements [5]. However, on two occasions, improper flight control inputs by the flight crew following an uncommanded autopilot disconnection led to a complex aeroplane upset (UAS), LOC-I and a fatal hull loss [1, 3]. In contrast, thanks to the CRM and TEM techniques displayed by the flight deck team during the uncontained engine failure on Qantas Flight 32, the crew performed an emergency landing and all occupants survived without injuries [2].

We aim to analyse the structure of human factors during these flights and whether CRM techniques are an effective risk mitigation tool to manage such complex situations.

The main challenge was to narrow down the large amount of information obtained from the aforementioned accident reports, official publications and human factors research. Therefore, this paper should be considered an overview of LOC-I accident analysis rather than an in-depth human factors study of each LOC-I accident. While the recent Boeing 737 MAX accidents represented LOC-I cases as well, we considered them to be beyond the scope of this paper, mainly due to the ongoing investigations as well as the distinct complexity of these cases. However, with the final accident investigation report on the Lion Air case [6] as well as the preliminary accident investigation report on the Ethiopian Airlines case [7] available, an analysis of these cases along CRM, TEM and Safety Management aspects would represent a worthwhile area of further research.

Following this introduction, the rest of Section 1 introduces the term Loss of Control In-Flight (1.1), presents the magnitude of the challenge in recent years and highlights industry (1.2) and academic research initiatives (1.3). The next two sections (Section 2 and Section 3) focus on CRM techniques and current human factors research to analyse how these risk management tools can support the flight crew in maintaining situational awareness on a highly automated flight deck. Section 4 extends the application of CRM to the management of LOC-I situations, while Section 5 further explains the task demand on flight crews during such events. The penultimate Section 6 discusses ongoing challenges regarding LOC-I, flight crews and their skills. Conclusions are addressed in the final section (Section 7).

1.1 LOC-I accident analysis

In its dedicated “Loss of Control In-Flight Accident Analysis Report 2010-2014” [8], the International Air Transport Association (IATA) has defined LOC-I as follows: “LOC-I refers to accidents in which the flight crew was unable to maintain control of the aircraft in flight, resulting in an unrecoverable deviation from the intended flight path.” LOC-I situations often lead to aerodynamic stalls; they represent “[…] one of the most complex accident categories, involving numerous contributing factors that act individually or, more often, in combination” [8]. Unexpected degradations in automation often leave flight crews overwhelmed by the situation as they struggle with system failure diagnosis while being forced to control the aircraft manually. According to research performed by Wilborn and Foster [9], most LOC-I situations develop in less than 10 seconds. In line with these observations, a 2016 accident involving a Swedish Canadair Regional Jet [3] can be considered exemplary: only five seconds after the first system anomalies were recorded, erroneous pilot inputs had led to negative g-loads. In turn, the resulting upset condition massively obstructed subsequent recovery attempts by the pilots.

1.2 Aviation industry initiatives

Following the above-mentioned high-profile LOC-I accidents (e.g. [1]), the aviation industry has made a considerable number of safety-driven efforts over the past decade to manage the LOC-I risk by improving flight crews’ competencies in manual flight path control. The International Civil Aviation Organization (ICAO) has identified LOC-I as one of three high-risk accident categories in its Safety Reports in recent years [10]. In its Safety Report 2017, IATA pointed out that while LOC-I accidents accounted for only 9% of all analysed accidents, they were responsible for 58% of all accident fatalities in 2017 [11]. Moreover, over a timeframe of ten years, the European Aviation Safety Agency (EASA) has identified “aircraft upset or loss of control” as the “[…] most common accident outcome for fatal accidents in CAT aeroplanes operations, accounting for 75% of them” [12]. A risk assessment performed by Hari in 2015 indicates that the probability of a fatal LOC-I accident is as low as 1×10⁻⁵%, or one fatal accident out of five million flights. This confirms the nature of the risk: very remote, but with high severity. Nevertheless, the very low probability does not justify accepting the LOC-I risk [13]. In its latest Safety Report, IATA [11] confirmed that the entire industry has to strive continuously for mitigation of the LOC-I risk, as it sits at the top end of both the accident frequency and mortality risk statistics (cf. Figure 1).

In its recent Safety Reports, IATA detailed the top contributing factors leading to LOC-I aircraft accidents (cf. [11]). Flight crew errors relating to manual handling and flight controls were a significant contributing factor in 35% of all LOC-I accidents in 2017, while other top contributing factors were SOP adherence and cross-verification, and pilot-to-pilot communication. In its extensive report on flight path management issues, the Flight Deck Automation Working Group [14] identified several vulnerabilities in pilot skills for manual flight operations. IATA [8] added more specific insights regarding LOC-I accidents: “Human performance deficiencies, including improper, inadequate or absent training, automation and flight mode confusion, distraction, the ‘startle’ factor and loss of situational awareness frequently compounded the initial upset and precluded an effective recovery until it was too late.” IATA [11] also identified flight crew errors relating to manual handling and flight controls as a primary contributing factor in more than 60% of all accidents in 2017 (cf. Figure 2).

Analysing the rationale in its Safety Report 2015, IATA states that “the generally high reliability and usefulness of automated systems pose the question of whether the high amount of flight hours spent in fully automated flight is responsible for pilots being increasingly reluctant to revert to manual flying skills when needed”. Furthermore, IATA concludes: “While aircraft are highly automated, the automation is not designed to recover an aircraft from all unusual attitudes. Therefore, flight crews must still be capable of manually operating the aircraft, especially in edge-of-the-envelope situations.” [15]. As a near-term mitigation strategy, Jacobsen [16] has urged the industry to perform LOC training and to adopt better Standard Operating Procedures (SOP). IATA [17] added: “For the moment it seems that a well-trained pilot is still the best gadget on board an aircraft, to cope with the full range of possible situations whether foreseen or not. So, the question that needs to be asked is, regardless of the technology on your aircraft, what do you do to make your pilots more competent in upset prevention and recovery?” Since then, Upset Prevention and Recovery Training (UPRT) [18] has become an industry standard, including mandatory simulator and CRM training elements for flight crews.

Figure 1. Accident category frequency and fatality risk (2013-2017) [11]

Figure 2. Accident primary contributing factors distribution – flight crew errors [11]

Figure 3. Relationship between TEM, the core competencies and UPRT [17]

1.3 Academic research

Following the aviation industry's initiatives to address the risk of LOC-I, the Netherlands Aerospace Centre (NLR) published its report on manual flight path management of modern jet transport aircraft in January 2015 as part of the EU-funded Man4Gen project. The contributors agreed that, because of the increased safety requirements and efficiency of commercial air transport operations, pilots’ flight path management tasks have transitioned from manually flying the aircraft by means of direct flight control inputs to programming a complex automation system and monitoring cockpit parameters during most phases of the flight [15, 19].

The results and conclusions obtained in the Man4Gen research project were reflected in Safety Recommendation NETH-2014-005 (DSB), which was included in the Annual Safety Recommendations Review 2014 issued by the European Aviation Safety Agency. The recommendation is to “review the applicable regulations on initial and recurrent flight crew training to assess whether they adequately address the potential degradation of situational awareness (basic pilot skills) and flight path management due to increased reliance on aircraft automation by flight crews.” [20].

EASA acknowledged the recommendation in their aforementioned Safety Review 2014:

“[…] the trend towards increased automation in aircraft design calls for a review of the rules to consider training on the potential degradation of situational awareness and flight path management due to increased reliance on automation by flight crews.” [20].

The next sections focus on CRM techniques and current human factors research to analyse how these risk management tools can support the flight crew in maintaining situational awareness on a highly automated flight deck.

2. CRM Techniques to Manage the LOC-I Risk

2.1 Threat and Error Management (TEM)

Despite the good safety statistics of recent years, efforts to maintain this relatively low accident rate must continue. Based on the information given in Figure 3, correctly applied TEM techniques are the last line of defence when correcting an Undesired Aircraft State (UAS) following an uncommanded degradation of the automated flight control system. These competencies are developed during simulator sessions and CRM courses, enabling flight crews to build the skills needed to act as effective risk managers on the flight deck and thus as a reliable line of defence in the LOC-I threat cascade.

The Guidance Material provided by IATA states that TEM is the best countermeasure to manage a UAS. Because the number of possible causes and conditions of an aircraft upset is practically infinite, TEM gives the pilot the right set of competencies to manage an unforeseen event [18].

2.2 TEM and Evidence-Based Training (EBT)

It is of paramount importance to continuously implement results from the latest human factors research into Evidence-Based Training (EBT) for airline pilots, providing flight crews with a TEM tool to manage the complexity of LOC-I situations. Flight crews are the last line of defence when challenging or unexpected failures occur on the flight deck. During all phases of flight, such failures can degrade an aircraft’s automation and require the flight crew to take over manual aircraft control to prevent a UAS. Occurrences of this level of complexity are extremely rare but may have fatal outcomes if the crew fails to correct the UAS [18, 21].

2.3 The limits of expertise in TEM

The deep and complex structure in task management on the flight deck can cause new threats [21]. As a result, highly skilled professionals can find themselves at the limit of their expertise, especially as interactions between cockpit automation and pilots’ behaviour are dynamic during situations of startle and surprise.

In the next section, a comprehensive human factors perspective on the factors influencing pilots’ situational awareness in such situations aims to provide the reader with a deeper understanding of the complexity of LOC-I.

3. Human Factors Approaches and Models

3.1 Flight path management in routine flight operation

The flight deck environment of a modern jet transport aircraft features a fly-by-wire flight control system and automated flight envelope protection to keep the aircraft within safe aerodynamic limits during all flight phases. Notable modern jet aircraft currently in service include, for example, the Airbus A350 and the Boeing 777 and 787.

The human-machine system consists of two main elements which interact with each other throughout all phases of flight [22]:

Human

  • A team composed of qualified and licensed flight crew members.
  • Members are performing complex decision making and problem-solving.
  • Flight risk assessment and management is performed by applying TEM.

Machine

  • Combination of an automated system consisting of autopilot and autothrottle, which controls the flight path.
  • Automation processes flight information from various sources.
  • Inputs are processed and translated into physical manipulation of the aircraft flight path around its three axes.

The flight crew can increase the level of automation from manual flight to fully automated flight path management around the three axes, freeing up a surplus of cognitive resources. Modern flight deck automation is so reliable that the routine work of airline pilots has shifted towards higher-order cognition such as complex decision making, problem solving and managing threats and errors. The automated system, which mainly consists of the autopilot and autothrottle, controls the flight path to relieve the cognitive resources of its human operator, the flight crew. It is responsible for processing flight information, such as vertical or lateral changes in the flight trajectory, and for the physical manipulation of the aircraft around its three axes with the aid of ailerons, elevator and rudder. The pilots manipulate the flight path by selecting, for example, a vertical speed and heading on the flight guidance panel or a new waypoint on the Multifunction Control Display Unit (MCDU). A manual input with the control yoke to direct the aircraft to the desired flight path is not required in the automated mode; only the correct execution by the automation system must be monitored and acknowledged by the flight crew [15, 22].
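As an illustrative sketch only (the class and method names, such as AutoflightSystem and select, are ours and do not refer to any real avionics interface), the following Python fragment shows the supervisory arrangement described above: in automated flight the crew's input is a target selection, the automation closes the control loop, and the crew's remaining task is to monitor the result.

    from dataclasses import dataclass, field

    @dataclass
    class AutoflightSystem:
        """Schematic autopilot/autothrottle: stores crew-selected targets and
        turns them into corrective demands (control laws deliberately omitted)."""
        targets: dict = field(default_factory=dict)

        def select(self, parameter: str, value: float) -> None:
            # Crew input in automated flight: a target selection, not a yoke input.
            self.targets[parameter] = value

        def control_step(self, current_state: dict) -> dict:
            # Placeholder proportional correction towards each selected target.
            return {p: round(0.1 * (t - current_state.get(p, 0.0)), 2)
                    for p, t in self.targets.items()}

    # Routine operation: the crew selects, the automation flies, the crew monitors.
    afs = AutoflightSystem()
    afs.select("heading_deg", 250)            # flight guidance panel selection
    afs.select("vertical_speed_fpm", -1000)   # e.g. start of a descent
    demands = afs.control_step({"heading_deg": 240, "vertical_speed_fpm": 0})
    print(demands)   # crew task: verify that execution matches the selections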

3.2 Manual flight path management following uncommanded automation disconnection

In the case of an uncommanded disconnection of the automation, as happened in the three LOC-I accidents, the automation level reverts to manual, and the cognitive work distribution between the pilots and the machine (automation) differs from the routine flight operation model:

Human

  • Physical motor skills control primary flight controls (aileron, elevator and rudder) using control yoke and rudder pedals.
  • Cognitive skills assess current aircraft condition, predict its future state and manage the flight path to satisfy navigational requirements.

Machine

  • Translates pilot’s manual flight path management inputs with the aid of control yoke and rudder pedals into physical manipulation of the aircraft around its three axes.

At all times, the flight crew retains responsibility for processing flight information and managing the flight path. Their physical motor skills for manual flight must be able to keep the aircraft’s flight path within safe aircraft and terrain limits. In addition, cognitive skills, for instance assessing the current aircraft condition, predicting its future state and managing the flight path to satisfy navigational and terrain clearance requirements, have to be employed to assure the safety of the aircraft and its occupants [22].
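To contrast the two work-distribution models of Sections 3.1 and 3.2 side by side, the sketch below tabulates an illustrative task allocation per automation state; the labels paraphrase the lists above, and the names (AutomationState, TASK_ALLOCATION) are ours rather than terms from the cited sources.

    from enum import Enum

    class AutomationState(Enum):
        AUTOMATED = "autopilot/autothrottle engaged"
        MANUAL = "uncommanded disconnection, manual reversion"

    # Illustrative task allocation, paraphrasing the description in [22].
    TASK_ALLOCATION = {
        AutomationState.AUTOMATED: {
            "crew": ["select flight path targets (flight guidance panel, MCDU)",
                     "monitor and acknowledge correct execution",
                     "decision making, problem solving, TEM"],
            "machine": ["process flight information",
                        "close the control loop around all three axes"],
        },
        AutomationState.MANUAL: {
            "crew": ["manual control via control yoke and rudder pedals",
                     "assess the aircraft state and predict its evolution",
                     "keep the flight path within aircraft and terrain limits"],
            "machine": ["translate manual inputs into control surface deflections"],
        },
    }

    def allocation_summary(state: AutomationState) -> str:
        """Return a short summary of who does what in the given state."""
        alloc = TASK_ALLOCATION[state]
        return (state.value + "\n  crew:    " + "; ".join(alloc["crew"])
                + "\n  machine: " + "; ".join(alloc["machine"]))

    print(allocation_summary(AutomationState.MANUAL))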

4. CRM Techniques as a Tool to Manage LOC-I

4.1 Cognitive demand of flight crews in transient flight phases

The cognitive demand described in the previous section is not uniform across the flight profile but depends significantly on the current flight phase, the transient state of the aircraft and the environmental weather conditions. During straight and level flight, little cognitive effort in flight path monitoring is required when the aircraft is appropriately trimmed around its three axes. During departure, climb, approach and landing, however, many more vertical and lateral transitions in manual flight path management are needed to comply with terrain and navigational requirements for Standard Instrument Departures (SID), Standard Arrival Routes (STAR) and Instrument Landing System (ILS) approaches. Monitoring and predicting the flight path and energy state of the aircraft during those phases demands far more cognitive resources from the flight crew [22]. Flying under degraded flight control protections after experiencing an autoflight system failure may place further demands on flight crews. Therefore, IATA recommends focussing on training scenarios with degraded modes during simulator sessions [15].

Airbus’ Statistical Analysis of Commercial Aviation Accidents [23] confirms Ebbatson’s argument: it indicates that the majority of all aircraft accidents occurred during the approach and landing phases (cf. Figure 4). We therefore conclude that errors are more likely during those phases of flight due to the increased workload, possible shortcomings in manual flying skill retention and the inherent complexity of managing the aircraft’s flight path around its three axes. When error analysis is required following an automation degradation, the flight path monitoring tasks allocated to the flight crew may exceed their cognitive capabilities, individually and as a team [4, 24-26]. Dekker [27] described that issue as follows: “In real conditions under which people perform work, cognitive and resource limitations, as well as uncertainty and the sheer dynamics of unfolding situations all severely constrain the choices open to them.”

The unexpected requirement to manually control the aircraft after an uncommanded disconnection of the autopilot can thus exceed pilots’ capabilities to safely manage their aircraft’s flight path. In the 2009 accident of a Colgan Air aircraft in Buffalo, the sudden stick shaker activation and autopilot disconnection contributed to a surprise and startle effect that adversely affected the pilot flying’s response [28].

Figure 4. Accident by flight phase as a percentage of all Accidents 1998-2017 [23]

Figure 5. The relationship between conscious and automatic behaviour [29], adapted from [4] and [26]

5. GEMS Modelling of Manual Flight Path Management

5.1 Skill, rule and knowledge task management classification

The Generic Error Modelling System (GEMS) is used to describe the deep structure of the limits of task management when the dynamics of a LOC-I demand exceptional cognitive resources from a team of highly skilled flight crew members. Various models are available to analyse human error at this level, but GEMS offers the most holistic approach to facilitate the understanding of flight path management challenges on modern jet aircraft. To facilitate understanding of the task demand on flight crews during a LOC-I, we use the SRK approach defined by Rasmussen [4] and Reason [26]. The degree of conscious control exercised by the flight crew over their activities is described by the SRK information processing model in Figure 5.
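To make the SRK distinction concrete, the following simplified sketch (our own schematic of the switching logic summarised in Figure 5, with hypothetical task attributes) assigns a control level to a flight deck task based on whether it is well rehearsed or covered by a stored rule.

    from dataclasses import dataclass

    @dataclass
    class FlightDeckTask:
        description: str
        well_rehearsed: bool         # practised to the point of automaticity
        stored_rule_available: bool  # a trained procedure (SOP/QRH) applies

    def srk_level(task: FlightDeckTask) -> str:
        """Simplified SRK switching after Rasmussen [4] and Reason [26]: attention
        escalates to the Knowledge-Based level only when neither automatic skills
        nor stored rules cover the situation."""
        if task.well_rehearsed:
            return "Skill-based: largely subconscious execution"
        if task.stored_rule_available:
            return "Rule-based: apply a learned if-then procedure"
        return "Knowledge-based: improvise a new mental model under high workload"

    print(srk_level(FlightDeckTask("routine instrument scan", True, False)))
    print(srk_level(FlightDeckTask("engine failure after V1", False, True)))
    print(srk_level(FlightDeckTask("complex upset, conflicting cues", False, False)))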

5.2 Multi-pilot operation in knowledge-based behaviour

During multi-pilot operation in commercial air transport, the designated Pilot Flying (PF) is responsible for controlling the aircraft’s flight path (either manually or using the autoflight system). The primary role of the Pilot Monitoring (PM) is to monitor the PF’s activities and to communicate with Air Traffic Control (ATC). All pilots on the flight deck are expected to demonstrate a high level of monitoring and cross-checking skills in order to obtain and maintain an accurate shared mental model of their present situation, especially concerning the aircraft’s flight path and energy state.

Figure 6. The dynamic model of GEMS (Adapted from [26], Highlights added by the Authors)

Based on the dynamic model of GEMS, depicted in Figure 6 above, it can be seen that TEM at the Knowledge-Based level requires a reversion to new mental models created in dynamic situations such as a UAS. Although LOC-I situations are infrequent, the flight crew is then faced with a highly complex scenario for which no stored mental models are available anymore. Manual flying tasks have to be improvised because previously learned routine rules cannot be retrieved at the Knowledge-Based level. Additionally, human sensory channels are stimulated unevenly on a modern flight deck. While the eyes and ears form the primary reception channels for the detection of the flight attitude, a stressful and emotional situation on the flight deck may cause the auditory channel to be the first to reach its perceptual limit; it can therefore easily become unreliable. The human perception system has a natural tendency to favour visual over auditory perception when the information presented is contradictory and conflicting. An aural stall warning during a dynamic LOC-I may therefore not be consciously perceived by the pilots [1, 30].

5.3 Leadership and teamwork

In those rare situations, at the limit of the expertise of skilled professionals, exceptional leadership abilities and good teamwork are paramount to analyse the complexity of the developing UAS and to apply robust manual flying skills to manage the LOC-I risk. A notable example of a successfully managed situation that could otherwise have developed into a LOC-I scenario was Qantas Flight 32 in 2010. Following an uncontained engine failure and a subsequent degradation of flight deck automation after takeoff, the flight crew was faced with an extremely dynamic and rare cascade of failure messages but managed the LOC-I threats through exceptional TEM at the Knowledge-Based level (more details are provided by the Australian Transport Safety Bureau [2]). In a 2014 incident involving a Lufthansa Airbus A321 [31], the flight crew was faced with an uncommanded descent due to malfunctions in two out of three redundant systems. Although no procedure for this kind of failure was provided in the Quick Reference Handbook (QRH), the flight crew successfully regained control of their aircraft.

Monitoring is triggered by the need to satisfy a decision requirement, defined in the context of this paper as monitoring the flight path and thus preventing a LOC-I accident. The execution of this task belongs to the group of monitoring goals and includes cross-monitoring the other pilot’s actions, an accurate assessment of the present situation, and monitoring the energy state of the aircraft. However, in the words of Warm et al. [32], such “[…] vigilance tasks are exacting, capacity-draining assignments that are associated with considerable levels of stress in which the quality of performance efficiency wanes over time.”

To achieve the monitoring goal, the pilot has to activate the relevant monitoring tasks, which reside within the brain’s long-term memory. When these tasks are well-rehearsed and familiar to the pilot, the responses are carried out subconsciously at the Skill-Based level [33].

5.4 Situational awareness

The attitude indicator on the flight deck is in the focus of selective attention and stimulates the respective senses (visual and auditory) via the sensory stores. With the knowledge stored in long-term memory (e.g. basic instrument scanning), the brain perceives the sensory input within short-term memory and interprets its context. In the next step, the mental model associated with the system knowledge of the aircraft is compared against the expected outcomes held in working memory. By comparing the mental model with the actual mental picture, situation awareness on the flight deck is updated, which in turn is the foundation for decisions [13, 33]. In a fully developed LOC-I or UAS, however, the pilot has no reference available in long-term memory: a sudden reversion to Knowledge-Based processing is required at the very moment a Skill-Based reaction would be called for [29].
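Purely as a schematic illustration (the stage and variable names are ours, and the dictionaries merely stand in for the memory stores described in [13, 33]), the situation awareness update above can be pictured as a small pipeline from sensed cues to detected mismatches that drive a decision.

    def update_situation_awareness(sensed_cues, long_term_knowledge, expected_state):
        """Simplified SA loop: perceive only cues for which stored knowledge exists,
        then compare them in working memory against the expected (mental-model)
        state; mismatches are what update the crew's mental picture."""
        perceived = {cue: value for cue, value in sensed_cues.items()
                     if cue in long_term_knowledge}
        mismatches = {cue: {"expected": expected_state.get(cue), "actual": value}
                      for cue, value in perceived.items()
                      if expected_state.get(cue) != value}
        return mismatches  # an empty dict means the mental model still holds

    # Example: the perceived pitch no longer matches the expected mental model,
    # while the aural stall warning is never perceived (no stored reference).
    print(update_situation_awareness(
        sensed_cues={"pitch_deg": -15, "aural_stall_warning": True},
        long_term_knowledge={"pitch_deg": "basic instrument scan"},
        expected_state={"pitch_deg": 2},
    ))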

5.5 Decision-making models

Even the most skilled and experienced flight crew may be prone to ignore feedback information that does not support their expectations during a complex automation degradation, which opens opportunities for further escalation of a UAS [26, 29], as demonstrated by the sequence of events that led to the Canadair accident in Northern Sweden [3]. When the flight crew experiences complex failures in the automation system during phases of flight that already require high cognitive involvement, task management within TEM reverts to the Knowledge-Based level if no learned rules to manage the LOC-I threat can be found. At this level of cognitive information processing, the pilots may be required to utilise their technical and operational knowledge. To assist pilots in deciding correctly in such a rare anomaly, airlines train their crews in decision making, which represents a core element of Crew Resource Management (CRM) training [34]. Apart from other decision-making models (such as FORDEC, DODAR, DECIDE, or CLEAR), SPORDEC is a tool several airline operators use to enable their pilots to respond to any abnormal situation in a structured way [13, 35]; see also Table 1.

The first priority before commencing any failure management is aircraft control. Accordingly, EASA [36] clearly stated that “[…] a core philosophy of ‘fly the aeroplane’ should permeate the automation policy prepared by air operators.” Depending on the flight phase, the cognitive involvement of the PF may differ and depends on numerous external and internal factors. As mentioned earlier, LOC-I is a highly complex scenario that places high cognitive demands on a flight crew due to its rare nature. It is therefore vital to provide a simple mnemonic, in which pilots only need to remember the first letters, to help them manage an unforeseen anomaly in the cockpit in stressful situations. Preliminary actions (e.g. memory items) shall be executed to maintain essential control of the aircraft. This also includes collecting all data and facts by acknowledging and reading the messages displayed on the Electronic Centralized Aircraft Monitor (ECAM) or Engine Indication and Crew Alerting System (EICAS). A Quick Reference Handbook (QRH) has to be used if failures are not displayed or if the aircraft type does not offer electronic checklists.

Table 1. SPORDEC decision-making model [35]

S – Situation Catch
P – Preliminary Actions
O – Options
R – Rating
D – Decision
E – Execution
C – Control

After all options have been discussed and assessed with their advantages and disadvantages, the flight crew makes a decision, executes it and finally performs on-going quality control of that decision. The flight crew continuously reassesses the situation; if the decision has to be questioned due to new facts or an unforeseen development of the situation, the SPORDEC process has to be initiated from the start [37].
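As an illustration only (the function and parameter names are ours, not taken from [35] or [37], and the callables merely stand in for crew activities), the SPORDEC cycle, including its re-initiation when the control step reveals new facts, can be sketched as a simple loop.

    def spordec(situation, get_options, rate, execute, reassess, max_cycles=5):
        """Schematic SPORDEC cycle: Situation catch, Preliminary actions, Options,
        Rating, Decision, Execution, Control [35, 37]."""
        for _ in range(max_cycles):
            facts = dict(situation)                    # S: catch the situation
            facts["aircraft_under_control"] = True     # P: preliminary actions first
            options = get_options(facts)               # O: collect available options
            decision = max(options, key=rate)          # R + D: rate options, decide
            execute(decision)                          # E: execute the decision
            situation = reassess(facts, decision)      # C: on-going quality control
            if not situation.get("new_facts", False):
                return decision                        # decision remains valid
            # New facts or an unforeseen development: restart SPORDEC from the top.
        raise RuntimeError("no stable decision reached within the allowed cycles")

    # Minimal usage example with stubbed crew activities:
    chosen = spordec(
        situation={"failure": "hydraulic low pressure"},
        get_options=lambda facts: ["divert to nearest suitable airport",
                                   "continue to destination"],
        rate=lambda option: 1.0 if "nearest" in option else 0.2,
        execute=lambda option: print("executing:", option),
        reassess=lambda facts, decision: {"new_facts": False},
    )
    print("final decision:", chosen)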

6. Ongoing Challenges Regarding the LOC-I Risk

6.1 Limits of expertise in monitoring a reliable cockpit automation system

Complex LOC-I situations which require flight crews to perform error management at the Knowledge-Based level are rare nowadays, mainly due to the high reliability of cockpit automation and the strict application of CRM and TEM within flight deck work routines. To understand how the cognitive information processing of pilots is altered when monitoring a reliable automation system, Steven Casner, a research psychologist at NASA’s Ames Research Center in Moffett Field, California, performed a study with 16 airline pilots on a full flight simulator. The experiment examined cognitive information processing and manual flying skill retention following unexpected automation failures. Casner and his team concluded that basic instrument scanning and aircraft control skills are reasonably well retained when cockpit automation is used. However, in line with Ebbatson et al. [38], they observed that these skills could be subject to a certain degree of “rustiness” if not well maintained. They concluded that the quality and accuracy of the retained cognitive resources for manual flight path control depend on the degree to which flight crews remain actively engaged in the human-machine loop of supervising the cockpit automation. Following prolonged monitoring of a highly reliable automation system, ‘mind-wandering’ was observed. Additional practice during simulator sessions or actual flights may help to overcome the erosion of those vital cognitive skills. Further studies to analyse the effect of active monitoring on procedural manual flying skill retention are also recommended [21, 25].

6.2 The impact of deficient manual flying skills

In their 2016 study “Flying the Needles: Flight Deck Automation Erodes Fine-Motor Flying Skills Among Airline Pilots”, Haslbeck and Hoermann stated that “hard and continuous drill is indispensable for pilots to acquire and maintain the adequate touch and feel essential to manually control the aircraft in any conceivable manoeuvre” [39]. However, in times of sophisticated automation, pilots’ opportunities to maintain their manual flying proficiency can be considerably limited. Many airlines even prohibit disengaging the Flight Director crossbars during manual flight, thus depriving flight crews of so-called “raw data” flying opportunities which allow them to train “[…] all the complex mental calculations of pitch, power, and airspeed control required for flight path management of both vertical and lateral navigation” [25, 40].

The previously mentioned study by Ebbatson [22] revealed that the individual manual flying ability of pilots varied considerably. An explanation could be that “the level of recent exposure to manual flight may be dependent upon the pilot’s attitude to risk, how they perceive the benefits of manual flight and whether they take or seek opportunities to disengage the automatics” [22]. Ebbatson et al. [38] were able to determine several correlations between the recency of flying experience and pilots’ manual flying accuracy. Both studies controlled for factors that might confound a coherent analysis of pilot control (such as transport delays, inertia, or the aerodynamic stability of the aircraft) by measuring “inner-loop” (the pilot’s control intention) as well as “outer-loop” (the aircraft’s trajectory) parameters. According to Ebbatson [22], “pilots with more manual handling experience generally use less control input power to achieve equal levels of tracking performance.” This clearly supports regular manual flying practice, as “proficient pilots need less effort to control their aircraft and to keep it within prescribed parameters” [41].

In the study performed by Haslbeck and Hoermann [39], 126 Airbus A320 and A340 pilots took part in a “raw data” simulator assessment intended to reveal differences related to the time elapsed since initial flight training (comparing captains and first officers) and differences related to flight practice (comparing long-haul and short-haul pilots). The authors concluded that a lack of practice (which is clearly correlated with long-haul flying) expedites the erosion of manual flying skills. This effect was confirmed by a study performed by Hanusch [41], in which around 1,500 pilots took part in a comprehensive survey on their actual manual flying in line operations as well as the rationale behind it. Long-haul pilots and pilots working for operators with rather restrictive procedural frameworks or rigid company cultures were among those who most frequently criticised having insufficient opportunities to practise their manual flying skills during line operations. In its extensive report “Operational Use of Flight Path Management Systems”, the Flight Deck Automation Working Group [14] warned that “(…) pilots who have not yet developed extensive manual flying skills may not get opportunities to practice and develop those skills, due to an increased emphasis on the use of automated systems.”

6.3 Counteracting measures

Ferris et al. [42] described a possible “deskilling” of pilots with the following words:

“[…] Over time, continued and extensive use of automation can lead to overreliance on technological assistance and the loss of psychomotor and cognitive skills required for manual flight […]. Deskilling can lead to a ‘vicious cycle’ of performance degradation when pilots’ realisation of their own skill loss leads to an even heavier reliance on automation.”

To counteract this issue, Transport Canada [43] stated that “pilots need to maintain manual flying skills to a high degree of proficiency and must develop confidence in their ability to do so.” In line with recommendations by Jacobsen [16], the Flight Deck Automation Working Group [14] and the Department of Transport [44], Haslbeck and Hoermann [39] added: “More manual flight practice could also be derived by changing companies’ automation policies to encourage pilots to fly manually if the situation permits.”

In the words of Bennett (2012, as cited by the International Air Transport Association [17]): “Malfunctions are to be expected in aircraft, by virtue of their interactive complexity, tight coupling and risk-and-error-prone operating environment. In the risk-laden world of aviation, the pilot is the last line of defence.”

Those pilots who have sufficient opportunities during training and line operations to build, maintain and improve their manual flying skills have a far more robust foundation when faced with complex scenarios such as an impending LOC-I, as they can revert to well-developed skills at the Rule-Based level.

7. Conclusions

This paper demonstrated that the high reliability of cockpit automation influences basic manual flying skills because pilots are only rarely required to exercise them. We reviewed recent aviation safety statistics and learned that the high complexity of LOC-I places considerable demands on the cognitive information processing of airline pilots, even when TEM standards and established decision-making models are applied. CRM techniques are an efficient tool for managing complex UAS situations on the flight deck. The academic literature review revealed that the increasing level and amount of automation in modern flight decks have simplified flight path monitoring tasks in most situations. However, too much reliance on the cockpit automation system may have adverse effects on situational awareness when executing TEM techniques on the flight deck. Leadership and teamwork, as part of CRM training for flight crews, mitigate the associated LOC-I risk. On the other hand, the high level of automation has created new opportunities for errors and mistakes if pilots do not understand the current state of the automation.

Finally, we agree with Casner et al. [25] that this vital relationship between pilots and the reliable machine deserves further study in order to maintain and improve the good safety statistics of recent years. Future academic studies in this field will support the aviation industry in maintaining its high level of safety and providing passengers with a safe and reliable air transport system.

Acknowledgment

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

  References

[1] Bureau d’Enquêtes et d’Analyses. (2012). Final report on the accident on 1st June 2009 to the Airbus A330-203 registered F-GZCP operated by Air France flight AF 447 Rio de Janeiro–Paris. Paris: Bureau d’Enquêtes et d’Analyses.

[2] Australian Transport Safety Bureau. (2013). In-flight uncontained engine failure Airbus A380-842, VH-OQA. Canberra: Australian Transport Safety Bureau.

[3] Statens Haverikommission. (2016). Final report RL 2016:11e – Accident in Oajevágge, Norrbotten County, Sweden on 8 January 2016 involving the aeroplane SE-DUX of the model CL-600-2B19, operated by West Atlantic Sweden AB. File no. L-01/16. Stockholm: Statens Haverikommission.

[4] Rasmussen, J. (1982). Human errors - A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4(2-4): 311-333. https://doi.org/10.1016/0376-6349(82)90041-4

[5] European Aviation Safety Agency. (2007). Certification specification for large aeroplanes CS-25. Cologne: European Aviation Safety Agency.

[6] Komite Nasional Keselamatan Transportasi. (2019). Aircraft accident investigation report, PT. Lion Mentari Airlines Boeing 737-8 (MAX); PK-LQP, Tanjung Karawang, West Java, Republic of Indonesia, 29 October 2018. Jakarta, Indonesia: Komite Nasional Keselamatan Transportasi.

[7] Aircraft Accident Investigation Bureau. (2019). Aircraft accident investigation preliminary report: Ethiopian airlines group B737-8 (MAX) registered ET-AVJ, 28 NM South East of Addis Ababa, Bole International Airport, March 10, 2019. Addis Ababa, Ethiopia: Federal Democratic Republic of Ethiopia, Ministry of Transport, Aircraft Accident Investigation Bureau.

[8] International Air Transport Association. (2015). Loss of control in-flight accident analysis report 2010-2014. Montreal-Geneva: International Air Transport Association.

[9] Wilborn, J.E., Foster, J.V. (2004). Defining commercial transport loss-of-control: A quantitative approach. AIAA Atmospheric Flight Mechanics Conference and Exhibit, 16 - 19 August 2004, Providence, Rhode Island. Reston, VA: American Institute of Aeronautics and Astronautics. http://dx.doi.org/10.2514/6.2004-4811

[10] International Civil Aviation Organization. (2018). ICAO safety report, 2018 edition. Montreal, Quebec, Canada: International Civil Aviation Organization.

[11] International Air Transport Association. (2018). Safety report 2017. 54th Edition issued in April 2018. Montreal-Geneva: International Air Transport Association.

[12] European Aviation Safety Agency. (2017). Annual safety review 2017. Cologne: European Aviation Safety Agency.

[13] Hari, B. (2015). Prevention of loss of control accidents in flight (LOC-I) through implementation of evidence based training in basic pilot training and type rating courses. London: City University London.

[14] Flight Deck Automation Working Group. (2013). Operational use of flight path management systems. Washington, DC: Performance-based operations Aviation Rulemaking Committee.

[15] International Air Transport Association. (2015). Safety report 2014. 51st Edition issued in April 2015. Montreal-Geneva: International Air Transport Association.

[16] Jacobsen, S. (2010). Aircraft loss of control study. AIAA GNC Conference, Toronto, Canada. Edwards, CA: NASA Dryden Flight Research Center.

[17] International Air Transport Association. (2015). Loss of control in-flight (LOC-I) prevention: beyond the control of pilots. 1st Edition. Montreal-Geneva: International Air Transport Association.

[18] International Air Transport Association. (2015). Guidance material and best practices for the implementation of upset prevention and recovery training. Montreal-Geneva: International Air Transport Association.

[19] Netherlands Aerospace Centre (NLR). (2015). Man4Gen: Manual operation of 4th generation airliners. Amsterdam: Netherlands Aerospace Centre.

[20] European Aviation Safety Agency. (2015). Annual safety review 2014. Cologne: European Aviation Safety Agency.

[21] Dismukes, R.K., Berman, B.A., Loukopoulos, L.D. (2007). The limits of expertise: Rethinking pilot error and the causes of airline accidents. Surrey: Ashgate Publishing Ltd. http://dx.doi.org/10.4324/9781315238654

[22] Ebbatson, M. (2009). The loss of manual flying skills in pilots of highly automated airliners. Cranfield, London: Cranfield University.

[23] Airbus. (2018). A statistical analysis of commercial aviation accidents 1958-2017. Blagnac Cedex, France: Airbus S.A.S.

[24] European Aviation Safety Agency. (2013). EASA automation policy. Cologne: European Aviation Safety Agency.

[25] Casner, S.M., Geven, R.W., Recker, M.P., Schooler, J.W. (2014). The retention of manual flying skills in the automated cockpit. Human Factors: The Journal of the Human Factors and Ergonomics Society, 56: 1506-1516. http://dx.doi.org/10.1177/0018720814535628

[26] Reason, J.T. (1990). Human Error. Cambridge: University Press. http://dx.doi.org/10.1017/CBO9781139062367

[27] Dekker, S.W.A. (2017). Rasmussen’s legacy and the long arm of rational choice. Applied Ergonomics, 59: 554-557. http://dx.doi.org/10.1016/j.apergo.2016.02.007

[28] National Transportation Safety Board. (2010). Aircraft accident report AAR-10/01 – Loss of control on approach, Colgan Air, Inc., operating as Continental Connection Flight 3407, Bombardier DHC-8-400, N200WQ, Clarence Center, New York, February 12, 2009. Washington, DC: National Transportation Safety Board.

[29] Embrey, D. (2005). Understanding human behaviour and error. Human Reliability Associates, 1: 1-10.

[30] Sinnett, S., Soto-Faraco, S., Spence, C. (2008). The co-occurrence of multisensory competition and facilitation. Acta Psychologica, 128(1): 153-161. http://dx.doi.org/10.1016/j.actpsy.2007.12.002

[31] Bundesstelle für Flugunfalluntersuchung. (2015). Interim report BFU 6X014-14. Braunschweig: BFU – Bundesstelle für Flugunfalluntersuchung / German Federal Bureau of Aircraft Accident Investigation

[32] Warm, J.S., Parasuraman, R., Matthews, G. (2008). Vigilance requires hard mental work and is stressful. Human Factors, 50(3): 433-441.

[33] Civil Aviation Authority (CAA). (2013). Monitoring matters: Guidance on the development of pilot monitoring skills. London: Civil Aviation Authority.

[34] European Aviation Safety Agency. (2014). Notice of proposed amendment 2014-17 – Crew resource management (CRM) training – RMT.0411 (OPS.094) — 26.6.2014. Cologne: European Aviation Safety Agency.

[35] Kunz, C. (2015). Resilienz von Piloten im Cockpit bei SWISS [Resilience of pilots in the cockpit at SWISS]. Olten, Switzerland: Fachhochschule Nordwestschweiz (FHNW).

[36] European Aviation Safety Agency (EASA). (2015). EASA Safety Information Bulletin SIB No. 2010-33R1: Automation policy – Mode awareness and energy state management. Cologne: European Aviation Safety Agency.

[37] Hari, B. (2013). Coursework A – Air Accident Investigation. London: City University London.

[38] Ebbatson, M., Harris, D., Huddlestone, J., Sears, R. (2010). The relationship between manual handling performance and recent flying experience in air transport pilots. Ergonomics, 53(2): 268-277. http://dx.doi.org/10.1080/00140130903342349

[39] Haslbeck, A., Hoermann, H.J. (2016). Flying the needles: Flight deck automation erodes fine-motor flying skills among airline pilots. Human Factors, 58(4): 533-545.

[40] Landry, D.J. (2014). Keeping manual flying skills sharp. Air Line Pilot, June 2014, 24-25.

[41] Hanusch, M. (2017). Manual Flying Skills – Airline Procedures and Their Effect on Pilot Proficiency. London: City University of London.

[42] Ferris, T., Sarter, N., Wickens, C.D. (2010). Cockpit automation: Still struggling to catch up. In: Human Factors in Aviation, 2nd ed., Salas, E., Maurino, D.E. (eds.). Amsterdam; Boston, MA: Academic Press/Elsevier. http://dx.doi.org/10.1016/B978-0-12-374518-7.00001-8

[43] Transport Canada. (2015). Advisory circular AC 600-006 – Flight deck automation policy and manual flying in operations and training. Ottawa: Transport Canada.

[44] Department of Transport. (2016). Audit report - Enhanced FAA oversight could reduce hazards associated with increased use of flight deck automation. Report Number AV-2016-013. Washington, DC: Department of Transport, Office of Inspector General.