Since the first generation of highly automated "glass cockpit" commercial aircraft entered service in the late 1980s, the flight accident rate has declined year by year and flight safety has improved significantly. In recent years, with the rapid progress of AI technology, the degree of cockpit automation will continue to increase. This shift changes the pilot's responsibility from "control-navigate-communicate" to monitoring the automation system, which may give rise to new and unexpected error modes and, in turn, induce aviation incidents and accidents. Such incidents and accidents warn us to attend to the safety risk that automation may cause pilot errors.

In Study 1, semi-structured in-depth interviews were conducted with 14 pilots from five domestic airlines to identify automation-induced psychological and behavioral factors that may increase the probability of human error. The results show that monitoring failure and automation surprises are the core problems pilots experience when interacting with automation.

Building on the findings of Study 1, Study 2 further explored the decline in pilot monitoring and awareness (mind wandering) caused by high degrees of automation. The study sampled the thoughts of 34 pilots operating under different degrees of automation in an L3 A320 flight simulator. The results showed that the higher the degree of automation, the more likely pilots were to have task-irrelevant thoughts, that is, to fall into mind wandering, whereas at the medium (third) degree of automation, task-related advanced thought activities were most frequent.

In Study 3, data mining was conducted on the ASRS database to further explore the automation surprises identified in Study 1. After screening, a total of 281 cases related to automation surprises were included in the analysis. The results show that automation surprises rarely have catastrophic consequences; that the higher the degree of automation, the greater the probability of automation surprises; and that, in addition to system faults and manual operation errors, the design of the automation system also has an important impact on automation surprises.

Consistent with the cause analysis in Study 3, in the two Boeing 737 MAX accidents the poorly designed MCAS, which behaved outside the pilots' mental models, was the direct factor causing the accidents. In Study 4, the FLAP model was used to systematically analyze the causes of the poor MCAS design. The analysis found that both Boeing and the FAA bore unavoidable responsibility.

On the basis of the above studies, we can identify some pilot errors caused by automation systems at the present stage. These errors are likely to lead pilots to doubt automation systems and to reduce their trust in them. As AI capabilities are incorporated, automated systems will continue to become more autonomous, making it harder for pilots to predict their behavior. Human-machine trust will therefore gradually become an important research topic in pilot interaction with autonomous systems. To provide a framework for future pilot-autonomy interaction, Study 5 constructed a conceptual model of trust in pilot-automation teaming based on research on automation trust.

In summary, it is hoped that the findings of this paper can provide valuable help for aircraft automation system design, pilot training, and the formulation of policies and regulations related to aviation operations.