AviatAR - An Augmented Reality System to Improve Pilot Performance for Unmanned Aerial Systems

Doctoral Candidate Name: 
Robert Louis Abbott
Program: 
Computing and Information Systems
Abstract: 

In the modern airspace, small unmanned aircraft systems (sUAS) such as multi-rotor aircraft, commonly referred to as "drones," are becoming increasingly popular with both amateur enthusiasts and professional pilots. Recognizing the necessity of integrating sUAS traffic into the national airspace system, Congress passed the FAA Modernization and Reform Act of 2012, which mandated that the FAA regulate sUAS operation in United States national airspace. This legislation also created a number of obligations and duties for UAS pilots, including avoidance of restricted airspace, maximum flight levels, safe separation from other aircraft (including other UAS), and avoidance of flight over people and of contact with personal property such as buildings or cars. Whether flying a drone for pleasure or for commercial purposes, it is easy for operators to lose situational awareness (SA) of the operating environment. A study published by the NASA Langley Research Center in 2017 found that the majority of commercial aviation accidents not attributable to aircraft systems failure involved the crew's loss of SA of the aircraft or the environment, and that crew distraction from operation was associated with all of these accidents. If this is the case for commercial pilots inside an enclosed cockpit in relative isolation, the potential for distraction in the UAS environment is at least as great. A degraded SA state can therefore create an unsafe environment for other pilots and bystanders, and can lead to fines and penalties for the UAS pilot if damage, injury, or disruption to the airspace occurs.

One mode of pathological flight phenomena in fixed-wing aircraft is the pilot-induced oscillation (PIO). PIOs can occur either through pilot-airframe coupling, as in the case of biodynamic feedthrough, or through the lag between the pilot's observation, the pilot's control action, and the aircraft's control response under structural or environmental stimulus. Under either scenario, the actions necessary to identify and resolve PIOs can quickly distract the pilot and degrade pilot SA. This distraction can lead to mission task element (MTE) failure, loss of aircraft control, or damage or destruction of the aircraft and surrounding persons and property. In the case of UAS, some PIOs can be induced by the lack of direct tactile feedback and neurosensory coupling between the remote pilot and the aircraft. While some of these effects can be mitigated with haptic control actuators or first-person-view monitor goggles, increased distance between the remote pilot and the UAS reduces the pilot's ability to judge the effects of fine control inputs on UAS attitude, which can lead to the development of PIOs as the remote pilot attempts to control the UAS from a distance. Long-range, beyond-line-of-sight missions rely upon autonomous flight control systems to guide the UAS using global navigation satellite systems (GNSS) or, in communications-denied environments, more complex navigational methods such as inertial guidance, celestial navigation, or terrain matching; however, these autonomous methods do not work well for primarily human-piloted operations with augmented control applications such as bridge or communications tower inspection, where a human pilot must guide the UAS from a distance while focusing on specific tasks identified during the mission.
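The feedback-lag mechanism behind this class of PIO can be illustrated with a minimal toy model (an illustration only, not part of the dissertation's experiments; the gain, delay, and error variable are hypothetical): a pilot applies a proportional correction to an attitude error, but the observation the correction is based on is several control steps stale. With no delay the correction cancels the error; with the same gain and a stale observation, the corrections repeatedly overshoot and the error oscillates with growing amplitude.

```python
def simulate(delay, gain=1.0, steps=40, x0=1.0):
    """Toy attitude-error loop: the pilot corrects an error that was
    observed `delay` control steps in the past (hypothetical model)."""
    xs = [x0] * (delay + 1)              # seed the observation history
    for _ in range(steps):
        observed = xs[-1 - delay]        # stale observation of the error
        xs.append(xs[-1] - gain * observed)  # proportional correction
    return xs

no_lag = simulate(delay=0)   # error is cancelled immediately
lagged = simulate(delay=2)   # same gain, stale feedback: error oscillates and grows
```

With zero delay the error is driven to zero in a single step, while a two-step delay at the same gain produces an error that changes sign repeatedly and grows in amplitude, the signature of a divergent oscillation.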
To better enable an individual UAS operator to carry out complex mission task elements, we sought to develop a head-mounted display (HMD) equipped with see-through augmented reality (AR) capabilities. The objective of this HMD is to provide the pilot with low-complexity visual cues that carry sufficient information to improve mission performance and maintain pilot SA while minimally increasing, or even decreasing, pilot cognitive workload. We refer to this system as AviatAR. The contributions of this research include: new insights into the development of PIO phenomena during rotary-wing UAS operation, real-time detection of PIO development during rotary-wing UAS flight operation, and comparisons of the effects of presenting visual flight information in the pilot's primary and peripheral visual fields using a see-through AR headset.

Defense Date and Time: 
Friday, April 23, 2021 - 1:00pm
Defense Location: 
Zoom Link: https://uncc.zoom.us/j/91037180316
Committee Chair's Name: 
Dr. Mirsad Hadzikadic
Committee Members: 
Dr. Minwoo Lee, Dr. Zbigniew Ras, Dr. Samira Shaikh