Combating Human Factors Challenges in Unmanned Aerial Systems (UAS) Cockpits
The introduction of Unmanned Aerial Systems (UAS) into the airspace has revolutionized the world that we fly in today. With systems like the RQ-4 Global Hawk tasked with high-altitude, long-endurance reconnaissance, pilots in a remotely operated mission control element (MCE) can acquire near-real-time, high-resolution imagery of large geographical areas 24 hours a day and in all types of weather conditions (Northrop Grumman, 2014). While this does not quite render manned airframes such as the U-2 Dragon Lady obsolete, given the U-2's ability to fly at 70,000 feet vice the Global Hawk's 55,000 feet (Lockheed Martin, 2014), UAS still give us an incredible advantage on the battlefield, risking only airframes rather than human operators. This raises the question, however: does removing the pilot from the aircraft pose new, unexplored issues for the aviation realm? Additionally, how do issues such as data latency and the complete absence of haptic feedback affect pilots' perception of the dynamic environment their airframe is operating in? Is there anything that can be done on the design end to reduce these adverse effects? This paper will seek to answer these questions, discuss the human factors involved in traditional flight, and compare them to the human factors issues faced in remote aircraft operation.
Summary
This paper will explore the human factors challenges associated with unmanned aerial system usage, including, but not limited to, pilot performance in the absence of haptic feedback, the effects of data latency, and air traffic controller performance: what additional challenges do UAS present from an airspace management perspective? Furthermore, from a psychological perspective, does the remote operation of aircraft serve as a breeding ground for pilot and sensor operator complacency? Does the complete removal of the crew from the air rob them of situational awareness and cognizance of their surroundings? From a sensation and perception standpoint, are there aspects of UAS design that need to be taken into account from the very beginning?
With regard to the ASCI 530 learning outcomes, several will be addressed here, to include:
1.) Evaluate the challenges associated with the integration of UAS into the National Airspace System (NAS) and analyze the role UAS can play in overcoming those challenges.
2.) Evaluate the differences in unmanned systems based upon their varying missions.
3.) Evaluate the history and current state of the design and architecture of unmanned systems.
4.) Analyze and explain functional requirements and capabilities of major unmanned systems, considering cost and weight estimation, basic aircraft performance, safety and reliability, lifecycle topics, vehicle subsystems, and system integration.
5.) Evaluate, compare and contrast unmanned systems with comparable counterpart manned aircraft systems in regard to design, development, and operation.
Human Factors and Situational Awareness
Before launching into an aviation human factors discussion, it is important to discuss human factors in general. The Human Factors and Ergonomics Society (HFES) defines human factors as a subject concerned with "the application of what we know about people, their abilities, characteristics, and limitations to the design of equipment they use, environments in which they function, and jobs they perform" (Human Factors and Ergonomics Society, 2014). This definition could not be more applicable to pilots interacting with remotely operated air vehicles. It is of utmost importance that human factors engineers incorporate what published research tells us about pilots and pilot candidates, their abilities, and most importantly their limitations, into system design. Ultimately, it is those limitations that will dictate the success of the system; failure to design an unmanned aerial vehicle within the constraints of the human operator will result in aviation mishaps.
Issue/Problem Statement
Unmanned Aerial Systems have revolutionized the world we operate in today. By allowing pilots to remotely control an aircraft from a safe Contiguous United States (CONUS) or Outside Contiguous United States (OCONUS) location, we can effectively remove the human operator from the threat environment. However, while the pilot is removed from the threat environment, a whole array of human factors issues presents itself. There is typically a two-second lag time between operator input and response in US Air Force remotely piloted aircraft (Bidwell, Holloway, & Davidoff, 2014). This lag time is a marked departure from the instantaneous control response that traditional pilots experience. Additionally, UAS pilots are forced to fly in the absence of haptic feedback. Haptics, simply put, is our touch sense; haptic feedback, more specifically, is the feedback we receive from our tactile receptors (MacLean, 2008). Why is haptic feedback an issue of importance in the flying realm? Pilots are tasked to rely on all of their senses in flight to detect any changes in conditions that may lead to a mishap. The lack of haptic feedback in remotely operated airframes leaves their pilots void of all the vibrations that occur as a result of wind buffets, potential bird strikes, and the like. These vital pieces of feedback provide conventional pilots a good deal of information about the environment they are flying in, and their absence in UAS operations is something worth designing for in future UAS development. Overall, haptic feedback would improve the level of safety in UAS operation and also help increase efficiency and pilot and sensor operator performance in degraded visual environments (Lam, 2009).
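The effect of control-link lag described above can be illustrated with a minimal sketch, not drawn from any actual UAS software: the link is modeled as a FIFO queue, so an operator input issued now is not applied to the airframe until the delay has elapsed. The timestep and the `simulate_link` helper are invented for illustration; the roughly two-second figure comes from Bidwell, Holloway, and Davidoff (2014) as cited above.

```python
from collections import deque

# Assumed values for illustration only; the ~2 s delay figure is from
# Bidwell, Holloway, & Davidoff (2014).
LINK_DELAY_S = 2.0
TIMESTEP_S = 0.5
DELAY_STEPS = int(LINK_DELAY_S / TIMESTEP_S)

def simulate_link(commands):
    """Return the command the airframe actually receives at each timestep,
    given the operator's command at each timestep."""
    pipeline = deque([None] * DELAY_STEPS)  # commands still in transit
    received = []
    for cmd in commands:
        pipeline.append(cmd)           # operator issues a command
        received.append(pipeline.popleft())  # airframe sees a stale one
    return received

# A bank-left input at t=0 does not reach the airframe until 2 s later.
print(simulate_link(["bank_left", None, None, None, None, None]))
# -> [None, None, None, None, 'bank_left', None]
```

The point of the sketch is simply that every corrective input a UAS pilot makes is acting on an aircraft state that is already two seconds old, which is what distinguishes this control loop from the instantaneous loop of a conventional cockpit.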
Situational Awareness
Another topic worthy of discussion before delving into the questions posed at the beginning of the paper is situational awareness. Mica Endsley defines situational awareness as "the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future" (Wickens, 2008). She breaks situational awareness into three levels: perception, comprehension, and projection. Simply put, this means properly understanding the current state of the environment and using those cues to project the future status of the environment. The example given in a 2008 Wickens article applies this theory to an air traffic controller. The controller first notices a change in trajectory (potentially triggering a conflict alert), comprehends this to mean that two aircraft are on converging trajectories, and then projects this status into the future: the two planes colliding if one of the pilots-in-command fails to change course (Wickens, 2008).
To tie this situational awareness discussion back to one of the questions posed at the beginning, "does the complete removal of the crew from the air completely rob them of situational awareness and cognizance of their surroundings?", I would argue that it does. The definition of situational awareness specifically mentions "the perception of the elements in the environment within a volume of time and space." Remotely operating a UAV from a geographically separated ground control station negates the environment portion of this definition, as the pilot and crew are not directly in the flight environment.
Multiple Resources Theory
Multitasking is widespread in our society today; we would be hard-pressed to go to a grocery store without seeing someone walking through the aisles on their smartphone, or to the subway without seeing a smartphone glued to someone's hand. In his article titled "Multiple Resources and Mental Workload," Christopher Wickens seeks to understand the extent to which dual-task performance leads to a decrement in time-sharing ability (Wickens, 2008). He attempts to do this through multiple resource theory, breaking the multiple resource architecture into three main components: demand, resource overlap, and allocation policy (Wickens, 2008). Demand can be conceptually broken into two regions of task demand. In the first, demand is less than the capacity of resources available; this is ideal because it means that the individual, such as the pilot in a UAS Ground Control Station (GCS), has additional mental resources to draw on in the event that something unexpected occurs in flight. In the second region, demand exceeds capacity; this is the less desirable situation because, as with any economic system, when demand is greater than capacity, performance breaks down (Navon & Gopher, 1979). The boundary between these two demand regions is typically referred to as the "red line" of workload (Grier, 2008).
In my opinion, it is of utmost importance for UAS designers to identify the red line of workload for the human operator and to take it into heavy consideration in subsequent system design. By identifying the red line for UAS pilots, we can ensure that once they reach this point, there are adequate assistive resources in the cockpit to offload some of the workload from the human operator and bring them back down into the first region of demand, in which demand is less than the capacity of resources available. UAS operators are constantly fighting a battle against mental workload: they face the conventional challenges of a cockpit, the various visual displays and auditory warnings, as well as the unconventional challenges of remotely operating an aircraft in the absence of environmental cues and instantaneous feedback.
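The red-line logic described above can be sketched as a simple threshold check. This is purely an illustration of the concept, assuming a normalized operator capacity of 1.0; the task names and numeric demand weights are invented, not drawn from Wickens or Grier.

```python
# Hypothetical sketch of the "red line" of workload: compare aggregate
# task demand against a fixed, normalized operator capacity and flag
# when assistive automation should offload tasks.
OPERATOR_CAPACITY = 1.0  # assumed normalized capacity (the "red line")

def workload_state(task_demands):
    """Sum per-task demand and report which demand region the operator is in."""
    total = sum(task_demands.values())
    region = ("reserve available" if total < OPERATOR_CAPACITY
              else "red line exceeded")
    return total, region

# Invented demand weights; an unexpected event pushes demand over capacity.
tasks = {
    "monitor_displays": 0.3,
    "radio_comms": 0.2,
    "camera_steering": 0.3,
    "collision_alert_response": 0.4,
}
total, region = workload_state(tasks)
print(region)  # demand exceeds capacity -> "red line exceeded"
```

In a real GCS the demand weights would have to come from empirical workload measurement rather than fixed constants, but the design implication is the same: once the sum crosses the red line, automation should take tasks back from the operator.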
A 2009 thesis titled "Haptic Interface for UAV Teleoperation" discusses how the physical separation between the human operator and the vehicle leads to a decrease in situational awareness. It goes on to discuss one of the major differences between manned and unmanned airframes: the unmanned aerial vehicle pilot "lacks various multiple-sensory information such as sound, motions, and vibrations of the airframe." Furthermore, the author notes that the operator is only provided with visual information from camera images with limited resolution and field of view. Additionally, the camera may not always be pointed in the direction of motion; the UAV could be traveling north while the camera is oriented towards an enemy hot spot to the southwest. This is a situation unique to UAV operation and, once again, one that needs to be examined for improvement. Moreover, the author links the lowered situational awareness (due to the degraded camera imagery and field of view) to reduced safety in operation and high operator workload (Lam, 2009).
Significance of Problem
The human factors issues inherent in UAS usage are significant because of the growing use of UAS in airspace operations. Some government and industry players believe that UAS will begin commercial civil operations in the next 5 to 10 years. They also believe that UAS operations will mirror the same "oceanic, en route and terminal airspace where air carriers operate" (Air Line Pilots Association, 2011). The Department of Defense also foresees greater application of UAS in the future and is currently conducting near-term, mid-term, and far-term activities to integrate UAS into the National Airspace System. By 2015, the Joint UAS Center of Excellence (JUAS COE) estimates that the DoD will have 197 units based at 105 locations, a staggering 35% increase in units and 67% increase in UAS operational locations (Department of Defense, 2011). With several agencies pushing for widespread use of UAS in the upcoming years, it is of utmost importance that developers continually work to improve all parts of the existing UAS architecture, to include the human factors challenges.
Alternative Actions
With regard to alternative actions, in my opinion, piloted airframes should be used in lieu of unmanned aircraft only in missions that may require a pilot to navigate out of a dangerous situation where passenger or crew life is threatened; for example, a C-17 flight across the Pacific Ocean with 60 troops on board is a situation that warrants an actual pilot on board vice a pilot controlling the airframe from a geographically separated location. An onboard pilot-in-command can sense environmental cues indicating an issue during flight (smoke, bird strikes, etc.). Additionally, from a commercial airline safety perspective, it makes most sense to have a pilot on board to deter hijacking attempts. September 11, 2001 is a day that no American will ever forget. On that fateful day, Al Qaeda operatives hijacked four commercial airliners, flying them into both World Trade Center towers, the Pentagon, and a field in Pennsylvania (it is suspected that this fourth plane was intended to be crashed into either the White House or the Capitol Building). The passengers on board the fourth plane attempted to regain control of the aircraft from the hijackers (Schlossberg & Santora, 2014). Sadly, the unarmed crews and passengers were unable to overtake the hijackers and navigate their planes to safety, but they still presented an additional obstacle; the mere presence of a pilot on board, versus a plane with no pilot on board, adds a new layer of difficulty for would-be hijackers on commercial flights.
Recommendations
One recommendation I would make to UAS developers is to look into the feasibility of incorporating vibrotactile "on-thigh" alerts in ground control stations. A 2011 Human Factors and Ergonomics Society article discusses this possibility. The goals of the study were to "explore the appropriateness of the thigh as a placement to convey vertical directional cues to a seated operator, to assess the thigh as a potential locus for directional orienting, and to assess the effect of the message expressed by the vibrotactile cue (fight or flight) on performance" (Salzer, Oron-Gilad, Ronen, & Parmet, 2011). The authors hypothesized that adding vibrotactile cues to existing anti-collision alert and auditory cues would improve performance in directional decision tasks. The researchers found that vibrotactile cues oriented towards the response direction were preferred over visual cues that pointed to the direction of the hazard to avoid (Salzer et al., 2011). Additionally, they found that the auditory cues in the study (used in current anti-collision alert systems) showed little to no benefit in this experimental design, mostly because auditory alerts tend to take a few seconds before saying the word "collision," and this lag slows a pilot's reaction time (Salzer et al., 2011). In summary, the researchers believed they had developed a good foundation for future development of on-thigh vibrotactile displays to convey direction in the vertical plane. They recommended that future studies examine two tactile display modes, coupled with a greater visual load (to simulate a true flight environment), as well as the use of a common control (like a joystick) that would add more meaning to the vibrotactile cue (i.e., if the cue indicates a collision to the right, the pilot can instinctively move the joystick to the left to avoid it) (Salzer et al., 2011).
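The key design choice reported by Salzer et al., cueing the direction to move rather than the direction of the hazard, can be sketched as a simple lookup. The specific thigh locations and the hazard-to-response mapping below are illustrative assumptions, not the authors' actual apparatus.

```python
# Sketch of response-oriented cueing: the vibrotactile cue indicates the
# direction to MOVE, not the location of the hazard. Mappings are
# hypothetical, for illustration only.
RESPONSE_FOR_HAZARD = {
    "above": "descend",      # traffic above -> cue the pilot to push down
    "below": "climb",
    "right": "bank_left",
    "left": "bank_right",
}
THIGH_LOCUS = {
    "descend": "front_of_thigh",
    "climb": "rear_of_thigh",
    "bank_left": "left_thigh",
    "bank_right": "right_thigh",
}

def vibrotactile_cue(hazard_direction):
    """Map a detected hazard to a response-oriented thigh vibration site."""
    response = RESPONSE_FOR_HAZARD[hazard_direction]
    return response, THIGH_LOCUS[response]

print(vibrotactile_cue("above"))  # -> ('descend', 'front_of_thigh')
```

The appeal of this mapping is that it removes a mental translation step: instead of perceiving a hazard location and working out the evasive maneuver, the pilot feels the maneuver itself, which is consistent with the faster directional decisions the study reported.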
Another recommendation I have to mitigate human factors issues in UAS cockpits is to place more emphasis on crew selection. The UAS cockpit is known to be a taxing work environment, regardless of the pilot and sensor operator working in it. However, certain individuals are predisposed to do better in a task-saturated environment such as a UAS cockpit (or Ground Control Station) than others. My recommendation is to raise the bar for UAS crew screening programs to ensure the most suitable individuals are selected for the job. A 2006 McCarley and Wickens article suggests further research on techniques to understand and facilitate crew communications, with a particular emphasis on inter-crew coordination during the handoff of UAV control from one team of operators to another (McCarley & Wickens, 2006). A real-world example of this is a UAS crew based out of a Ground Control Station at Creech AFB, Nevada handing over control to a crew based out of a Ground Control Station at Kandahar Airfield, Afghanistan.
I would not recommend purging the US Air Force inventory of conventional manned reconnaissance planes such as the U-2 Dragon Lady. A September 2014 Air Force Times article discusses the Air Force's decision to start retiring the U-2 fleet in 2015. The fiscal year 2015 budget calls for retiring the U-2 fleet and purchasing RQ-4 Global Hawks to assume the high-altitude intelligence, surveillance, and reconnaissance mission. The service was drawn to the Global Hawk in lieu of the aging Dragon Lady due to the former's reduced sustainment costs. The Air Force projects spending $487 million on the development and installation of a "universal payload adapter," essentially allowing the sensor equipment and cameras from the U-2 to be retrofitted onto Global Hawks (Everstine, 2014). General Michael Hostage, the commander of Air Combat Command, the Air Force major command that employs the Global Hawk in operations, is well aware of the shortcomings of the RQ-4 in comparison to the U-2. He is quoted as saying that "the Global Hawk does not have the same capability as the U-2, which will mean that the military will suffer once the Dragon Lady is retired" (Everstine, 2014).
References
Air Line Pilots Association (2011). Unmanned Aircraft Systems: Challenges for Safely Operating in the National Airspace System. Retrieved from http://www.alpa.org.
Bidwell, J., Holloway, A., & Davidoff, S. (2014). “Measuring operator anticipatory inputs in response to time-delay for teleoperated human-robot interfaces.” Association for Computing Machinery Journal. 1-4.
Department of Defense (2011). Unmanned Aerial Systems (UAS) Airspace Integration Plan. Retrieved from http://www.acq.osd.mil/sts/docs/DoD_UAS_Airspace_Integ_Plan_v2_(signed).pdf
Everstine, Brian (2014). Air Combat Command Chief reluctantly accepts Global Hawk over U-2. Air Force Times. Retrieved from http://www.airforcetimes.com/article/20140921/NEWS04/309210028/Air-Combat-Command-chief-reluctantly-accepts-Global-Hawk-over-U-2.
Grier, R. (2008). The redline of workload: Theory, research and design. A Panel. Presented at the 52nd Annual Meeting of the Human Factors and Ergonomics Society, September 22-26, in New York.
Human Factors and Ergonomics Society (2014). "Definitions of Human Factors and Ergonomics." Retrieved from http://www.hfes.org/web/educationalresources/hfedefinitionsmain.html
Lam, T. (2009). “Haptic Interface for UAV Teleoperation.” Retrieved from http://repository.tudelft.nl/view/ir/uuid%3A8de687c6-61da-410d-9550-5a72b020c07c/
MacLean, K. (2008). "Haptic Interaction Design for Everyday Interfaces." Reviews of Human Factors and Ergonomics, Volume 4. Retrieved from http://web.stanford.edu/class/me327/readings/1-MacLean08-RHFE-Design.pdf
McCarley, J.S., & Wickens, C.D. (2006). “Human Factors Concerns in UAV Flight.” Retrieved from https://www.hf.faa.gov/docs/508/docs/uavFY04Planrpt.pdf
Navon, D. & Gopher, D. (1979). Mental workload: Its theory and measurement. New York: Plenum.
Northrop Grumman (2014). Global Hawk Capabilities. Retrieved from http://www.northropgrumman.com/Capabilities/GlobalHawk/Pages/default.aspx?utm_source=PrintAd&utm_medium=Redirect&utm_campaign=GlobalHawk+Redirect
Schlossberg, T. & Santora, M. (2014). “On 9/11 Anniversary, Looking Back and Ahead.” The New York Times. Retrieved from http://www.nytimes.com/2014/09/12/nyregion/new-york-pauses-to-remember-9-11-anniversary.html?_r=0
Thompson, L. (2014). “U-2 vs. Global Hawk: Why Drones Aren’t The Answer To Every Military Need.” Forbes. Retrieved from http://www.forbes.com/sites/lorenthompson/2014/02/20/u-2-vs-global-hawk-why-drones-arent-the-answer-to-every-military-need/
Salzer, Y., Oron-Gilad, T., Ronen, A., & Parmet, Y. (2011). “Vibrotactile On-Thigh Alerting System in the Cockpit.” Human Factors: The Journal of the Human Factors and Ergonomics Society. 118-131.
Wickens, C.D. (2008). “Multiple Resources and Mental Workload,” Human Factors: The Journal of the Human Factors and Ergonomics Society. 449-454.
Wickens, C.D. (2008). "Situational Awareness: Review of Mica Endsley's 1995 Articles on Situational Awareness Theory and Measurement." Human Factors: The Journal of the Human Factors and Ergonomics Society. 397-402.