Thursday, October 9, 2014

ASCI 530 Case Analysis Reflection

For your case analysis project you worked collaboratively to research, develop, and present a topic associated with a major theme of problems or issues associated with UAS design, operations, or regulations. Discuss the effectiveness of the Case Analysis tool in this course. Focus specifically on the utility of Case Analysis as a tool for decision making and how it has (or does not have) utility in your current line of work, future anticipated career, or past experiences (identify at least two examples). Provide any recommendations for how the process or project (e.g., requirements, format, group interaction, topical focus, etc.) could be improved to better support building and expanding student experience for the eventual or further development of their careers.

            The Case Analysis project in ASCI 530: Unmanned Systems was an extremely effective tool for testing our understanding of the course objectives. The five course objectives I chose to address in the project were:
1.)    Evaluate the challenges associated with the integration of UAS into the National Airspace System (NAS) and analyze the role UAS can play in overcoming those challenges.
2.)    Evaluate the differences in unmanned systems based upon their varying missions.
3.)    Evaluate the role of UAS in meeting Federal Aviation Administration (FAA) regulatory requirements for the operation of UAS.
4.)    Evaluate the history and current state of the design and architecture of unmanned systems.
5.)    Evaluate, compare and contrast unmanned systems with comparable counterpart manned aircraft systems in regard to design, development, and operation.
The Case Analysis provided a good framework to address these course objectives. With regards to its utility in decision making, it is definitely a good starting point for entities such as the Federal Aviation Administration (FAA) and their infrastructure planning – specifically, merging unmanned aerial vehicles (UAVs) and manned airframes from a joint air management perspective. On the design end, specifically for defense contractors such as Northrop Grumman and General Atomics, who develop unmanned vehicles such as the Global Hawk and Reaper as well as their accompanying Ground Control Stations (GCS), this case analysis could aid in future design iterations of their respective systems.
While it does not directly relate to the current acquisition program I work on, it could relate to a future acquisition program I may be tasked to work on, either at Hanscom or at my follow-on assignment in January 2016. My previous acquisition program, the Ground Multi-band Terminal, was essentially the ground communications terminal of the Global Hawk operation and was highly lauded by the operational user community for its ease of use. Whenever there was an issue, our program office engineers would look into it and then, when warranted, publish a Knowledge-Based Instruction (KBI) to ensure that users had no further issues with the system. Whenever we sent out additions to the existing system, we would publish KBIs to walk the users through integrating the new piece. I would actually walk through the process myself (such as adding a new modem) and write the KBIs; then I would have other people follow the process I drafted line by line to ensure that it made sense for the users it would be pushed out to. Another one of my classmates on base is the Program Manager for a system that is essentially a text messaging program on the newer C-17s. Her program is another example of one where it is of paramount importance to involve the user/warfighter in all steps of the acquisition process, including the fielding portion, which involves getting constant warfighter feedback.
The most applicable aspect of the Case Analysis project to my particular profession is simply the importance of user-centered design. In a system where a human is expected to play a huge role in its operation, it is of utmost importance that the human operator is taken into account from a design perspective. Doing so will minimize the likelihood of mishaps attributed to human error.
With regards to the structure of the Case Analysis in ASCI 530, I wouldn't change much. I thought it was great that there was a major project deliverable roughly every two weeks, because it ensured that we were constantly working on the Case Analysis (and not saving the 20-page paper for the very end). The peer feedback/review process definitely helped as well – this was the first class I've taken through Embry-Riddle in which we were required to do this, and it definitely helped me produce a great final product.

References

None

Monday, September 29, 2014

Combating Human Factors Challenges in Unmanned Aerial Systems (UAS) Cockpits

 The introduction of Unmanned Aerial Systems (UAS) into the airspace has revolutionized the world we fly in today. With systems like the RQ-4 Global Hawk in use and tasked with the mission of high-altitude, long-endurance reconnaissance, pilots in a remotely operated mission control element (MCE) can acquire near-real-time, high-resolution imagery of large geographical areas 24 hours a day and in all types of weather conditions (Northrop Grumman, 2014). This does not quite render manned airframes such as the U-2 Dragon Lady obsolete, because of the U-2's ability to fly at 70,000 feet vice 55,000 feet (Lockheed Martin, 2014), but UAS still give us an incredible advantage on the battlefield – risking only airframes, rather than human operators, in missions. This raises the question, however: does removing the pilot from the aircraft pose new, unexplored issues to the aviation realm? Additionally, how do issues such as data latency and the complete absence of haptic feedback affect pilots' perception of the dynamic environment their airframe is operating in? Is there anything that can be done on the design end to reduce these adverse effects? This paper will seek to answer these questions, as well as discuss the human factors involved in traditional flight and compare them to the human factors issues faced in remote aircraft operation.

Summary

 This paper will explore the human factors challenges associated with unmanned aerial system usage, including but not limited to pilot performance in the absence of haptic feedback, data latency, and air traffic controller performance – what additional challenges do UAS bring to light from an airspace management perspective? Furthermore, from a psychological perspective, does the remote operation of aircraft serve as a breeding ground for pilot and sensor operator complacency? Moreover, does the complete removal of the crew from the air rob them of situational awareness and cognizance of their surroundings? From a sensation and perception standpoint, are there certain aspects of UAS design that need to be taken into account from the very beginning? With regards to the ASCI 530 learning outcomes, several will be addressed here, including:

1.) Evaluate the challenges associated with the integration of UAS into the National Airspace System (NAS) and analyze the role UAS can play in overcoming those challenges.
2.) Evaluate the differences in unmanned systems based upon their varying missions.
3.) Evaluate the history and current state of the design and architecture of unmanned systems.
4.) Analyze and explain functional requirements and capabilities of major unmanned systems, considering cost and weight estimation, basic aircraft performance, safety and reliability, lifecycle topics, vehicle subsystems, and system integration.
5.) Evaluate, compare and contrast unmanned systems with comparable counterpart manned aircraft systems in regard to design, development, and operation.

 Human Factors and Situational Awareness
 Before launching into an aviation human factors discussion, it is important to discuss human factors in general. The Human Factors and Ergonomics Society (HFES) eloquently defines human factors as a subject concerned with “the application of what we know about people, their abilities, characteristics, and limitations to the design of equipment they use, environments in which they function, and jobs they perform” (Human Factors and Ergonomics Society, 2014). This definition could not be more applicable to pilots interacting with remotely operated air vehicles. It is of utmost importance that human factors engineers incorporate what published research tells us about pilots and pilot candidates – their abilities and, most importantly, their limitations – into system design. Ultimately, it is those limitations that will dictate the success of the system; failure to design an unmanned aerial vehicle within the constraints of the human operator will result in aviation mishaps.

 Issue/Problem Statement
Unmanned Aerial Systems have revolutionized the world we operate in today. By allowing pilots to remotely control an aircraft from a safe location in the Contiguous United States (CONUS) or outside it (OCONUS), we can effectively remove the human operator from the threat environment. However, while the pilot is removed from the threat environment, a whole array of human factors issues presents itself. There is typically a two-second lag between operator input and response in US Air Force remotely piloted drones (Bidwell, Holloway, & Davidoff, 2014). This lag is quite different from the instantaneous control response that traditional pilots experience. Additionally, UAS pilots are forced to fly in the absence of haptic feedback. Haptics, simply put, is our touch sense, with haptic feedback more specifically being the feedback we receive from our tactile receptors (MacLean, 2008). Why is haptic feedback important in the flying realm? Pilots rely on all of their senses in flight to detect any changes in conditions that may lead to a mishap. The lack of haptic feedback in remotely operated airframes deprives pilots of all of the vibrations that occur as a result of wind buffets, potential bird strikes, etc. These vital pieces of feedback provide conventional pilots a good deal of information about the environment they are flying in, and this gap is something worth designing for in future UAS development. Overall, haptic feedback will improve the level of safety in UAS operation and also help increase efficiency and pilot and sensor operator performance in degraded visual environments (Lam, 2009).

 Situational Awareness
Another topic worthy of discussion, before delving into the questions posed at the beginning of the paper, is situational awareness.
Mica Endsley defines situational awareness as “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future” (Wickens, 2008). She breaks situational awareness up into three levels – perception, comprehension and projection. Simply put, this means properly understanding the current state of the environment, then taking those cues and projecting them onto the future status of the environment. The example given in a 2008 Wickens article applies this theory to an air traffic controller. The controller will first notice a change in trajectory (potentially indicating a conflict alert), comprehend this to mean that two aircraft are on converging trajectories, and then project this status into the future – the two planes colliding if one of the pilots-in-command fails to change their path (Wickens, 2008). To tie this situational awareness discussion back to one of the questions posed at the beginning – “does the complete removal of the crew from the air completely rob them of situational awareness and cognizance of their surroundings?” – I would argue that it does. The definition of situational awareness specifically mentions “the perception of the elements in the environment within a volume of time and space.” Remotely operating a UAV from a geographically separated GCS negates the environment portion of this definition, as the pilot and crew are not directly in the flight environment.

 Multiple Resources Theory

 Multitasking is widespread in our society today – we would be hard-pressed to walk through a grocery store without seeing someone in the aisles on their smartphone, or to ride the subway without seeing a smartphone glued to someone's hand. In his article “Multiple Resources and Mental Workload,” Christopher Wickens seeks to understand the extent to which dual-task performance leads to a decrement in time-sharing ability (Wickens, 2008). He attempts to do this through multiple resource theory, breaking the multiple resource architecture into three main components – demand, resource overlap, and allocation policy (Wickens, 2008). Demand can be conceptually broken into two regions of task demand. In the first, demand is less than the capacity of available resources; this is ideal because it means that the individual – the pilot in the UAS Ground Control Station (GCS) – has additional mental resources to draw on in the event that an unexpected in-flight event occurs. In the second region, demand exceeds capacity; this is the less desirable situation because, as with any economic system, when demand is greater than capacity, performance breaks down (Navon & Gopher, 1979). The boundary between these two demand regions is typically referred to as the “red line” of workload (Grier, 2008). In my opinion, it is of utmost importance for UAS designers to identify this red line for the human operator and to take it into heavy consideration in subsequent system design. By identifying the red line of workload and mental capacity for UAS pilots, we can ensure that once pilots reach this point, there are adequate assistive resources in the cockpit to take some of the workload off the human operator and bring them back down into the first region of demand, where demand is less than the capacity of available resources.
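The two demand regions and the red line between them can be pictured as a simple monitor: sum the estimated demand of the operator's concurrent tasks and compare it against capacity. The sketch below is purely illustrative – the function, task names, and numeric values are my own assumptions, not drawn from Grier (2008) or any fielded GCS.

```python
# Illustrative "red line" workload monitor for a GCS operator.
# All task names and demand numbers are hypothetical, for explanation only.

def total_demand(task_demands):
    """Sum the estimated demand of all concurrent tasks (arbitrary units)."""
    return sum(task_demands.values())

def workload_region(task_demands, capacity):
    """Classify workload against operator capacity (the 'red line')."""
    if total_demand(task_demands) <= capacity:
        return "region 1: reserve capacity available"
    return "region 2: red line exceeded - shed or automate tasks"

# Hypothetical concurrent GCS tasks and their demand estimates.
tasks = {"monitor_displays": 3, "radio_comms": 2, "camera_steering": 4}
print(workload_region(tasks, capacity=10))  # region 1: reserve capacity available

tasks["emergency_checklist"] = 5            # an unexpected in-flight event
print(workload_region(tasks, capacity=10))  # region 2: red line exceeded - ...
```

In this toy model, an assistive system would trigger on the region 2 result, automating or deferring tasks until demand drops back below capacity.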
UAS operators are constantly fighting a battle against mental workload – they face the conventional challenges of a cockpit (the various visual displays and auditory warnings) as well as the unconventional challenges of remotely operating an aircraft in the absence of environmental cues and instantaneous feedback. A 2009 paper titled “Haptic Interface for UAV Teleoperation” discusses how the physical separation between the human operator and the vehicle leads to a decrease in situational awareness. It goes on to discuss one of the major differences between manned and unmanned airframes – how the unmanned aerial vehicle pilot “lacks various multiple-sensory information such as sound, motions, and vibrations of the airframe.” Furthermore, the author discusses how the operator is only provided with visual information from camera images with limited resolution and field of view. Additionally, the camera may not always be pointed in the direction of motion – the UAV could be traveling north while the camera is oriented toward an enemy hot spot to the southwest. This is a situation unique to UAV operation and, once again, needs to be looked at for improvement. Moreover, the author links the lowered situational awareness (due to the degraded camera imagery and field of view) to reduced safety in operation and high operator workload (Lam, 2009).

 Significance of Problem
The human factors issues inherent in UAS usage are significant because of the growing use of UAS in airspace operations. Some government and industry players believe that UAS will begin commercial civil operations in the next 5-10 years. They also believe that UAS operations will mirror the same “oceanic, en route and terminal airspace where air carriers operate” (Airline Pilots Association, 2011).
The Department of Defense also foresees a greater application of UAS in the future and is currently conducting near-term, mid-term and far-term activities to integrate UAS into the National Airspace System. By 2015, the Joint UAS Center of Excellence (JUAS COE) estimates that the DoD will have 197 units based at 105 locations—a staggering 35% increase in units and 67% increase in UAS operational locations (Department of Defense, 2011). With several agencies pushing for widespread use of UAS in the upcoming years, it is of utmost importance that developers continually work to improve all parts of the existing UAS architecture, to include the human factors challenges.

 Alternative Actions
 With regards to alternative actions, in my opinion, piloted airframes should be used in lieu of unmanned aircraft only in missions that may require a pilot to navigate out of a dangerous situation where passenger or crew life is threatened – for example, a C-17 flight across the Pacific Ocean with 60 troops on board. This is a situation that warrants an actual pilot onboard, vice a pilot controlling the airframe from a geographically separated location, because it allows the pilot in command to sense any environmental cues indicating an issue during flight (smoke, bird strikes, etc.). Additionally, from a commercial airline safety perspective, it makes the most sense to have a pilot on board to deter hijacking attempts. September 11, 2001 is a day that no American will ever forget. On that fateful day, Al Qaeda operatives hijacked four commercial airliners, flying them into both World Trade Center towers, the Pentagon, and a field in Pennsylvania (it is suspected that this plane was intended to be crashed into either the White House or the Capitol Building). The passengers on board the fourth plane, which tragically crashed in a field in Pennsylvania, attempted to regain control of the aircraft from the hijackers (Schlossberg & Santora, 2014). Sadly, while the unarmed pilots were unable to overpower the hijackers and navigate the plane to safety, they still presented an additional obstacle – the mere presence of a pilot on board, versus a plane with no pilot at all, adds a new layer of difficulty for would-be hijackers on commercial flights.

 Recommendations
One recommendation I would make is for UAS developers to look into the feasibility of incorporating vibrotactile “on-thigh” alerts in ground control stations. A 2011 Human Factors and Ergonomics Society article discusses this possibility. The goals of the study were to “explore the appropriateness of the thigh as a placement to convey vertical directional cues to a seated operator, to assess the thigh as a potential locus for directional orienting, and to assess the effect of the message expressed by the vibrotactile cue (fight or flight) on performance” (Salzer, Oron-Gilad, Ronen, & Parmet, 2011). The authors hypothesized that adding vibrotactile cues to existing anti-collision alert and auditory cues would improve performance in directional decision tasks. The researchers found that vibrotactile cues oriented toward the response direction were preferred to visual cues that pointed in the direction of the hazard to avoid (Salzer et al., 2011). Additionally, they found that the auditory cues in the study (used in current anti-collision alert systems) showed little to no benefit in this experimental design, mostly because auditory alerts tend to take a few seconds before saying the word “collision” – a lag that slows a pilot's reaction time (Salzer et al., 2011). In summary, the researchers believed they had developed a good foundation for future development of on-thigh vibrotactile displays to convey direction in the vertical plane. They recommended that future studies look at two tactile display modes coupled with a greater visual load (to simulate a true flight environment), as well as the use of a common control (like a joystick) that would add more meaning to the vibrotactile cue (i.e., if a vibrotactile cue indicates a collision to the right, the pilot can instinctively move the joystick to the left to avoid it) (Salzer et al., 2011).
Another recommendation I have to mitigate human factors issues in UAS cockpits is to place more emphasis on crew selection. The UAS cockpit is known to be a taxing work environment regardless of the pilot and sensor operator working in it, but certain individuals are predisposed to do better in a task-saturated environment such as a UAS cockpit (or Ground Control Station) than others. My recommendation is to raise the bar for UAS crew screening programs to ensure the most suitable individuals are selected for the job. A 2006 McCarley and Wickens article suggests further research on techniques to understand and facilitate crew communications, with a particular emphasis on inter-crew coordination during the handoff of UAV control from one team of operators to another (McCarley & Wickens, 2006). A real-world example of this is a UAS crew based out of a Ground Control Station at Creech AFB, Nevada handing over control to a crew based out of a Ground Control Station at Kandahar Airfield, Afghanistan. I would not recommend purging the US Air Force inventory of conventional manned reconnaissance planes such as the U-2 Dragon Lady. A September 2014 Air Force Times article discusses the Air Force's decision to start retiring the U-2 fleet in 2015. The fiscal year 2015 budget calls for retiring the U-2 fleet and purchasing RQ-4 Global Hawks to assume the high-altitude intelligence, surveillance and reconnaissance mission. The service was drawn to the Global Hawk in lieu of the aging Dragon Lady due to the former's reduced sustainment costs. The Air Force projects spending $487 million on the development and installation of a “universal payload adapter,” essentially allowing the sensor equipment and cameras from the U-2 to be retrofitted onto Global Hawks (Everstine, 2014).
General Michael Hostage, the commander of Air Combat Command – the Air Force major command that employs the Global Hawk in operations – is well aware of the shortcomings of the RQ-4 in comparison to the U-2. He is quoted as saying, “the Global Hawk does not have the same capability as the U-2, which will mean that the military will suffer once the Dragon Lady is retired” (Everstine, 2014).


 References
Airline Pilots Association (2011). Unmanned Aircraft Systems: Challenges for Safely Operating in the National Airspace System. Retrieved from http://www.alpa.org.

Bidwell, J., Holloway, A., & Davidoff, S. (2014). “Measuring operator anticipatory inputs in response to time-delay for teleoperated human-robot interfaces.” Association for Computing Machinery Journal. 1-4.

 Department of Defense (2011). Unmanned Aerial Systems (UAS) Airspace Integration Plan. Retrieved from http://www.acq.osd.mil/sts/docs/DoD_UAS_Airspace_Integ_Plan_v2_(signed).pdf

Everstine, Brian (2014). Air Combat Command Chief reluctantly accepts Global Hawk over U-2. Air Force Times. Retrieved from http://www.airforcetimes.com/article/20140921/NEWS04/309210028/Air-Combat-Command-chief-reluctantly-accepts-Global-Hawk-over-U-2.

Grier, R. (2008). The redline of workload: Theory, research and design. Panel presented at the 52nd Annual Meeting of the Human Factors and Ergonomics Society, September 22-26, New York.

Human Factors and Ergonomics Society (2014). “Definitions of Human Factors and Ergonomics.” Retrieved from http://www.hfes.org/web/educationalresources/hfedefinitionsmain.html

Lam, T. (2009). “Haptic Interface for UAV Teleoperation.” Retrieved from http://repository.tudelft.nl/view/ir/uuid%3A8de687c6-61da-410d-9550-5a72b020c07c/

Maclean, K. (2008). “Haptic Interaction Design for Everyday Interfaces.” Reviews of Human Factors and Ergonomics, Volume 4. Retrieved from http://web.stanford.edu/class/me327/readings/1-MacLean08-RHFE-Design.pdf

McCarley, J.S., & Wickens, C.D. (2006). “Human Factors Concerns in UAV Flight.” Retrieved from https://www.hf.faa.gov/docs/508/docs/uavFY04Planrpt.pdf

Navon, D. & Gopher, D. (1979). Mental workload: Its theory and measurement. New York: Plenum. 

Northrop Grumman (2014). Global Hawk Capabilities. Retrieved from http://www.northropgrumman.com/Capabilities/GlobalHawk/Pages/default.aspx?utm_source=PrintAd&utm_medium=Redirect&utm_campaign=GlobalHawk+Redirect

Salzer, Y., Oron-Gilad, T., Ronen, A., & Parmet, Y. (2011). “Vibrotactile On-Thigh Alerting System in the Cockpit.” Human Factors: The Journal of the Human Factors and Ergonomics Society. 118-131.

Schlossberg, T. & Santora, M. (2014). “On 9/11 Anniversary, Looking Back and Ahead.” The New York Times. Retrieved from http://www.nytimes.com/2014/09/12/nyregion/new-york-pauses-to-remember-9-11-anniversary.html?_r=0

Thompson, L. (2014). “U-2 vs. Global Hawk: Why Drones Aren’t The Answer To Every Military Need.” Forbes. Retrieved from http://www.forbes.com/sites/lorenthompson/2014/02/20/u-2-vs-global-hawk-why-drones-arent-the-answer-to-every-military-need/

 Wickens, C.D. (2008). “Multiple Resources and Mental Workload,” Human Factors: The Journal of the Human Factors and Ergonomics Society. 449-454.

Wickens, C. D. (2008). “Situational Awareness: Review of Mica Endsley’s 1995 Articles on Situational Awareness Theory and Measurement.” Human Factors: The Journal of the Human Factors and Ergonomics Society. 397-402.

Monday, September 22, 2014

UAS Missions

• Select three platforms capable of performing the mission and obtain an appropriate reference citation for each
• Discuss any considerations relative to the mission and if they correlate to the performance of any related mission execution tasks
• Identify the benefits and challenges associated with performing the particular UAS mission you are highlighting
• Identify and discuss at least two legal and/or ethical challenges to the specific mission you are highlighting

The use of Unmanned Aerial Systems (UAS) has grown worldwide over the past decade and is expected to continue growing at astronomical rates. Presently, there are more military than civilian uses for UAS, mainly due to the military's early adoption of UAS, its funding, and an ease with regulation and certification not yet available to civilian developers (Embry Riddle Aeronautical University, 2014). Unmanned systems are also increasingly used for commercial applications, including remote inspection of pipelines and hydroelectric installations, surveillance of forest fires, observation of critical natural resources, and assessment of natural disasters (Fire Chief, 2014). Drones equipped with both cameras and sensor payloads have been utilized by military and border control agencies for years now in attempts to improve situational awareness. The commercialization of drone technology has made UAS more accessible to the firefighting and emergency response career fields. An August 2013 CNN Money article discusses how drones can revolutionize the fight against wildfires. Fighting wildfires with UAS assists firefighters in gathering information without putting anyone directly into danger – comparable to the benefit derived from their use in a combat zone (Lobosco, 2013).
A March 2014 Fire Chief article lists five drone technologies with the potential to be used for firefighting – ELIMCO's E300, L-3 Communications' Viking 400-S, Information Processing Systems' MCV, Sensefly's eBee, and Kaman's Unmanned Aerial Truck (UAT) (Fire Chief, 2014). For this short paper, I will be discussing L-3 Communications' Viking 400-S, the K-MAX UAS, and ELIMCO's E300.
The Viking 400-S was developed with Autonomous Take-Off and Landing (ATOL) technology enabled by L-3 Unmanned Systems' flightTEK system. Missions are accomplished through GPS waypoint navigation, with waypoints capable of being changed during flight. Some of its key features include 8-to-12-hour mission endurance, a 75-100 lb payload capacity, and a 75+ nautical mile (nmi) range (L-3 Unmanned Systems, 2014). Future applications of the Viking 400-S could include carrying chemical, biological, radiological and nuclear (CBRN) detectors for hazmat emergencies: a first responder could send the Viking to collect information from a hazmat site first, allowing the team to plan adequately before reacting and exposing itself to unnecessary chemical exposure.
The K-MAX UAS is a multi-mission helicopter capable of autonomously or remotely carrying 6,000 pounds of payload, enabling remote delivery of food, water and fuel to individuals on the ground. Its primary mission is battlefield cargo resupply for the United States military, and it flies at higher altitudes than competitor rotary-wing UAS (Lockheed Martin, 2014). The K-MAX could potentially be used to drop off supplies to firefighters and emergency responders on the ground, again without endangering additional personnel.
ELIMCO's E300 is classified as a “light aerial surveillance system,” capable of operating in both day and night missions through electric propulsion.
Some of its specifications include a 2-4 kg payload, a maximum endurance of 2 hours, an operational range of 45 km, and a maximum altitude of 5,000 feet. The system is presently used in the Andalusia region of Spain to assist in wildfire tracking at night. It could easily be used in the United States for wildfire tracking as well; its sturdy, light construction allows for immediate use.
With regards to considerations relative to the mission, the biggest are intended use and cost, and these are directly tied to the execution of mission-related tasks. The three airframes listed above all have different intended purposes in the firefighting realm, and the acquisition of one or all three by a firefighting department would depend on the particular mission and any cost constraints. For example, ELIMCO's E300 is relatively lightweight, portable, and likely inexpensive to produce because of these attributes. However, it can only realistically be used for fire surveillance. If the intent or mission is supply delivery, a firefighting agency would need the financial resources to purchase a UAS like the K-MAX, which is capable of carrying a 6,000 lb payload – conversely, this acquisition would cost the agency considerably more than the E300.
There are several benefits and challenges to utilizing UAS in aerial firefighting and firefighting support functions. The biggest benefit is safety – remotely surveying a fire beforehand enables emergency response forces to gather intelligence on a situation before acting. Furthermore, using UAS to drop supplies off to firefighters on the ground instantly mitigates the threat to the pilot who traditionally would fly a helicopter into the fire area to make the drop. Challenges to utilizing UAS in firefighting include issues with airspace.
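The mission-versus-cost tradeoff discussed above can be sketched as a simple payload filter over the three platforms. The spec figures come from the vendor pages cited in this post (the E300's 4 kg payload converted to roughly 9 lb); the selection helper itself is hypothetical, for illustration only.

```python
# Sketch: matching firefighting UAS to a mission by payload capability.
# Payload/endurance figures are from the vendor pages cited in this post;
# the candidates() helper is a hypothetical illustration.

platforms = {
    "Viking 400-S": {"payload_lb": 100,  "endurance_hr": 12,   "role": "surveillance / CBRN sensing"},
    "K-MAX":        {"payload_lb": 6000, "endurance_hr": None, "role": "cargo resupply"},
    "E300":         {"payload_lb": 9,    "endurance_hr": 2,    "role": "light surveillance"},  # ~4 kg
}

def candidates(min_payload_lb):
    """Return the platforms able to lift at least min_payload_lb."""
    return [name for name, spec in platforms.items()
            if spec["payload_lb"] >= min_payload_lb]

print(candidates(5))    # a small camera package: all three qualify
print(candidates(500))  # bulk supply delivery: only the K-MAX qualifies
```

A real acquisition decision would of course weigh unit cost, sustainment, and airspace approval alongside raw payload, but the same filtering logic applies.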
This is a constant battle between proponents of widespread UAS usage and the Federal Aviation Administration (FAA). Potential collisions between two UAS, or between a UAS and a manned airframe, are of great concern to the FAA, and adequate planning still needs to be done on the airspace end before UAS can even be considered for (1) widespread commercial use or (2) use in firefighting.

Regarding legal or ethical challenges to UAS use in firefighting, I cannot think of any ethical challenges at the moment. For UAS use in a military sense, I could definitely see some ethical issues: is it really right to have the capability to take out enemies without putting oneself at direct risk? What are the boundaries for use? Will Predator and Reaper munitions be dropped with more discretion because they are being remotely operated? The intent of this short post is not to answer these questions; they are simply a surface-level look at some of the ethical concerns with military UAS usage, and I cannot think of any that would arise in a firefighting sense. Legally, the biggest challenges I see are the airspace issues with the FAA, a battle still being fought by proponents of commercial UAS usage. Additionally, I foresee issues arising from any potential mid-air collisions between firefighting UAS and piloted airframes, or between firefighting UAS and other UAS, as well as potential liability if a firefighting UAS crashes into an individual's home and causes damage, injury, or death.

References

CAE (2014). UAS Mission Solutions. Retrieved from http://www.cae.com/defence-and-security/simulation-products-solutions/uas-mission-training-systems/

ELIMCO (2014). Unmanned Aerial Vehicle – E300: Light Aerial Surveillance System. Retrieved from http://www.elimco.com/eng/p_UAV-E300_24.html

Embry Riddle Aeronautical University (2014). ASCI 530 Module 6 Presentation – Civilian and Military Mission-Specific Systems. Retrieved from http://ernie.erau.edu

Fire Chief (2014). 5 Drone Technologies for Firefighting. Retrieved from http://www.firechief.com/2014/03/20/5-drone-technologies-firefighting/

KAMAN (2014). Unmanned Aerial Systems. Retrieved from http://www.kaman.com/aerospace/aerosystems/air-vehicles-mro/products-services/unmanned-aerial-systems/

Lockheed Martin (2014). KMAX. Retrieved from http://www.lockheedmartin.com/us/products/kmax.html

L-3 Unmanned Systems (2014). Viking 400-S. Retrieved from http://www2.l-3com.com/uas/products/r_viking_400.htm

Lobosco, K. (2013). "Drones can change the fight against wildfires." CNN Money. Retrieved from http://money.cnn.com/2013/08/19/technology/innovation/fire-fighting-drones/
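The mission-versus-cost tradeoff discussed above can be sketched as a simple selection problem. This is purely illustrative: the role assignments and relative cost figures below are my own assumptions, not vendor data; only the approximate E300 and K-MAX payload classes come from the post.

```python
# Hypothetical sketch: picking a firefighting UAS by mission, payload, and budget.
# Roles and relative cost units are assumptions for illustration only.
CANDIDATES = [
    {"name": "ELIMCO E300", "roles": {"surveillance"},
     "payload_lb": 9, "unit_cost": 1},       # ~2-4 kg payload per the post
    {"name": "L-3 Viking 400-S", "roles": {"surveillance", "mapping"},
     "payload_lb": 100, "unit_cost": 5},     # payload/roles assumed
    {"name": "K-MAX", "roles": {"supply delivery"},
     "payload_lb": 6000, "unit_cost": 50},   # 6,000 lb payload per the post
]

def select_airframe(mission, required_payload_lb, budget):
    """Return the cheapest candidate that supports the mission within budget."""
    feasible = [c for c in CANDIDATES
                if mission in c["roles"]
                and c["payload_lb"] >= required_payload_lb
                and c["unit_cost"] <= budget]
    return min(feasible, key=lambda c: c["unit_cost"], default=None)
```

For a surveillance-only mission on a small budget the E300 wins; a supply-delivery mission forces the much costlier K-MAX, mirroring the tradeoff described above.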

Thursday, September 4, 2014

Monitoring UAS in the National Airspace System (NAS)

How can the separation of unmanned aircraft be monitored and maintained (among other unmanned aircraft and manned aircraft) in the National Airspace System (NAS)? What considerations need to be made for varying sizes (i.e., Group 1 to 5) and airframes of UAS (e.g., fixed-wing, rotary-wing, and lighter than air)? What technology is currently employed by manned aircraft, and is it adaptable for use with unmanned aircraft?

How can the separation of unmanned aircraft be monitored and maintained in the NAS? This question is still being asked, and unmanned aircraft developers are continuing to evaluate the feasibility of integration (Embry Riddle Aeronautical University, 2014). Communications between the ground control station (GCS) and the unmanned aircraft are complex enough on their own: a typical UAS communication system is composed of an antenna, transmitter, receiver, transceiver, and multiplexer. A typical communications exchange consists of an uplink (UAV control) and a downlink (UAV status); the downlink typically comprises the following sections: header, inertial measurement unit (IMU) readings, flight control values, power, warnings, communications, payload, GPS, and sense and avoid (Embry Riddle Aeronautical University, 2014). A great deal of the communications system, on board and on the ground, is dedicated solely to the exchange between the GCS and the UAV; as mentioned above, UAV developers are still examining the feasibility of integration into the NAS.

The Department of Defense (DoD) Airspace Integration Plan classifies UAS into five groups. Group 1 UAS are hand-launched, self-contained portable systems employed for small-unit or base security. Group 2 UAS are typically small to medium in size and support intelligence, surveillance, and reconnaissance (ISR) requirements.
Group 3 UAS operate at medium altitude with medium- to long-range endurance. Group 4 UAS are relatively large, operate at medium to high altitudes, and have extended range and endurance. Group 5 UAS are the largest; they operate at medium to high altitudes and have the greatest range, endurance, and airspeed capabilities (Department of Defense, 2011).

With regard to integration into the NAS, UAS group is definitely worth considering, since each group occupies a different level or tier of airspace. The challenge is not just incorporating UAS into the NAS as a whole, but integrating each individual UAS group into the appropriate level of airspace. Another concern is the de-confliction of airspace with rotary-wing airframes (which typically fly lower than fixed-wing airframes) and lighter-than-air airframes. At present, UAS operation, with just a few exceptions, is limited to military airspace, meaning ranges owned and controlled by the military or zones of military conflict (Austin, 2010), such as Afghanistan or Iraq. The biggest concerns with integrating UAS are the potential for collisions between UAVs and manned airframes, or between UAVs, and the simple fact that there is no dedicated airspace for UAVs.

A 2014 NASA project titled "Unmanned Aircraft Systems Airspace Operations Challenge" tasks competitors with developing key technologies that will make UAS integration into the NAS feasible. Before UAVs can safely operate in the same airspace as other UAVs or manned airframes, we need to ensure that the operators as well as the unmanned airframes can successfully "sense and avoid" other traffic. The competition is split into two phases; the first focuses on important aspects of safe airspace operations, including separation assurance, 4-D trajectories, ground control operations, and uncooperative air traffic detection.
It is aimed at encouraging competitors to get a head start on developing skills for phase two of the competition (NASA, 2014).

With regard to existing technology, manned aircraft currently operate with transponders or Automatic Dependent Surveillance–Broadcast (ADS-B), as well as the Airborne Collision Avoidance System (ACAS). These systems help manned aircraft identify other aircraft in the airspace and aid in collision avoidance (Airline Pilots Association, 2011). In my opinion, these systems are definitely adaptable for UAS use, but adaptability is not the obstacle to their implementation. The issue at hand is that systems cannot completely take the place of a human operator. While there may be a system on board a UAV to detect other airframes in the airspace, it is impossible for a UAS to react to unannounced malfunctions. There are simply things that a pilot can sense on board, be it through smell (smoke indicating a flight issue) or touch (vibrations; haptic feedback), that onboard systems cannot detect and relay to the human operator on the ground in a timely manner.

References

Airline Pilots Association (2011). Unmanned Aircraft Systems: Challenges for Safely Operating in the National Airspace System. Retrieved from http://www.alpa.org

Austin, R. (2010). Unmanned Aircraft Systems: UAVs Design, Development, and Deployment. Chichester, U.K.: John Wiley & Sons Ltd.

Department of Defense (2011). Unmanned Aerial Systems (UAS) Airspace Integration Plan. Retrieved from http://www.acq.osd.mil/sts/docs/DoD_UAS_Airspace_Integ_Plan_v2_(signed).pdf

Embry Riddle Aeronautical University (2014). ASCI 530 Module 4 Presentation – Command, Control, and Communications (C3) Systems. Retrieved from http://ernie.erau.edu

NASA (2014). Unmanned Aircraft Systems Airspace Operations Challenge (UAS AOC). Retrieved from http://www.nasa.gov/directorates/spacetech/centennial_challenges/uas/
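The downlink structure and separation question discussed above can be made concrete with a minimal sketch. The frame fields mirror the sections listed in the Module 4 material, but the types are my own guesses, and the 5 NM / 1,000 ft thresholds are a common en-route baseline for manned aircraft, not an established UAS separation standard.

```python
# Hedged sketch: a simplified downlink telemetry frame plus a pairwise
# separation check. Field types and separation minima are assumptions.
import math
from dataclasses import dataclass

@dataclass
class DownlinkFrame:
    header: int
    imu: tuple            # (roll, pitch, yaw) in degrees
    flight_controls: tuple
    power_pct: float
    warnings: list
    comms_ok: bool
    payload_status: str
    gps: tuple            # (lat_deg, lon_deg, alt_ft)
    sense_avoid_ok: bool

EARTH_RADIUS_NM = 3440.065  # mean Earth radius in nautical miles

def horizontal_separation_nm(a, b):
    """Great-circle distance between two (lat, lon) points via haversine."""
    lat1, lon1, lat2, lon2 = map(math.radians, (a[0], a[1], b[0], b[1]))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_NM * math.asin(math.sqrt(h))

def separation_alert(f1, f2, min_horizontal_nm=5.0, min_vertical_ft=1000.0):
    """Flag a pair of aircraft that lose both horizontal and vertical separation."""
    horiz = horizontal_separation_nm(f1.gps[:2], f2.gps[:2])
    vert = abs(f1.gps[2] - f2.gps[2])
    return horiz < min_horizontal_nm and vert < min_vertical_ft
```

A real monitoring function would run this check over every pair of cooperative targets reported via downlink or ADS-B; the point here is only that the GPS section of each downlink frame carries enough state for a ground-side separation monitor.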

Saturday, August 16, 2014

About Me

I'm a First Lieutenant in the US Air Force currently deployed to Afghanistan and enrolled in ASCI 530 through Embry Riddle Aeronautical University. I'm stationed near Boston, MA stateside and enjoying my time in the service so far. Excited about this class and this blog, hope you all enjoy reading it!

_________________________________________________________________________________

What is the purpose of this blog, you might ask? I'm hoping to explore different uses for Unmanned Aerial Systems (UAS) in the world we function in today. UAS to deliver Chinese take-out and pizza? Why not? What about using them in the forestry and fire-prevention industry? Law enforcement? This blog will seek to explore these among several other uses of UAS in our everyday operations.

UAS Post 1

Identify an example of an early UAS design (historical, pre-1970s) and compare to a contemporary design (current, 2000s+) that evolved from the early design. Discuss how the two systems are similar, how they differ, and design changes that occurred as the system evolved. What new technology might influence future evolution of the design or system capability?

Army SD-1

            In the mid-1950s, the Army started experimenting with equipping target drones with small cameras for battlefield reconnaissance, developing the SD-1 (Surveillance Drone) from the Radioplane RP-71 target drone. The SD-1 carried either a KA-20A daylight camera capable of taking 95 photos or a KA-39A infrared night camera that could take 10. The drone was launched with rocket-assisted takeoff (RATO) and tracked by radar, and the pilot on the ground controlled it through radio commands over a standard 30-minute flight. The SD-1 combined with its equipment was designated the AN/USD-1 and is lauded as the world's first successful surveillance Unmanned Aerial Vehicle (UAV) (Zaloga, 2008). The SD-1 served as a cornerstone for many of the technologies used in following years and as a precursor for present-day reconnaissance UAVs.

US Air Force RQ-4 Global Hawk

            The present-day iteration of the Army SD-1 drone is Northrop Grumman's RQ-4 Global Hawk, a high-altitude, long-endurance UAV responsible for intelligence, surveillance, and reconnaissance (ISR) missions. It is capable of flying at up to 65,000 feet for up to 35 hours at speeds nearing 340 knots, and it can process imagery for an area the size of Illinois in a single mission (Northrop Grumman, 2014). More specifically, the Global Hawk is equipped with an extremely sophisticated sensor suite: the Enhanced Integrated Sensor Suite (EISS) can pinpoint stationary and moving targets with unrivaled accuracy and transmit exceptionally clear imagery and position information from 60,000 feet. The EISS integrates a synthetic aperture radar (SAR) antenna with a ground moving target indicator (GMTI), a high-resolution electro-optical (EO) digital camera, and an infrared (IR) sensor (Raytheon, 2014).
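As a quick sanity check on those figures, a back-of-envelope calculation (ignoring climb, descent, winds, and loiter time, so this overstates practical range) shows what the quoted speed and endurance imply for still-air distance:

```python
# Back-of-envelope still-air distance from the quoted Global Hawk figures.
# 1 knot = 1 nautical mile per hour, so distance = speed * time.
speed_kt = 340      # approximate maximum speed, knots
endurance_h = 35    # approximate endurance, hours

distance_nm = speed_kt * endurance_h
print(distance_nm)  # 11900 nautical miles
```

That is on the order of half the Earth's circumference, which is consistent with the airframe's intercontinental ferry missions.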

SD-1 vs RQ-4

            The SD-1 and RQ-4 are similar in intent and purpose: both systems are designed for reconnaissance missions. The early stages of the SD-1 were geared toward use with guided missiles, but the system eventually evolved into a reconnaissance-focused mission, further solidified by the addition of the KA-20A/KA-39A cameras (Zaloga, 2008). That capability has been further enhanced in the RQ-4 Global Hawk, which has logged more than 8,000 combat hours conducting ISR missions (Northrop Grumman, 2014). While similar in mission, the SD-1 and RQ-4 differ in actual operation: the SD-1 was launched using rocket-assisted takeoff, while the RQ-4 is powered by a Rolls-Royce AE3007H turbofan engine (Northrop Grumman, 2014), a stark contrast to the SD-1's far simpler propulsion.

            As with any system that aspires to mature and grow in ease of use and popularity, shrinking the Global Hawk's capability into a smaller operational footprint could definitely influence future design evolution and system capability. This reduced footprint applies to the airborne RQ-4 itself as well as to the mission control element (MCE) on the ground, both at home station and in the deployed environment. Furthermore, reducing data lag time would also help the future evolution of the system, as it would ease the human factors issues inherent in any UAV operation. One of the biggest issues with UAV operation is the responsiveness of the system: the human operator makes a control input on the ground, and it takes a couple of seconds for the system to respond. This is something that we as human operators have to be willing to accept when remotely operating aircraft; however, it is not a reason to be complacent about the issue either. Effort could be directed toward the system programming to reduce the lag time in UAV operation.
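The lag described above is essentially a transport delay in the control loop. A minimal sketch (purely illustrative; the tick granularity and delay length are made-up values, not measured link figures) models it as a fixed-length pipeline between the operator's input and the vehicle's response:

```python
from collections import deque

def simulate_link_delay(commands, delay_steps):
    """Apply each operator command to the vehicle `delay_steps` ticks late,
    a crude model of GCS-to-UAV link latency."""
    pipeline = deque([None] * delay_steps)  # commands still "in flight"
    applied = []
    for cmd in commands:
        pipeline.append(cmd)                 # operator issues a command
        applied.append(pipeline.popleft())   # vehicle sees the delayed one
    return applied

# With a 2-tick delay, the vehicle responds to nothing for two ticks,
# then starts executing commands two ticks behind the operator.
simulate_link_delay(["climb", "turn", "level"], 2)
# -> [None, None, "climb"]
```

Even this toy model makes the human factors point clear: the operator must anticipate the vehicle's state two ticks ahead, which is exactly the skill and programming burden the paragraph above argues should be reduced.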



References

Northrop Grumman (2014). RQ-4 Block 10 Global Hawk. Retrieved from http://www.northropgrumman.com/Capabilities/RQ4Block10GlobalHawk/Pages/default.aspx

Northrop Grumman (2014) RQ-4 Global Hawk: High Altitude, Long-Endurance Unmanned Aerial Reconnaissance System Fact Sheet. Retrieved from http://www.northropgrumman.com/Capabilities/RQ4Block10GlobalHawk/Documents/HALE_Factsheet.pdf

Raytheon (2014). Global Hawk Enhanced Integrated Sensor Suite. Retrieved from http://www.raytheon.com/capabilities/products/globalhawk_iss/


Zaloga, S.J. (2008). Unmanned Aerial Vehicles: Robotic Air Warfare 1917-2007. Oxford, U.K.: Osprey Publishing.