The Department of Defense should focus on increasing the autonomy of drones and other unmanned military systems, a new report from the Defense Science Board said.
DoD should “more aggressively use autonomy in military missions,” the report said, because currently “autonomy technology is being underutilized.” See “The Role of Autonomy in DoD Systems,” Defense Science Board, dated July 2012 and released last week.
“Autonomy” in this context does not mean “computers making independent decisions and taking uncontrolled action.” The Board is not, in other words, calling for the development of Skynet. Rather, autonomy refers to the automation of a particular function within programmed limits. “It should be made clear that all autonomous systems are supervised by human operators at some level,” the report stressed.
Increased autonomy for unmanned military systems “can enable humans to delegate those tasks that are more effectively done by computer… thus freeing humans to focus on more complex decision making.”
“However, the true value of these systems is not to provide a direct human replacement, but rather to extend and complement human capability by providing potentially unlimited persistent capabilities, reducing human exposure to life threatening tasks, and with proper design, reducing the high cognitive load currently placed on operators/supervisors.”
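To make the distinction concrete, here is a minimal sketch of what “automation of a particular function within programmed limits” under human supervision can look like. It is not drawn from the report; the task, names, and threshold values are invented for illustration only.

```python
# Illustrative sketch of "supervised autonomy": the software handles the
# routine case within programmed limits and refers everything else to a
# human operator. The task, names, and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class TrackTask:
    """A routine task delegated to software: keep a sensor on a target."""
    target_bearing_deg: float   # where the target appears to be
    confidence: float           # 0.0 - 1.0, how sure the tracker is

# Programmed limits: the software may act on its own only inside these bounds.
MIN_CONFIDENCE = 0.90
MAX_SLEW_DEG = 15.0

def handle(task: TrackTask, current_bearing_deg: float) -> str:
    """Automate the routine case; escalate anything outside the limits."""
    slew = abs(task.target_bearing_deg - current_bearing_deg)
    if task.confidence >= MIN_CONFIDENCE and slew <= MAX_SLEW_DEG:
        # Within programmed limits: the machine handles it on its own.
        return f"auto: slewing {slew:.1f} deg to maintain track"
    # Outside limits: the human operator remains the supervisor and decides.
    return (f"escalate: operator decision required "
            f"(confidence={task.confidence:.2f}, slew={slew:.1f} deg)")

if __name__ == "__main__":
    print(handle(TrackTask(target_bearing_deg=102.0, confidence=0.97), current_bearing_deg=95.0))
    print(handle(TrackTask(target_bearing_deg=140.0, confidence=0.55), current_bearing_deg=95.0))
```

The point of the pattern is the one the report makes: the machine takes the high-volume, routine decisions, while anything outside its programmed bounds lands back with a human.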
But all of that is easier said than done.
“Current designs of autonomous systems, and current design methods for increasing autonomy, can create brittle platforms” that are subject to irreversible error. There are also “new failure paths associated with more autonomous platforms, which has been seen in friendly fire fatalities…. This brittleness, which is resident in many current designs, has severely retarded the potential benefits that could be obtained by using advances in autonomy.”
The Defense Science Board report discusses the institutional challenges confronting a move toward greater autonomy, including the obstacles posed by proprietary software. It offers an extended discussion of conflict scenarios in which an enemy employs its own autonomous systems against U.S. forces. The authors describe China’s “alarming” investment in unmanned systems, and urge particular attention to the relatively neglected topic of such systems’ vulnerability.
The report includes some intriguing citations, such as a volume on “Governing Lethal Behavior in Autonomous Robots,” and presents numerous incidental observations of interest. For example:
“Big data has evolved as a major problem at the National Geospatial Intelligence Agency (NGA). Over 25 million minutes of full motion video are stored at NGA.”
But new sensors will produce “exponentially more data” than full motion video, and will overwhelm current analytical capabilities.
“Today nineteen analysts are required per UAV orbit [i.e., per 24-hour operational cycle]. With the advent of Gorgon Stare, ARGUS, and other Broad Area Sensors, up to 2,000 analysts will be required per orbit.”
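The arithmetic behind those figures is worth spelling out. The rough calculation below uses only the numbers quoted above, plus the stated assumption of real-time viewing, to show the scale involved.

```python
# Back-of-envelope arithmetic on the figures quoted above. Only the 25 million
# minutes, 19 analysts per orbit, and 2,000 analysts per orbit come from the
# report; real-time viewing is an assumption made here purely for scale.

MINUTES_STORED = 25_000_000             # full motion video held at NGA (quoted)
ANALYSTS_PER_ORBIT_TODAY = 19           # quoted
ANALYSTS_PER_ORBIT_BROAD_AREA = 2_000   # quoted upper bound

# How long would it take one person to watch the archive at real-time speed?
minutes_per_year = 60 * 24 * 365
viewing_years = MINUTES_STORED / minutes_per_year
print(f"Stored FMV is roughly {viewing_years:.0f} years of continuous real-time viewing")

# How much does the analyst requirement grow per orbit with broad-area sensors?
growth = ANALYSTS_PER_ORBIT_BROAD_AREA / ANALYSTS_PER_ORBIT_TODAY
print(f"Broad-area sensors imply roughly a {growth:.0f}x increase in analysts per orbit")
```

In other words, the stored video alone amounts to nearly half a century of continuous viewing, and broad-area sensors would multiply the per-orbit analyst requirement by roughly a factor of a hundred.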
The government “can’t hire enough analysts or buy enough equipment to close these gaps.”
Source: FAS