[drone-list] Defense Science Board urges greater autonomy for unmanned systems
Gregory Foster
gfoster at entersection.org
Tue Sep 11 12:09:31 PDT 2012
Wired Danger Room (Sep 11) - "The Pentagon Doesn't Trust Its Own
Robots":
[1]http://www.wired.com/dangerroom/2012/09/robot-autonomy/
Follow-up on the DSB report by Spencer Ackerman [ [2]@attackerman ]
featuring commentary by Brookings' Peter W. Singer. At Brookings,
Singer currently supervises [3]nonresident fellow Noah Shachtman, the
editor of Wired's Danger Room blog.
gf
On 9/6/12 10:40 AM, Gregory Foster wrote:
Secrecy News (Sep 6) - "Greater Autonomy for Unmanned Military Systems
Urged" (cited in full)
[4]http://www.fas.org/blog/secrecy/2012/09/dsb_autonomy.html
The Department of Defense should focus on increasing the autonomy of
drones and other unmanned military systems, a new report from the
Defense Science Board said.
DoD should "more aggressively use autonomy in military missions," the
Board report said, because currently "autonomy technology is being
underutilized." See [5]"The Role of Autonomy in DoD Systems," Defense
Science Board, dated July 2012 and released last week.
"Autonomy" in this context does not mean "computers making independent
decisions and taking uncontrolled action." The Board is not calling
for the immediate development of [6]Skynet at this time. Rather,
autonomy refers to the automation of a particular function within
programmed limits. "It should be made clear that all autonomous
systems are supervised by human operators at some level," the [7]report
stressed.
Increased autonomy for unmanned military systems "can enable humans to
delegate those tasks that are more effectively done by computer... thus
freeing humans to focus on more complex decision making."
"However, the true value of these systems is not to provide a direct
human replacement, but rather to extend and complement human capability
by providing potentially unlimited persistent capabilities, reducing
human exposure to life threatening tasks, and with proper design,
reducing the high cognitive load currently placed on
operators/supervisors."
But all of that is easier said than done.
"Current designs of autonomous systems, and current design methods for
increasing autonomy, can create brittle platforms" that are subject to
irreversible error. There are also "new failure paths associated with
more autonomous platforms, which has been seen in friendly fire
fatalities.... This brittleness, which is resident in many current
designs, has severely retarded the potential benefits that could be
obtained by using advances in autonomy."
The Defense Science Board [8]report discusses the institutional
challenges confronting a move toward increasing autonomy, including the
obstacles posed by proprietary software. It offers an extended
discussion of conflict scenarios in which the enemy employs its own
autonomous systems against U.S. forces. The authors describe China's
"alarming" investment in unmanned systems, and encourage particular
attention to the relatively neglected topic of the vulnerability of
unmanned systems.
The [9]report includes some intriguing citations, such as a volume on
[10]"Governing Lethal Behavior in Autonomous Robots," and presents
numerous incidental observations of interest. For example:
"Big data has evolved as a major problem at the National Geospatial
Intelligence Agency (NGA). Over 25 million minutes of full motion
video are stored at NGA."
But new sensors will produce "exponentially more data" than full motion
video, and will overwhelm current analytical capabilities.
"Today nineteen analysts are required per UAV orbit [i.e. per 24 hour
operational cycle]. With the advent of Gorgon Stare, ARGUS, and other
Broad Area Sensors, up to 2,000 analysts will be required per orbit."
The government "can't hire enough analysts or buy enough equipment to
close these gaps."
HT [11]@saftergood,
gf
--
Gregory Foster || [12]gfoster at entersection.org
@gregoryfoster <> [13]http://entersection.com/
References
1. http://www.wired.com/dangerroom/2012/09/robot-autonomy/
2. http://twitter.com/attackerman
3. http://www.brookings.edu/experts/shachtmann.aspx
4. http://www.fas.org/blog/secrecy/2012/09/dsb_autonomy.html
5. http://www.fas.org/irp/agency/dod/dsb/autonomy.pdf
6. http://en.wikipedia.org/wiki/Skynet_%28Terminator%29
7. http://www.fas.org/irp/agency/dod/dsb/autonomy.pdf
8. http://www.fas.org/irp/agency/dod/dsb/autonomy.pdf
9. http://www.fas.org/irp/agency/dod/dsb/autonomy.pdf
10. http://books.google.com/books?id=rIsJ_QXDdEUC&printsec=frontcover#v=onepage&q&f=false
11. http://twitter.com/saftergood
12. mailto:gfoster at entersection.org
13. http://entersection.com/
_______________________________________________
drone-list mailing list
drone-list at lists.stanford.edu
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A 7779 75B0 2443 8B29 F6BE