<nettime> Pascal Zachary: Rules for the Digital Panopticon (IEEE)
----- Forwarded message from Patrice Riemens <patrice@xs4all.nl> -----

Date: Thu, 10 Oct 2013 20:41:55 +0200
From: Patrice Riemens <patrice@xs4all.nl>
To: nettime-l@kein.org
Subject: <nettime> Pascal Zachary: Rules for the Digital Panopticon (IEEE)
Message-ID: <0adf8f7abff38f778a06f8b776729759.squirrel@webmail.xs4all.nl>
User-Agent: SquirrelMail/1.4.18
Reply-To: a moderated mailing list for net criticism <nettime-l@mail.kein.org>

original to: http://spectrum.ieee.org/computing/software/rules-for-the-digital-panopticon

Rules for the Digital Panopticon
The technologies of persistent surveillance can protect us only if certain boundaries are respected
By G. Pascal Zachary (Posted 20 Sep 2013)

For centuries, we humans have lacked the all-knowing, all-seeing mechanisms to credibly predict and prevent bad actions by others. Now these very powers of preemption are perhaps within our grasp, thanks to a confluence of technologies.

In the foreseeable future, governments, and perhaps some for-profit corporations and civil-society groups, will design, construct, and deploy surveillance systems that aim to predict and prevent bad actions and to identify, track, and neutralize people who commit them.

And when contemplating these systems, let's broadly agree that we should prevent the slaughter of children at school and the abduction, rape, and imprisonment of women. And let's also agree that we should thwart lethal attacks against lawful government.

Of late, the U.S. government gets most of the attention in this arena, and for good reason. The National Security Agency, through its vast capacity to track virtually every phone call, e-mail, and text message, promises new forms of preemption through a system security experts call persistent surveillance. The Boston Marathon bombing, in April, reinforced the impression that guaranteed prevention against unwanted harm is elusive, if not impossible.
Yet the mere chance of stopping the next mass shooting or terror attack persuades many people of the benefits of creating a high-tech version of the omniscient surveillance construct that, in 1787, the British philosopher Jeremy Bentham conceived as a panopticon: a prison with a central viewing station for watching all the inmates at once.

Some activists complain about the potential of such a system to violate basic freedoms, including the right to privacy. But others will be seduced by the lure of techno-fixes. For example, how could anyone object to a digital net that protects a school from abusive predators?

Ad hoc surveillance will inevitably proliferate. Dropcam and other cheap surveillance programs, already popular among the tech-savvy, will spread widely. DIY and vigilante panopticons will complicate matters. Imagine someone like George Zimmerman, the Florida neighborhood watchman, equipped not with a gun but with a digital surveillance net, allowing him to track pretty much anything on his smartphone.

With data multiplying exponentially and technology inexorably advancing, the question is not whether all-encompassing surveillance systems will be deployed. The question is how, when, and how many.

In the absence of settled laws and norms, the role of engineers looms large. They will shoulder much of the burden of designing systems in ways that limit the damage to innocents while maximizing the pressures brought to bear on bad guys. But where do the responsibilities of engineers begin and end? It is too early to answer conclusively, but engineers would do well to keep a few fundamental principles in mind:

Keep humans in the loop, but insist they follow the rules of the road. Compiling and analyzing data can be done by machines. But it would be best to design these surveillance systems so that a human reviews and ponders the data before any irreversible actions are taken.
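[Forwarder's note: the human-in-the-loop principle above can be sketched in a few lines of code. This is not from the article; the class and method names are hypothetical and purely illustrative. The point is structural: automated analysis may flag events, but nothing irreversible executes until a person has explicitly signed off.]

```python
from dataclasses import dataclass

@dataclass
class FlaggedEvent:
    """An event the automated classifier has flagged for attention."""
    description: str
    approved: bool = False  # set only by a human reviewer, never by the machine

class HumanInTheLoop:
    """Machines compile and flag; a person must sign off before
    anything irreversible happens."""

    def __init__(self):
        self.queue: list[FlaggedEvent] = []

    def flag(self, description: str) -> FlaggedEvent:
        # Automated stage: record the event and hold it for review.
        event = FlaggedEvent(description)
        self.queue.append(event)
        return event

    def human_approve(self, event: FlaggedEvent) -> None:
        # Human stage: the only path by which an event becomes actionable.
        event.approved = True

    def act(self, event: FlaggedEvent) -> str:
        # Irreversible action is gated on prior human approval.
        if not event.approved:
            return "held for human review"
        return "action taken"
```

[The design choice is that `act` has no way to bypass the gate: the machine can queue, but only `human_approve` unlocks action.]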
If citizens want to spy on one another, as they inevitably will, impose binding rules on how they do so.

Design self-correcting systems that eject tainted or wrong information quickly and inexpensively.

Create a professional ethos and explicit standards of behavior for engineers, code writers, and designers who contribute significantly to the creation of panopticon-like systems.

Delete the old stuff routinely. Systems should mainly contain real-time data. They should not become archives tracing the lives of innocents.

Engineers acting responsibly are no guarantee that panopticons will not come to control us. But they can be part of getting this brave new world right.

About the Author
G. Pascal Zachary is the author of Endless Frontier: Vannevar Bush, Engineer of the American Century (Free Press, 1997). He teaches at Arizona State University.

# distributed via <nettime>: no commercial use without permission
# <nettime> is a moderated mailing list for net criticism,
# collaborative text filtering and cultural politics of the nets
# more info: http://mx.kein.org/mailman/listinfo/nettime-l
# archive: http://www.nettime.org contact: nettime@kein.org

----- End forwarded message -----

--
Eugen* Leitl leitl http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
And when contemplating these systems, let's broadly agree that we should prevent the slaughter of children at school and the abduction, rape, and imprisonment of women. And let's also agree that we should thwart lethal attacks against lawful government.
Sorry, but I can't agree with that last statement. "Lawful government"? Which government would be willing to admit that it isn't 'lawful'? The only government I would consider 'lawful' is one which complies with libertarianism's non-initiation of force/fraud principle (NIOFFP), but since I am aware of no such government, I cannot agree that this statement has any practical purpose. And while I might wryly agree with it, it would only be on the condition that all employees and officeholders of real (NIOFFP-non-compliant) governments surrender, resign, and return every penny of money paid to them for their 'services', back to 'day 1' of their employment, and, of course, compensate all victims of that government for their damages and suffering. Indoctrinated with the idea that they had the right to do what they did, I doubt that any of them would comply.
participants (2)
- Eugen Leitl
- Jim Bell