Re: [funsec] Re: AT&T's database of 1.92 trillion phone calls (Sprint does it too, and i'm sure they aren't the only ones)
On 2/27/06, Brian Loe <knobdy@gmail.com> wrote:
... This kind of thing doesn't scare me. What they wind up doing with it, at times, does scare me. What scares me even more is that no one will ever do anything about it on a scale that matters.
i'd have to agree, with regard to corporate or government entities making strong individual privacy a priority _of their own accord_ with this kind of applied information technology.[1] so the only feasible solution is empowering users to take responsibility for their own information security and privacy. if "johnny can't encrypt"[2] this is a very tall order indeed[3].

what would the ideal minimum amount of information exposed consist of if you could apply usable security/encryption and privacy enhancing technologies to the usual communications today (voice, text, video, data)?

- no content of payloads, due to end to end encryption
- strong anonymous mix networks for non-interactive messaging
- weakly anonymous low latency onion/relay networks for near real time messaging
- seamless wireless and sneaker net support to offload locally/out of band whenever possible

you'd still be exposing:

- location of endpoints used (except in the strong and latent mix scenario, perhaps)
- distinct parties involved (social network analysis)
- volume of encrypted traffic exchanged

i suppose the real question is how long it would take to design and implement (and for the hardware to support it to become prevalent for all users). 5 years seems extremely optimistic given the difficulties involved. [and i suppose this also means the paranoid will all become proficient TSCM technicians.] ah, we can dream :)

until then, the fraction of unusual end lusers making use of strong privacy enhancing technologies will be a function of how annoying they are to use vs. how annoying the government privacy invasion programs become. single digits for the near future...

---
[1.] "DoJ strikes back against Google (your privacy concerns are unfounded (lol))"
http://www.theinquirer.net/?article=29918
[2.] "NPR : E-Mail Encryption Rare in Everyday Use"
http://www.mail-archive.com/cryptography@metzdowd.com/msg05769.html
[3.]
" User Interaction Design for Secure Systems" http://www.cs.berkeley.edu/~pingster/sec/uid/ MANDATORY REQUIREMENTS:: A. Path of Least Resistance. The most natural way to do any task should also be the most secure way. B. Appropriate Boundaries. The interface should expose, and the system should enforce, distinctions between objects and between actions along boundaries that matter to the user. C. Explicit Authorization. A user's authorities must only be provided to other actors as a result of an explicit user action that is understood to imply granting. D. Visibility. The interface should allow the user to easily review any active actors and authority relationships that would affect security-relevant decisions. E. Revocability. The interface should allow the user to easily revoke authorities that the user has granted, wherever revocation is possible. F. Trusted Path. The interface must provide an unspoofable and faithful communication channel between the user and any entity trusted to manipulate authorities on the user's behalf. G. Identifiability. The interface should enforce that distinct objects and distinct actions have unspoofably identifiable and distinguishable representations.
Fwd: discussion on enabling and motivating end users to assume responsibility for their own information security/privacy over the communication and computing resources they use.

can we go ahead and state as fact that a capability model tied to a pet name pattern / sticky note metaphor is required for strong least privilege, which in turn is mandatory for the secure user interface / interaction requirements mentioned below? [if you don't think caps and pet names should be mandatory, can you provide a reasonable explanation of how key based capabilities and pet names are less secure than the alternative you are describing?]

wow, that's a lot of work to describe in detail (design), let alone even attempt to implement. (at least if you designed and implemented it right once, you should never need to implement it again)

---------- Forwarded message ----------
From: coderman <coderman@gmail.com>
Date: Feb 27, 2006 9:05 AM
Subject: Re: [funsec] Re: AT&T's database of 1.92 trillion phone calls (Sprint does it too, and i'm sure they aren't the only ones)
To: Brian Loe <knobdy@gmail.com>
Cc: funsec@linuxbox.org, cypherpunks@jfet.org

On 2/27/06, Brian Loe <knobdy@gmail.com> wrote:
[forwarded message body trimmed; identical to the message quoted in full above]