CRYPTO-GRAM

July 15, 2007

by Bruce Schneier
Founder and CTO
BT Counterpane
schneier@schneier.com
http://www.schneier.com
http://www.counterpane.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <http://www.schneier.com/crypto-gram-0707.html>. These same essays appear in the "Schneier on Security" blog: <http://www.schneier.com/blog>. An RSS feed is available.

** *** ***** ******* *********** *************

In this issue:
Correspondent Inference Theory and Terrorism
TSA and the Sippy Cup Incident
News
Ubiquity of Communication
4th Amendment Rights Extended to E-Mail
Credit Card Gas Limits
Schneier/BT Counterpane News
Designing Voting Machines to Minimize Coercion
Risks of Data Reuse
Comments from Readers

** *** ***** ******* *********** *************

Correspondent Inference Theory and Terrorism

Two people are sitting in a room together: an experimenter and a subject. The experimenter gets up and closes the door, and the room becomes quieter. The subject is likely to believe that the experimenter's purpose in closing the door was to make the room quieter.

This is an example of correspondent inference theory. People tend to infer the motives -- and also the disposition -- of someone who performs an action based on the effects of his actions, and not on external or situational factors. If you see someone violently hitting someone else, you assume it's because he wanted to -- and is a violent person -- and not because he's play-acting. If you read about someone getting into a car accident, you assume it's because he's a bad driver and not because he was simply unlucky. And -- more importantly for this column -- if you read about a terrorist, you assume that terrorism is his ultimate goal.

It's not always this easy, of course. If someone chooses to move to Seattle instead of New York, is it because of the climate, the culture or his career? Edward Jones and Keith Davis, who advanced this theory in the 1960s and 1970s, proposed a theory of "correspondence" to describe the extent to which this effect predominates. When an action has a high correspondence, people tend to infer the motives of the person directly from the action: e.g., hitting someone violently. When the action has a low correspondence, people tend not to make the assumption: e.g., moving to Seattle.

Like most cognitive biases, correspondent inference theory makes evolutionary sense. In a world of simple actions and base motivations, it's a good rule of thumb that allows a creature to rapidly infer the motivations of another creature. (He's attacking me because he wants to kill me.) Even in sentient and social creatures like humans, it makes a lot of sense most of the time. If you see someone violently hitting someone else, it's reasonable to assume that he's a violent person. Cognitive biases aren't bad; they're sensible rules of thumb.

But like all cognitive biases, correspondent inference theory fails sometimes. And one place it fails pretty spectacularly is in our response to terrorism. Because terrorism often results in the horrific deaths of innocents, we mistakenly infer that the horrific deaths of innocents are the primary motivation of the terrorist, and not the means to a different end.

I found this interesting analysis in a paper by Max Abrahms in "International Security." "Why Terrorism Does Not Work" analyzes the political motivations of 28 terrorist groups: the complete list of "foreign terrorist organizations" designated by the U.S. Department of State since 2001. Abrahms lists 42 policy objectives of those groups, and finds that they achieved them only 7 percent of the time.

According to the data, terrorism is more likely to work if 1) the terrorists attack military targets more often than civilian ones, and 2) they have minimalist goals like evicting a foreign power from their country or winning control of a piece of territory, rather than maximalist objectives like establishing a new political system in the country or annihilating another nation. But even so, terrorism is a pretty ineffective means of influencing policy.

There's a lot to quibble about in Abrahms' methodology, but he seems to be erring on the side of crediting terrorist groups with success. (Hezbollah's objectives of expelling both peacekeepers and Israel from Lebanon count as successes, but so does the "limited success" of the Tamil Tigers in establishing a Tamil state.) Still, he provides good data to support what was until recently common knowledge: Terrorism doesn't work.

This is all interesting stuff, and I recommend that you read the paper for yourself. But to me, the most insightful part is when Abrahms uses correspondent inference theory to explain why terrorist groups that primarily attack civilians do not achieve their policy goals, even if they are minimalist. Abrahms writes:

"The theory posited here is that terrorist groups that target civilians are unable to coerce policy change because terrorism has an extremely high correspondence. Countries believe that their civilian populations are attacked not because the terrorist group is protesting unfavorable external conditions such as territorial occupation or poverty. Rather, target countries infer the short-term consequences of terrorism -- the deaths of innocent civilians, mass fear, loss of confidence in the government to offer protection, economic contraction, and the inevitable erosion of civil liberties -- (are) the objects of the terrorist groups. In short, target countries view the negative consequences of terrorist attacks on their societies and political systems as evidence that the terrorists want them destroyed. Target countries are understandably skeptical that making concessions will placate terrorist groups believed to be motivated by these maximalist objectives."

In other words, terrorism doesn't work, because it makes people less likely to acquiesce to the terrorists' demands, no matter how limited they might be. The reaction to terrorism has an effect completely opposite to what the terrorists want; people simply don't believe those limited demands are the actual demands.

This theory explains, with a clarity I have never seen before, why so many people make the bizarre claim that al Qaeda terrorism -- or Islamic terrorism in general -- is "different": that while other terrorist groups might have policy objectives, al Qaeda's primary motivation is to kill us all. This is something we have heard from President Bush again and again -- Abrahms has a page of examples in the paper -- and is a rhetorical staple in the debate.

In fact, Bin Laden's policy objectives have been surprisingly consistent. Abrahms lists four; here are six from former CIA analyst Michael Scheuer's book "Imperial Hubris":

* End U.S. support of Israel
* Force American troops out of the Middle East, particularly Saudi Arabia
* End the U.S. occupation of Afghanistan and (subsequently) Iraq
* End U.S. support of other countries' anti-Muslim policies
* End U.S. pressure on Arab oil companies to keep prices low
* End U.S. support for "illegitimate" (i.e. moderate) Arab governments, like Pakistan

Although Bin Laden has complained that Americans have completely misunderstood the reason behind the 9/11 attacks, correspondent inference theory postulates that he's not going to convince people. Terrorism, and 9/11 in particular, has such a high correspondence that people use the effects of the attacks to infer the terrorists' motives. In other words, since Bin Laden caused the death of a couple of thousand people in the 9/11 attacks, people assume that must have been his actual goal, and he's just giving lip service to what he *claims* are his goals. Even Bin Laden's actual objectives are ignored as people focus on the deaths, the destruction and the economic impact. Perversely, Bush's misinterpretation of terrorists' motives actually helps prevent them from achieving their goals.

None of this is meant to either excuse or justify terrorism. In fact, it does the exact opposite, by demonstrating why terrorism doesn't work as a tool of persuasion and policy change. But we're more effective at fighting terrorism if we understand that it is a means to an end and not an end in itself; it requires us to understand the true motivations of the terrorists and not just their particular tactics. And the more our own cognitive biases cloud that understanding, the more we mischaracterize the threat and make bad security trade-offs.

http://www.mitpressjournals.org/doi/pdf/10.1162/isec.2006.31.2.42
http://en.wikipedia.org/wiki/Correspondent_inference_theory

Cognitive biases:
http://www.healthbolt.net/2007/02/14/26-reasons-what-you-think-is-right-is-w... or http://tinyurl.com/2oo5nk

This essay originally appeared on Wired.com:
http://www.wired.com/politics/security/commentary/securitymatters/2007/07/se... or http://tinyurl.com/3y322f

** *** ***** ******* *********** *************

TSA and the Sippy Cup Incident

This story is pretty disgusting:

"I demanded to speak to a TSA [Transportation Security Administration] supervisor who asked me if the water in the sippy cup was 'nursery water or other bottled water.' I explained that the sippy cup water was filtered tap water. The sippy cup was seized as my son was pointing and crying for his cup. I asked if I could drink the water to get the cup back, and was advised that I would have to leave security and come back through with an empty cup in order to retain the cup. As I was escorted out of security by TSA and a police officer, I unscrewed the cup to drink the water, which accidentally spilled because I was so upset with the situation.

"At this point, I was detained against my will by the police officer and threatened to be arrested for endangering other passengers with the spilled 3 to 4 ounces of water. I was ordered to clean the water, so I got on my hands and knees while my son sat in his stroller with no shoes on since they were also screened and I had no time to put them back on his feet. I asked to call back my fiance, who I could still see from afar, waiting for us to clear security, to watch my son while I was being detained, and the officer threatened to arrest me if I moved. So I yelled past security to get the attention of my fiance.

"I was ordered to apologize for the spilled water, and again threatened arrest.
I was threatened several times with arrest while detained, and while three other police officers were called to the scene of the mother with the 19-month-old. A total of four police officers and three TSA officers reported to the scene where I was being held against my will. I was also told that I should not disrespect the officer and could be arrested for this too. I apologized to the officer and she continued to detain me despite me telling her that I would miss my flight. The officer advised me that I should have thought about this before I 'intentionally spilled the water!'"

This story portrays the TSA as jack-booted thugs. The story hit the Internet in mid-June, and quickly made the rounds. I saw it on BoingBoing. But, as it turns out, it's not entirely true. The TSA has a webpage up, with both the incident report and video.

"TSO [REDACTED] took the female to the exit lane with the stroller and her bag. When she got past the exit lane podium she opened the child's drink container and held her arm out and poured the contents (approx. 6 to 8 ounces) on the floor. MWAA Officer [REDACTED] was manning the exit lane at the time and observed the entire scene and approached the female passenger after observing this and stopped her when she tried to re-enter the sterile area after trying to come back through after spilling the fluids on the floor. The female passenger flashed her badge and credentials and told the MWAA officer 'Do you know who I am?' An argument then ensued between the officer and the passenger of whether the spilling of the fluid was intentional or accidental. Officer [REDACTED] asked the passenger to clean up the spill and she did."

Watch the second video. TSO [REDACTED] is partially blocking the scene, but at 2:01:00 PM it's pretty clear that Monica Emmerson -- that's the female passenger -- spills the liquid on the floor on purpose, as a deliberate act of defiance. What happens next is more complicated; you can watch it for yourself, or you can read BoingBoing's somewhat sarcastic summary.

In this instance, the TSA is clearly in the right. But there's a larger lesson here. Remember the Princeton professor who was put on the watch list for criticizing Bush? That was also untrue. Why is it that we all -- myself included -- believe these stories? Why are we so quick to assume that the TSA is a bunch of jack-booted thugs, officious and arbitrary and drunk with power?

It's because everything seems so arbitrary, because there's no accountability or transparency in the DHS. Rules and regulations change all the time, without any explanation or justification. Of course this kind of thing induces paranoia. It's the sort of thing you read about in history books about East Germany and other police states. It's not what we expect out of 21st century America.

The problem is larger than the TSA, but the TSA is the part of "homeland security" that the public comes into contact with most often -- at least the part of the public that writes about these things most. They're the public face of the problem, so of course they're going to get the lion's share of the finger pointing.

It was smart public relations on the TSA's part to get the video of the incident on the Internet quickly, but it would be even smarter for the government to restore basic constitutional liberties to our nation's counterterrorism policy. Accountability and transparency are basic building blocks of any democracy; and the more we lose sight of them, the more we lose our way as a nation.
The story:
http://www.nowpublic.com/nightmare_at_reagan_national_airport_a_security_sto... or http://tinyurl.com/2vgvcm
http://www.boingboing.net/2007/06/14/tsa_detains_woman_ov.html

The TSA's rebuttal:
http://www.tsa.gov/approach/mythbusters/dca_incident.shtm
http://www.boingboing.net/2007/06/15/tsa_denies_sippy_cup.html

Princeton professor:
http://rawstory.com/news/2007/Professor_who_criticized_Bush_added_to_0409.ht... or http://tinyurl.com/yo7ljc
http://blog.wired.com/27bstroke6/2007/04/debunking_the_p.html

** *** ***** ******* *********** *************

News

Remote sensing of meth labs, another NSF grant:
http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0712406

Ridiculous "age verification" for online movie trailers: "It seems like 'We want to protect children' really means, We want to give the appearance that we've made an effort to protect children. If they really wanted to protect children, they wouldn't use the honor system as the sole safeguard standing between previews filled with sex and violence and Internet-savvy kids who can, in a matter of seconds, beat the impotent little system."
http://blogs.csoonline.com/dirty_trailers_cheap_tricks

Direct marketing meets wholesale surveillance: a $100K National Science Foundation grant:
http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0712287

In 1748, the painter William Hogarth was arrested as a spy for sketching fortifications at Calais.
http://en.wikipedia.org/wiki/The_Gate_of_Calais
Sounds familiar, doesn't it?
http://www.schneier.com/blog/archives/2005/07/security_risks_3.html
http://www.schneier.com/blog/archives/2007/04/how_australian.html
http://www.flickr.com/groups/strobist/discuss/72157600359124224/

Fogshield: silly home security.
http://hardwareaisle.thisoldhouse.com/2007/06/lets_smoke_em_o.html
http://www.schneier.com/blog/archives/2007/06/silly_home_secu.html

Someone claims to have hacked the Bloomsbury Publishing network, and has posted what he says is the ending to the last Harry Potter book. I don't believe it, actually. Sure, it's possible -- probably even easy. But the posting just doesn't read right to me. And I would expect someone who really got their hands on a copy of the manuscript to post the choice bits of text, not just a plot summary. It's easier, and it's more proof.
http://seclists.org/fulldisclosure/2007/Jun/0380.html

The French government wants to ban BlackBerry e-mail devices, because of worries about eavesdropping by U.S. intelligence.
http://www.ft.com/cms/s/dde45086-1e97-11dc-bc22-000b5df10621,_i_rssPage=61e2... or http://tinyurl.com/yvka3p

Vulnerabilities in the DHS network:
http://blog.wired.com/27bstroke6/2007/06/dhs-security-ch.html

TSA uses Monte Carlo simulations to weigh airplane risks:
http://www.gcn.com/print/26_13/44398-1.html
Good comments in the blog post:
http://www.schneier.com/blog/archives/2007/06/tsa_uses_monte.html

The Onion on terrorist cell apathy:
http://www.theonion.com/content/news/after_5_years_in_u_s_terrorist

"Cocktail condoms" are protective covers that go over your drink and "protect" against someone trying to slip a Mickey Finn (or whatever they're called these days). I'm sure there are many ways to defeat this security device if you're so inclined: a syringe, affixing a new cover after you tamper with the drink, and so on. And this is exactly the sort of rare risk we're likely to overreact to. But to me, the most interesting aspect of this story is the agenda. If these things become common, it won't be because of security.
It will be because of advertising.
http://abcnews.go.com/US/story?id=3302652&page=1&CMP=OTC-RSSFeeds0312

Does this cell phone stalking story seem real to anyone?
http://www.thenewstribune.com/front/topphoto/story/91460.html
http://consumerist.com/consumer/privacy/family-stalked-using-cellphone-snoop... or http://tinyurl.com/2kklxb
There's something going on here, but I just don't believe it's entirely cell phone hacking. Something else is going on.

Really good "Washington Post" article on secrecy:
http://www.washingtonpost.com/wp-dyn/content/article/2007/06/08/AR2007060802... or http://tinyurl.com/yv7bjd
Back in 2002 I wrote about the relationship between secrecy and security.
http://www.schneier.com/crypto-gram-0205.html#1

Surveillance cameras that obscure faces, an interesting privacy-enhancing technology.
http://www.technologyreview.com/Infotech/18617/

At the beach, sand is more deadly than sharks. And this is important enough to become someone's crusade?
http://abcnews.go.com/US/wireStory?id=3299749

Essay: "The only thing we have to fear is the 'culture of fear' itself," by Frank Furedi.
http://www.frankfuredi.com/pdf/fearessay-20070404.pdf

Making invisible ink printer cartridges: a covert channel.
http://gizmodo.com/gadgets/clips/how-to-make-glow+in+the+dark-printer-ink-26... or http://tinyurl.com/yoszvc

Bioterrorism detection systems and false alarms:
http://www.google.com/search?q=cache:sfmQXOplWaUJ:www.the-scientist.com/arti... or http://tinyurl.com/2tjmhy

Robotic guns:
http://defensenews.com/story.php?F=2803275&C=america

Airport security: Israel vs. the United States:
http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2007/06/17/TRGRJQF1DE1.DTL or http://tinyurl.com/yqdt6f

Why an ATM PIN has four digits:
http://news.bbc.co.uk/2/hi/business/6230194.stm

Security cartoon: it's always a trade-off:
http://www.gocomics.com/nonsequitur/2007/06/24

Look at the last line of this article, about an Ohio town considering mandatory school uniforms in lower grades: "For Edgewood, the primary motivation for adopting uniforms would be to enhance school security, York said." What is he talking about? Does he think that school uniforms enhance security because it would be easier to spot non-uniform-wearing non-students in the school building and on the grounds? (Of course, non-students with uniforms would have an easier time sneaking in.) Or something else? Or is security just an excuse for any random thing these days?
http://news.enquirer.com/apps/pbcs.dll/article?AID=/20070626/NEWS01/30626003... or http://tinyurl.com/2yr2z8 or http://tinyurl.com/253j8l

Good commentaries on the UK terrorist plots:
http://www.theregister.co.uk/2007/06/29/more_fear_biscuits_please/
http://www.theage.com.au/news/opinion/its-hard-to-prevent-the-hard-to-imagin... or http://tinyurl.com/2dvcyv
http://www.theregister.co.uk/2007/07/02/terror_idiocy_outbreak/
http://www.slate.com/id/2169614/nav/tap1/
http://www.atimes.com/atimes/Front_Page/IG03Aa01.html
http://www.theregister.co.uk/2007/07/04/ec_frattini_web_terror_dunce_cap/ or http://tinyurl.com/35ebmj

In the former East Germany, the Stasi kept samples of people's smells.
http://www.kirchersociety.org/blog/2007/04/05/smell-jars-of-the-stasi/

The Millwall brick: an improvised weapon made out of newspaper, favored by football (i.e., soccer) hooligans.
http://en.wikipedia.org/wiki/Millwall_brick

When coins are worth more as metal than as coins.
http://news.bbc.co.uk/2/hi/south_asia/6766563.stm

This guy has a bottle taken away from him, then picks it out of the trash and takes it on the plane anyway. I'm not sure whether this is more gutsy or stupid. If he had been caught, the TSA would have made his day pretty damn miserable. I'm not even sure bragging about it online is a good idea. Too many idiots in the FBI.
http://www.zug.com/gab/index.cgi?func=view_thread&head=1&thread_id=74827 or http://tinyurl.com/yuk2ky

I've written about this Greek wiretapping scandal before. A system to allow the police to eavesdrop on conversations was abused (surprise, surprise). There's a really good technical analysis in IEEE Spectrum this month.
http://www.spectrum.ieee.org/print/5280
Commentaries:
http://www.crypto.com/blog/hellenic_eavesdropping/
http://www.cs.columbia.edu/~smb/blog/2007-07/2007-07-06.html
http://mobile.nytimes.com/blogs/bits/212

Police don't overreact to strange object. What's sad is that it feels like an exception.
http://www.dallasnews.com/sharedcontent/dws/dn/latestnews/stories/071007dnme... or http://tinyurl.com/yrys8p

I'm sure glad the Australian Federal Police have their priorities straight: "Technology such as cloned part-robot humans used by organised crime gangs pose the greatest future challenge to police, along with online scamming, Australian Federal Police (AFP) Commissioner Mick Keelty says."
http://www.theage.com.au/news/national/top-cop-predicts-robot-crimewave/2007... or http://tinyurl.com/27y45n

Dan Solove comments on the recent ACLU vs. NSA decision regarding the NSA's illegal wiretapping activities.
http://www.concurringopinions.com/archives/2007/07/aclu_v_nsa.html
http://www.concurringopinions.com/archives/2007/07/aclu_v_nsa_and.html

Dan Solove on privacy and the "nothing to hide" argument:
http://ssrn.com/abstract=998565

Funny airport-security photo:
http://www.flickr.com/photos/9831094@N02/755509753/

** *** ***** ******* *********** *************

Ubiquity of Communication

In an essay, Randy Farmer, a pioneer of virtual online worlds, describes communication in something called Disney's ToonTown. Designers of online worlds for children wanted to severely restrict the communication that users could have with each other, lest somebody say something that's inappropriate for children to hear. Randy discusses various approaches to this problem that were tried over the years.

The ToonTown solution was to restrict users to something called "Speedchat," a menu of pre-constructed sentences, all innocuous. They also gave users the ability to conduct unrestricted conversations with each other, provided they both knew a secret code string. The designers presumed the code strings would be passed only to people a user knew in real life, perhaps on a school playground or among neighbors.

Users found ways to pass code strings to strangers anyway. Users invented several protocols, using gestures, canned sentences, or movement of objects in the game, as in the sketch below. Randy writes: "By hook, or by crook, customers will always find a way to connect with each other."

http://www.fudco.com/habitat/archives/000058.html
http://www.disneyonlineworlds.com/index.php/Becoming_Secret_Friends_with_som... or http://tinyurl.com/2gkdlx
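The point is easy to demonstrate concretely. A menu of canned sentences is still a communication channel: each sentence a player sends can stand in for a digit, so a short secret -- a code string, say -- can be spelled out one innocuous line at a time. What follows is a minimal, hypothetical sketch in Python; the menu text and function names are invented for illustration and have nothing to do with how ToonTown or Speedchat actually work.

  # Hypothetical sketch: a covert channel over a fixed menu of canned
  # sentences. Each sentence sent is one base-N digit, so any short
  # string can be smuggled through a Speedchat-style restricted chat.
  # The menu below is made up for this example.

  MENU = [
      "Hi!",
      "Want to play a game?",
      "Nice toon!",
      "Follow me!",
      "Good luck!",
      "See you later!",
  ]  # N = 6 possible choices per message

  def encode(secret):
      """Turn a secret string into a sequence of canned sentences."""
      n = len(MENU)
      number = int.from_bytes(secret.encode("utf-8"), "big")
      digits = []
      while number:
          number, d = divmod(number, n)
          digits.append(d)
      return [MENU[d] for d in reversed(digits)] or [MENU[0]]

  def decode(sentences):
      """Recover the secret string from the observed sentences."""
      n = len(MENU)
      number = 0
      for s in sentences:
          number = number * n + MENU.index(s)
      length = (number.bit_length() + 7) // 8
      return number.to_bytes(length, "big").decode("utf-8")

  if __name__ == "__main__":
      chatter = encode("cat42")   # the code string a player wants to pass
      print(chatter)              # looks like ordinary canned chatter
      print(decode(chatter))      # prints "cat42"

Filtering the vocabulary narrows the channel, but it can't close it: as long as users can make choices that other users can observe, they can encode whatever they like in those choices.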
** *** ***** ******* *********** *************

4th Amendment Rights Extended to E-Mail

This is a great piece of news in the U.S. For the first time, e-mail has been granted the same constitutional protections as telephone calls and personal papers: the police need a warrant to get at it.

Granted, it's only a circuit court decision -- from the Sixth U.S. Circuit Court of Appeals, in Ohio -- it's pretty narrowly defined based on the attributes of the e-mail system, and it has a good chance of being overturned by the Supreme Court...but it's still great news.

The way to think of the warrant system is as a security device. The police still have the ability to get access to e-mail in order to investigate a crime. But in order to prevent abuse, they have to convince a neutral third party -- a judge -- that accessing someone's e-mail is necessary to investigate that crime. That judge, at least in theory, protects our interests.

Clearly e-mail deserves the same protection as our other personal papers, but -- like phone calls -- it might take the courts decades to figure that out. But we'll get there eventually.

http://blog.wired.com/27bstroke6/2007/06/appeals_court_s.html
http://arstechnica.com/news.ars/post/20070619-appeals-court-feds-cant-seize-... or http://tinyurl.com/26maek
http://www.freedom-to-tinker.com/?p=1170
http://www.volokh.com/archives/archive_2007_06_17-2007_06_23.shtml#118220816... or http://tinyurl.com/yqb4uz
http://www.ca6.uscourts.gov/opinions.pdf/07a0225p-06.pdf

** *** ***** ******* *********** *************

Credit Card Gas Limits

Here's an interesting phenomenon: rising gas costs have pushed a lot of legitimate transactions up to the "anti-fraud" ceiling. Security is a trade-off, and now the ceiling is annoying more and more legitimate gas purchasers. But to me the real question is: does this ceiling have any actual security purpose?

In general, credit card fraudsters like making gas purchases because the system is automated: no signature is required, and there's no need to interact with any other person. In fact, buying gas is the most common way a fraudster tests that a recently stolen card is valid. The anti-fraud ceiling doesn't actually prevent any of this, but limits the amount of money at risk.

But so what? How many perps are actually trying to get more gas than is permitted? Are credit-card-stealing miscreants also swiping cars with enormous gas tanks, or merely filling up the passenger cars they regularly drive? I'd love to know how many times, prior to the run-up in gas prices, a triggered cutoff actually coincided with a subsequent report of a stolen card.

And what's the effect of a ceiling, apart from a gas shut-off? Surely the smart criminals know about smurfing, if they need more gas than the ceiling will allow.

The Visa spokesperson said, "We get more calls, questions, when gas prices increase." He/she didn't say: "We *make* more calls to see if fraud is occurring." So the only inquiries made may be in the cases where fraud isn't occurring.

http://www.sfgate.com/cgi-bin/article.cgi?f=/n/a/2007/06/15/financial/f11062... or http://tinyurl.com/ywfqdj
Smurfing:
http://en.wikipedia.org/wiki/Smurfing_%28crime%29

** *** ***** ******* *********** *************

Schneier/BT Counterpane News

Slate wrote an article on my movie-plot threat contest.
http://www.slate.com/id/2169232/

** *** ***** ******* *********** *************

Designing Voting Machines to Minimize Coercion

If someone wants to buy your vote, he'd like some proof that you've delivered the goods. Camera phones are one way for you to prove to your buyer that you voted the way he wants. Belgian voting machines have been designed to minimize that risk.

"Once you have confirmed your vote, the next screen doesn't display how you voted.
So if one is coerced and has to deliver proof, one just has to take a picture of the vote one was coerced into, and then back out from the screen and change one's vote. The only workaround I see is for the coercer to demand a video of the complete voting process, instead of a picture of the ballot."

The author is wrong that this is an advantage electronic ballots have over paper ballots. Paper voting systems can be designed with the same security features.

http://didierstevens.wordpress.com/2007/06/11/some-e-voting-observations/ or http://tinyurl.com/24k5l6

** *** ***** ******* *********** *************

Risks of Data Reuse

We learned the news in March: Contrary to decades of denials, the U.S. Census Bureau used individual records to round up Japanese-Americans during World War II.

The Census Bureau normally is prohibited by law from revealing data that could be linked to specific individuals; the law exists to encourage people to answer census questions accurately and without fear. And while the Second War Powers Act of 1942 temporarily suspended that protection in order to locate Japanese-Americans, the Census Bureau had maintained that it only provided general information about neighborhoods. New research proves they were lying.

The whole incident serves as a poignant illustration of one of the thorniest problems of the information age: data collected for one purpose and then used for another, or "data reuse."

When we think about our personal data, what bothers us most is generally not the initial collection and use, but the secondary uses. I personally appreciate it when Amazon.com suggests books that might interest me, based on books I have already bought. I like it that my airline knows what type of seat and meal I prefer, and my hotel chain keeps records of my room preferences. I don't mind that my automatic road-toll collection tag is tied to my credit card, and that I get billed automatically. I even like the detailed summary of my purchases that my credit card company sends me at the end of every year.

What I don't want, though, is any of these companies selling that data to brokers, or for law enforcement to be allowed to paw through those records without a warrant.

There are two bothersome issues about data reuse. First, we lose control of our data. In all of the examples above, there is an implied agreement between the data collector and me: It gets the data in order to provide me with some sort of service. Once the data collector sells it to a broker, though, it's out of my hands. It might show up on some telemarketer's screen, or in a detailed report to a potential employer, or as part of a data-mining system to evaluate my personal terrorism risk. It becomes part of my data shadow, which always follows me around but I can never see.

This, of course, affects our willingness to give up personal data in the first place. The reason U.S. census data was declared off-limits for other uses was to placate Americans' fears and assure them that they could answer questions truthfully. How accurate would you be in filling out your census forms if you knew the FBI would be mining the data, looking for terrorists? How would it affect your supermarket purchases if you knew people were examining them and making judgments about your lifestyle? I know many people who engage in data poisoning: deliberately lying on forms in order to propagate erroneous data.
I'm sure many of them would stop that practice if they could be sure that the data was only used for the purpose for which it was collected.

The second issue about data reuse is error rates. All data has errors, and different uses can tolerate different amounts of error. The sorts of marketing databases you can buy on the web, for example, are notoriously error-filled. That's OK; if the database of ultra-affluent Americans of a particular ethnicity you just bought has a 10 percent error rate, you can factor that cost into your marketing campaign. But that same database, with that same error rate, might be useless for law enforcement purposes.

Understanding error rates and how they propagate is vital when evaluating any system that reuses data, especially for law enforcement purposes. A few years ago, the Transportation Security Administration's follow-on watch list system, Secure Flight, was going to use commercial data to give people a terrorism risk score and determine how much they were going to be questioned or searched at the airport. People rightly rebelled against the thought of being judged in secret, but there was much less discussion about whether the commercial data from credit bureaus was accurate enough for this application.

An even more egregious example of error-rate problems occurred in 2000, when the Florida Division of Elections contracted with Database Technologies (since merged with ChoicePoint) to remove convicted felons from the voting rolls. The databases used were filled with errors and the matching procedures were sloppy, which resulted in thousands of disenfranchised voters -- mostly black -- and almost certainly changed a presidential election result.

Of course, there are beneficial uses of secondary data. Take, for example, personal medical data. It's personal and intimate, yet valuable to society in aggregate. Think of what we could do with a database of everyone's health information: massive studies examining the long-term effects of different drugs and treatment options, different environmental factors, different lifestyle choices. There's an enormous amount of important research potential hidden in that data, and it's worth figuring out how to get at it without compromising individual privacy.

This is largely a matter of legislation. Technology alone can never protect our rights. There are just too many reasons not to trust it, and too many ways to subvert it. Data privacy ultimately stems from our laws, and strong legal protections are fundamental to protecting our information against abuse. But at the same time, technology is still vital.

Both the Japanese internment and the Florida voting-roll purge demonstrate that laws can change -- and sometimes change quickly. We need to build systems with privacy-enhancing technologies that limit data collection wherever possible. Data that is never collected cannot be reused. Data that is collected anonymously, or deleted immediately after it is used, is much harder to reuse. It's easy to build systems that collect data on everything -- it's what computers naturally do -- but it's far better to take the time to understand what data is needed and why, and only collect that.

History will record what we, here in the early decades of the information age, did to foster freedom, liberty and democracy. Did we build information technologies that protected people's freedoms even during times when society tried to subvert them? Or did we build technologies that could easily be modified to watch and control?
It's bad civic hygiene to build an infrastructure that can be used to facilitate a police state.

Individual data and the Japanese internment:
http://www.sciam.com/article.cfm?articleID=A4F4DED6-E7F2-99DF-32E46B0AC1FDE0FE&sc=I100322 or http://tinyurl.com/33kcy3
http://www.usatoday.com/news/nation/2007-03-30-census-role_N.htm
http://www.homelandstupidity.us/2007/04/05/census-bureau-gave-up-wwii-intern... or http://tinyurl.com/2haky8
http://rawstory.com/news/afp/Census_identified_Japanese_American_03302007.ht... or http://tinyurl.com/2ctnl3

Marketing databases:
http://www.wholesalelists.net
http://www.usdatacorporation.com/pages/specialtylists.html

Secure Flight:
http://www.epic.org/privacy/airtravel/secureflight.html

Florida disenfranchisement in 2000:
http://www.thenation.com/doc/20010430/lantigua

This article originally appeared on Wired.com:
http://www.wired.com/politics/onlinerights/commentary/securitymatters/2007/0... or http://tinyurl.com/34mr2g

** *** ***** ******* *********** *************

Comments from Readers

There are hundreds of comments -- many of them interesting -- on these topics on my blog. Search for the story you want to comment on, and join in.
http://www.schneier.com/blog

** *** ***** ******* *********** *************

CRYPTO-GRAM is a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers "Beyond Fear," "Secrets and Lies," and "Applied Cryptography," and an inventor of the Blowfish and Twofish algorithms. He is founder and CTO of BT Counterpane, and is a member of the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.

BT Counterpane is the world's leading protector of networked information - the inventor of outsourced security monitoring and the foremost authority on effective mitigation of emerging IT threats. BT Counterpane protects networks for Fortune 1000 companies and governments world-wide. See <http://www.counterpane.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT or BT Counterpane.

Copyright (c) 2007 by Bruce Schneier.