At 8:37 AM 1/9/96, Lucky Green wrote:
>Very true. But why does it always seem to take an exploitable crack
>before companies pay attention to security flaws? Is it because they are
>unable to admit that they have made a mistake? Everybody makes mistakes.
>What's the big deal? I really don't understand it. Any psychologists on
>this list?
I'm not a psychologist, though I doubt that would help. (Having had a girlfriend who was one, I can report she had no special knowledge about corporate motivations...)

Companies are pyramids, with a flood of signals flowing up and down the pyramid. Few of the signals are truly important; most are just noise. Hence the difficulty corporations have in responding to crises.

When a serious problem is confirmed--a building collapses, a floating point bug is found in a chip, a random number generator is found to be flawed, etc.--then there is little doubt that a real problem exists, or at least that a public relations problem must be dealt with. Therefore, a flurry of corporate activity ensues: task forces are created, press releases issued, etc.

I'm neither surprised nor disheartened by this. It often takes hitting a company over the head with a two-by-four "to get their attention." (I saw this many times at Intel, and they were ahead of most of their rivals in spotting problems early on. The "Pentium debacle" is a perfect example of what Lucky is decrying, as internal memos on the problem had been basically pooh-poohed and ignored, until a major public relations disaster hit.)

And this has always been a major role of extra-corporate agents: safety inspectors, insurance companies, independent testing laboratories, and so on. The in-house testing departments are frequently inclined to overstate concerns (known universally as "CYA," for "cover your ass"), so it is not surprising that their concerns are often treated as non-urgent--until a crisis happens, and then they are lambasted for not having spoken up more loudly and more forcefully. This was true in ancient Sumeria, in the early factories in Europe, on the communes in China, and in the high-tech labs of today.

An easily understandable mixture of psychology, systems analysis, group dynamics, economics, and evolutionary game theory.
The Cypherpunks group is, to some extent, helping in this process by trying to break or cripple new software. (As several of us have noted, the NSA's second official role, that of securing commercial cryptography, COMSEC, seems to have been ignored. We are thus left to fill in for these slackers.)

--Tim May

We got computers, we're tapping phone lines, we know that that ain't allowed.
---------:---------:---------:---------:---------:---------:---------:----
Timothy C. May              | Crypto Anarchy: encryption, digital money,
tcmay@got.net  408-728-0152 | anonymous networks, digital pseudonyms, zero
W.A.S.T.E.: Corralitos, CA  | knowledge, reputations, information markets,
Higher Power: 2^756839 - 1  | black markets, collapse of governments.
"National borders aren't even speed bumps on the information superhighway."