CIA Fears UmpTeen InfoNukes

Rick Smith smith at sctc.com
Fri Jun 28 18:56:56 PDT 1996


frantz at netcom.com (Bill Frantz) writes about the miserable state of
computer security. I'll comment on some of his statements, which I
hope I haven't taken too far out of order or context:

>I think that backward compatibility requirements are a significant part of
>the reason we see this problem.  The other part is, of course, that there
>is no market for security.

The phrase "backward compatibility" is, in my experience, a code
phrase for peoples' annoying habit of wanting to stay with things
they've finally made useful as opposed to having them replaced by
something "better" that is more expensive, less convenient, and in
general less familiar. The components of the latest release of MS
Office come to mind as a good example, and they didn't even include
public key crypto to atone for it.

I agree that it's seductive for us security weenies to think the tail
wags the dog, but let's remember what's really supposed to be happening.

The requirement isn't "backwards compatibility"; the requirement is
that people get their work done. If the security threat keeps them
from getting their work done, then backwards compatibility is no
longer a major requirement.

>The problem, in general, isn't the system administrators.  If management
>gave the same priority to security that it does to joining new users or
>installing new hardware, sysadmins would have the time to install the
>patches.  Most sysadmins are up to their asses in alligators.  Security is
>something to put off.  If the managers were judged on the security of their
>systems (perhaps via independent audit), then they might give the problem
>some priority.

The problem (or at least the difference) is in the priorities
established by an organization's culture. Some would rather take the
risks and do things in a fairly open, if unpredictable, environment.
Some prefer and even thrive on predictability. Either approach can and
does produce valuable results. However, few people want to use a bank
that takes the "open, if unpredictable" approach. Banks have auditors.

>The ideal situation for them would be to use public key authentication
>because it would be entirely user-transparent. ...

Nonsense. The mere fact that it's not currently deployed guarantees
that it won't be user transparent. Vendors will include it in some
rewrite of whatever software it's embedded in. Memory requirements go
up, and the crypto computations introduce delays. Security will be
added only if it gives customers more things they can do, so there'll
be other functional changes as well.

In any case, working crypto *can't* be entirely user transparent.
People need to handle keys, choose the one to use, and update them
occasionally. There is a lot of training and cultural awareness
involved here that just doesn't exist yet.
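To make the point concrete, here's a minimal sketch (mine, not
anything from Bill's note or any particular product) of the chores a
user can't escape: generating a key pair, protecting the private key
with a passphrase they have to remember, and picking which stored key
to use. It assumes the third-party Python "cryptography" package; the
key names and the KEY_DIR location are made up for illustration.

    # A minimal sketch, assuming the Python "cryptography" package;
    # KEY_DIR and the key names are made-up illustrations.
    from pathlib import Path
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    KEY_DIR = Path("keys")        # hypothetical directory for stored keys
    KEY_DIR.mkdir(exist_ok=True)

    def generate_keypair(name: str, passphrase: bytes) -> None:
        # Step 1: the user creates a key and must pick (and remember) a passphrase.
        private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        pem = private_key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.PKCS8,
            encryption_algorithm=serialization.BestAvailableEncryption(passphrase),
        )
        (KEY_DIR / (name + ".pem")).write_bytes(pem)

    def choose_key(name: str, passphrase: bytes):
        # Step 2: the user decides which of possibly several keys to use,
        # and must supply the passphrase again to unlock it.
        pem = (KEY_DIR / (name + ".pem")).read_bytes()
        return serialization.load_pem_private_key(pem, password=passphrase)

    # Step 3 (key update) is the same dance repeated: generate a new key,
    # hand out the new public half, retire the old one.
    generate_keypair("work", b"something the user has to remember")
    key = choose_key("work", b"something the user has to remember")

None of those steps is invisible. Each one is a decision or a secret
that lives in the user's head, which is exactly the point.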

And there will be *billions* in fraud before people finally learn,
then maybe it'll attenuate to mere millions (and I'm probably still
optimistic by orders of magnitude).  Look at credit and ATM cards. A
dozen years ago a bank issued us some ATM cards, and the clerk
insisted on writing the PIN code ON the cards. Very few banks do that
any more.

Rick.
smith at sctc.com            secure computing corporation