Ross's TCPA paper

Seth David Schoen schoen at eff.org
Fri Jul 5 03:52:52 PDT 2002


Hadmut Danisch writes:

> You won't be able to enter a simple shell script through the
> keyboard. If so, you could simply print protected files as
> a hexdump or use the screen (or maybe the sound device or any
> LED) as a serial interface.
> 
> Since you could use the keyboard to enter a non-certified
> program, the keyboard is to be considered as a nontrusted
> device. This means that you either
> 
> * have to use a certified keyboard which doesn't let 
>   you enter bad programs
> 
> * don't have a keyboard at all
> 
> * or are not able to use shell scripts (at least not in
>   trusted context). This means a 
>   strict separation between certified software and data.

The latter is closest to what's intended in Palladium.  Individual
programs using Palladium features are able to prevent one another from
reading their executing or stored state.  You can write your own
programs, but somebody else can also write programs which can process
data in a way that your programs can't interact with.

The Palladium security model and features are different from Unix, but
you can imagine by rough analogy a Unix implementation on a system
with protected memory.  Every process can have its own virtual memory
space, read and write files, interact with the user, etc.  But
normally a program can't read another program's memory without the
other program's permission.

The analogy starts to break down, though: in Unix a process running as
the superuser or code running in kernel mode may be able to ignore
memory protection and monitor or control an arbitrary process.  In
Palladium, if a system is started in a trusted mode, not even the OS
kernel will have access to all system resources.  That limitation
doesn't stop you from writing your own application software or scripts.

Interestingly, Palladium and TCPA both allow you to modify any part of
the software installed on your system (though not your hardware).  The
worst thing which can happen to you as a result is that the system
will know that it is no longer "trusted", or will otherwise be able to
recognize or take account of the changes you made.  In principle,
there's nothing wrong with running "untrusted", but particular
applications or services that rely on a trusted feature, including
sealed storage (see below), may fail to operate.
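How the system "knows" it is no longer trusted can be sketched with a
TCPA-style measurement register: each boot component is hashed into the
register in order, so any change anywhere in the chain yields a different
final value.  This is my own toy illustration of the idea, not the actual
TCPA or Palladium code; the component names are hypothetical stand-ins.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TCPA-style "extend": the register irreversibly absorbs each
    # measurement, so the final value summarizes the whole boot chain.
    return hashlib.sha1(pcr + measurement).digest()

def measure_chain(components):
    pcr = b"\x00" * 20  # measurement registers start zeroed at reset
    for c in components:
        pcr = extend(pcr, hashlib.sha1(c).digest())
    return pcr

# Hypothetical boot chain: firmware, bootloader, kernel.
original = measure_chain([b"firmware-v1", b"bootloader-v1", b"kernel-v1"])

# Swap in a modified kernel and re-measure: the final value differs,
# so the platform can recognize that the software was changed.
modified = measure_chain([b"firmware-v1", b"bootloader-v1", b"kernel-v2"])
assert modified != original
```

Nothing here stops the modified system from booting; it simply can't
produce the original measurement value.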

Palladium and TCPA both allow an application to make use of
hardware-based encryption and decryption in a scheme called "sealed
storage" which uses a hash of the running system's software as part of
the key.  One result of this is that, if you change relevant parts of
the software, the hardware will no longer be able to perform the
decryption step.  To oversimplify slightly, you could imagine that the
hardware uses the currently-running OS kernel's hash as part of this
key.  Then, if you change the kernel in any way (which you're
permitted to do), applications running under it will find that they're
no longer able to decrypt "sealed" files which were created under the
original kernel.  Rebooting with the original kernel will restore the
ability to decrypt, because the hash will again match the original
kernel's hash.
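To make that dependence concrete, here is a toy Python sketch of the
oversimplified model above (a key derived from the kernel's hash plus a
hardware-held secret).  It is my own illustration under those stated
assumptions, not the real TCPA/Palladium mechanism, and the cipher is a
throwaway keystream for demonstration only.

```python
import hashlib

def derive_key(kernel_image: bytes, chip_secret: bytes) -> bytes:
    # The sealing key depends on a hash of the running software,
    # as in sealed storage.
    return hashlib.sha256(
        chip_secret + hashlib.sha256(kernel_image).digest()
    ).digest()

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher (hash-based keystream); encryption and
    # decryption are the same operation.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

chip_secret = b"hardware-held secret"   # hypothetical stand-in
kernel_v1 = b"kernel image v1"
sealed = xor_cipher(b"confidential data", derive_key(kernel_v1, chip_secret))

# Same kernel: the derived key matches, so unsealing succeeds.
assert xor_cipher(sealed, derive_key(kernel_v1, chip_secret)) == b"confidential data"

# Changed kernel: the derived key differs, so unsealing fails.
assert xor_cipher(sealed, derive_key(b"kernel image v2", chip_secret)) != b"confidential data"
```

Booting the original kernel again restores the original hash, and with it
the ability to unseal.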

(I've been reading TCPA specs and recently met with some Microsoft
Palladium team members.  But I'm still learning about both systems and
may well have made some mistakes in my description.)

-- 
Seth Schoen
Staff Technologist                                schoen at eff.org
Electronic Frontier Foundation                    http://www.eff.org/
454 Shotwell Street, San Francisco, CA  94110     1 415 436 9333 x107




