On Fri, 2 Aug 2002, Wall, Kevin wrote:
> First off, let me say that in general, I am against almost everything
> that the DMCA stands for and am no fan of DRM either. But I do think that
> we will lose credibility if we can't substantiate our claims, and part of
> that means recognizing and acknowledging what appear to be legitimate
> claims from the TCPA side.
Please forgive the brevity of this sketch of what a better, longer
response from me would look like; I hope to include that longer response
in a formal submission to the Department of Commerce taskforce on DRM.
There is nothing to be said in favor of DRM. DRM is simply the name for
the system under which the Englobulators would have root on every home and
small business computer on Earth.
ad propaganda: If we admit the principle that it is reasonable to outlaw
the sale of computers to individuals and to outlaw the private use of
computers, we place ourselves in a false posture, and a strategically
weaker position. The present situation is not that of twenty-five years
ago when the VCR was coming to be used in private homes. The struggles of
those days were about trammels on limited purpose devices. DRM is not one
trammel on a limited device, nor is it even a set of trammels on several
different special purpose devices.
In the above paragraph I use the word "computer" to mean computers of the
sort we have today, that is, computers which have no wiretaps and no remote
control machinery in them.
ad my repeated rhetorical question "Claimed advantage to me here?": It was
an error of rhetoric to put these questions in my response to AARG!.
These questions require consideration of indirect effects, which may only
be roughly estimated, if we wish to be precise at the two nines level. But
in each case, when one runs down the game/rhetoric tree, one sees that
there is never any benefit to me in the claimed useful-to-all capabilities
of DRM. I will not be able to force my wiretaps and my remote controls on
RIAA-MPAA-AAP. As pointed out, section 4.12 of the Final Report of the
BPDG simply specifies that, when DRM is forced on the world, Englobulator
machines will have no TCPA/Palladium/wiretaps/remote-controls in them.
To deal with the tiny bit of truth in the claims of AARG! that some
capabilities of DRM might be beneficial to me: Yes, of course, there are
few things that have zero benefits. But this is hardly relevant. A more
relevant question here is: Can we get the benefits in a better way? And of
course, we can. For the purposes of this narrow and hypothetical
discussion, DRM might just be considered as a dongle forced on every home
computer in the world. The claims of benefit depend on this dongle being
usable by me to make sure that you do not do certain things with my
program/data when it is running on your computer, e.g., distribute the
movie I send you. Well, why must the dongle be on the whole computer
system? Why cannot it be simply a dongle that goes in a slot in a special
TV screen/speaker system? Now this is a "product"! Why, we'll sell 'em the
screens and we'll sell the dongles separately, etc. Of course, the
Englobulators have no interest in making and selling such dongles. Indeed,
were Philips to start making and selling such, somehow a legal cause of
action against Philips would be discovered and the suits would commence.
oo--JS.
>
> Having said that, let me plunge right in and proceed to make a complete
> fool of myself. Besides, so what if another hundred spambots harvest
> my e-mail address for breast enlargement ads (stupid spambots--you'd
> think they could at least use my name to determine my sex and send me
> the herbal Viagra ads instead. ;-)
>
> Note that I'm interpreting Jay's reiterated question of
> "Claimed advantage to me here?" in the more general sense of
> advantage to anyone rather than to Jay personally. Not knowing
> him, the latter would be a rather difficult assessment to make.
>
> So, on with it already. Open mouth, insert foot... (yumm..
> filet of sole)...
>
> Jay Sulzberger writes...
>
> > On Thu, 1 Aug 2002, AARG!Anonymous wrote:
> >
> > > Eric Murray writes:
> > > > TCPA (when it isn't turned off) WILL restrict the software that you
> > > > can run. Software that has an invalid or missing signature won't be
> > > > able to access "sensitive data"[1]. Meaning that unapproved software
> > > > won't work.
> > > >
> > > > [1] TCPAmain_20v1_1a.pdf, section 2.2
> > >
> > > We need to look at the text of this in more detail. This is from
> > > version 1.1b of the spec:
> > >
> > > : This section introduces the architectural aspects of a Trusted
> > > : Platform that enable the collection and reporting of integrity
> > > : metrics.
> > > :
> > > : Among other things, a Trusted Platform enables an entity to
> > > : determine the state of the software environment in that platform
> > > : and to SEAL data to a particular software environment in that
> > > : platform.
> >
> >
> > Claimed advantage to me here?
>
> If you produce copyrighted materials that you don't want others to
> illegally copy, it can protect your assets. Might also be useful in
> protecting state secrets, but general crypto is sufficient for
> that. (Don't need it at the hardware level unless you are worried
> that some TLA gov't agency is out to get you.)
>
> The advantage depends on whether one is a producer of goods, or merely
> a consumer. I shall not make a judgement call as to which is more
> important. Suffice it to say that both need each other.
>
> [more from TCPA spec]
> > > :
> > > : The entity deduces whether the state of the computing environment in
> > > : that platform is acceptable and performs some transaction with that
> > > : platform. If that transaction involves sensitive data that must be
> > > : stored on the platform, the entity can ensure that that data is held
> > > : in a confidential format unless the state of the computing environment
> > > : in that platform is acceptable to the entity.
> >
> > Claimed advantage to me here?
>
> One could use this to detect virus-infected systems, systems infected
> with root kits, etc., could one not? Also, the uses alluded to above
> come to mind.
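To make the detection idea concrete: the integrity metrics the spec describes are built by "extending" a running register with the hash of each component as it loads, so tampering anywhere in the boot chain changes the final value. A toy sketch (this is the chaining idea only, not the actual TPM command interface; the component names are made up for illustration):

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: new PCR = SHA-1(old PCR || measurement)."""
    return hashlib.sha1(pcr + measurement).digest()

# Measure a hypothetical boot chain, component by component.
pcr = b"\x00" * 20  # PCRs start zeroed at platform reset
for component in (b"bootloader-image", b"kernel-image", b"initrd-image"):
    pcr = extend(pcr, hashlib.sha1(component).digest())
clean_pcr = pcr  # the value a verifier expects from an uninfected system

# A rootkit that patches the kernel changes that component's hash, so
# the final register value no longer matches the expected one.
pcr2 = b"\x00" * 20
for component in (b"bootloader-image", b"patched-kernel-image", b"initrd-image"):
    pcr2 = extend(pcr2, hashlib.sha1(component).digest())

assert pcr2 != clean_pcr
```

Because each step hashes in the previous register value, a verifier holding one expected 20-byte value can detect a change to any component, in any order of the chain.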
>
> > > :
> > > : To enable this, a Trusted Platform provides information to enable
> > > : the entity to deduce the software environment in a Trusted Platform.
> > > : That information is reliably measured and reported to the entity.
> > > : At the same time, a Trusted Platform provides a means to encrypt
> > > : cryptographic keys and to state the software environment that must
> > > : be in place before the keys can be decrypted.
> > >
> > > What this means is that a remote system can query the local TPM and
> > > find out what software has been loaded, in order to decide whether to
> > > send it some data. It's not that unapproved software "won't work",
> > > it's that the remote guy can decide whether to trust it.
> >
> > Claimed advantage to me here?
>
> Well, here's one place that I can see a potential value to consumers.
> I've thought a lot about how one can secure peer-to-peer (P2P) systems.
>
> Sure, if I want to allow my box to be a P2P host, I can use a sandboxing
> technique to control and restrict (at least in theory) what rights I
> give other programs to run. [I'm thinking of a sense similar to the Java
> sandbox used for running applets.]
>
> However, the more interesting, and I believe more challenging, piece is
> what guarantees you can give *ME* as a user of P2P services if I design
> some important code that I wish to run on some generic P2P service.
> Unless I want to pay for specific P2P or grid computing services from
> some company that I might happen to trust, be it IBM, HP, or whomever,
> I'll probably use some (future?) P2P services that are open-source freeware
> that typical home users might host out of the generosity of their hearts
> (whereby they allow others to use some of their spare cycles). While this
> is all well and good, my level of trust would likely not be at the same
> level it would be if I paid a company to use their services. The feeling
> being that if I buy a service from a reputable company and they
> intentionally do something malicious such as stealing private data, I can
> haul their butts to court. No such luck when dealing with the faceless
> masses. Bottom line seems to be that you get what you pay for. In
> particular, I'd be afraid that a few rogues might intentionally try to
> screw up my calculations, giving me bad results, or run a debugger and
> examine my data while it is unencrypted for some short part of the
> calculation, etc. How do I prevent that? Well, I don't think that it can
> necessarily be PREVENTED, but all I really need to do is detect
> it...preferably beforehand.
>
> Thus it would seem that giving the ability of a remote system to
> query a particular system's local TPM to see whether it is "trustworthy"
> (by whatever criteria that *I* decide) is just what the doctor ordered
> in this case.
>
> Or am I missing something here? Without this, I don't see how I would
> ever trust the faceless-masses P2P network for anything remotely
> critical or sensitive to me.
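The decision described above, the remote party checking a TPM report before handing over work, can be sketched in a few lines. This is a stand-in model, not the TCPA protocol: an HMAC plays the role of the TPM's signature with its attestation key, and the trusted-value set stands in for whatever policy *I* decide on.

```python
import hashlib
import hmac

# Hypothetical stand-ins for illustration only.
AIK = b"per-platform-attestation-key"  # models the TPM's signing key
TRUSTED_PCRS = {hashlib.sha1(b"approved-p2p-host-stack").digest()}

def quote(pcr: bytes) -> bytes:
    """Model of a signed TPM quote over the platform's PCR value."""
    return hmac.new(AIK, pcr, hashlib.sha1).digest()

def verifier_accepts(reported_pcr: bytes, reported_quote: bytes) -> bool:
    # 1. Authenticity: did a genuine platform report this value?
    if not hmac.compare_digest(quote(reported_pcr), reported_quote):
        return False
    # 2. Policy: is that software environment one the verifier trusts?
    return reported_pcr in TRUSTED_PCRS

good = hashlib.sha1(b"approved-p2p-host-stack").digest()
bad = hashlib.sha1(b"host-stack-with-debugger-attached").digest()
assert verifier_accepts(good, quote(good))
assert not verifier_accepts(bad, quote(bad))
```

The point of the two-step check is exactly the one made above: the hardware only reports what is running; deciding whether that environment is acceptable remains the verifier's policy choice.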
>
> > > Also, as stated earlier, data can be sealed such that it can only be
> > > unsealed when the same environment is booted. This is the part above
> > > about encrypting cryptographic keys and making sure the right software
> > > environment is in place when they are decrypted.
> >
> > Claimed advantage to me here?
> >
>
> Your turn. My little fingers are getting weary. Someone else take it from
> here.
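Since the sealing step went unanswered, its mechanics are worth a toy sketch: sealing amounts to deriving the data-protection key from the measured environment, so that booting a different environment derives a different key and the data stays opaque. Real TPMs do this in hardware with proper ciphers; the XOR keystream below is purely illustrative.

```python
import hashlib

def _keystream(pcr: bytes, n: int) -> bytes:
    """Toy keystream derived from the environment measurement (illustration only)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(pcr + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def seal(data: bytes, pcr: bytes) -> bytes:
    """Encrypt data under a key derived from the current environment."""
    return bytes(a ^ b for a, b in zip(data, _keystream(pcr, len(data))))

def unseal(blob: bytes, current_pcr: bytes) -> bytes:
    """Only the environment present at seal time recovers the plaintext."""
    return bytes(a ^ b for a, b in zip(blob, _keystream(current_pcr, len(blob))))

env_a = hashlib.sha1(b"os-as-booted-when-sealing").digest()
env_b = hashlib.sha1(b"different-os").digest()

blob = seal(b"crypto key material", env_a)
assert unseal(blob, env_a) == b"crypto key material"
assert unseal(blob, env_b) != b"crypto key material"
```

Note that nothing stops the other OS from running; it simply cannot derive the right key, which is the "software runs but can't access the data" distinction argued over below.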
>
> > > > Ok, technically it will run but can't access the data,
> > > > but that is a very fine hair to split, and depending on the nature
> > > > of the data that it can't access, it may not be able to run in truth.
> > > >
> > > > If TCPA allows all software to run, it defeats its purpose.
> > > > Therefore Wagner's statement is logically correct.
> > >
> > > But no, the TCPA does allow all software to run. Just because a
> > > remote system can decide whether to send it some data doesn't mean
> > > that software can't run. And just because some data may be
> > > inaccessible because it was sealed when another OS was booted, also
> > > doesn't mean that software can't run.
> >
> > Claimed advantage to me here?
> >
>
> I think that we had better define our terms here. What does it mean
> for a program to "run"? I think most of us would hold that we mean
> that it executes in a way that provides the normal and generally
> expected functionality. Which would mean that if I put in my own
> copy of an audio CD that I burned as a backup copy, it should play
> the audio CD without any loss of quality rather than telling me that
> I have a pirated copy and that it's going to report me to the MPAA or RIAA.
>
> However, I'm not going into any advantages or disadvantages.
> For the most part, I agree with Ross and David, not because what
> they state is necessarily the intent of TCPA or Palladium
> today, but because I believe that in general both humans and
> therefore human corporations are in essence greedy and seedy
> (not necessarily in that order).
>
> Of course, I have to add that I speak for myself (most of the time;
> sometimes my lips just move but some other voices come out ;) rather
> than for my company. Etc.
>
> -kevin wall