NYTimes oped: Federal laws better than censorware
Check out the last paragraph of Andrew's op-ed: it's Larry Lessig's argument, though conveniently unattributed. It's also a dangerous one, and a favorite of leftists, claiming that "accountable" government regulations are somehow better than "unaccountable" private classification schemes.

Of course Andrew neglects to say that the CDA was not just civil regulation like many FCC rules: it, and its successor, are criminal laws with serious jail time and up to quarter-million-dollar fines if you violate them. He also neglects to say that the reason PICS was created is pressure from the Feds. Try as hard as they may, not even RSACi can throw you in jail -- unless Congress passes Murray's bill, which means it's no longer private-sector action.

If the market VOLUNTARILY comes up with a rating system, I don't know how you can say that's worse than government censorship and possible jail time. (Read Solveig's op-ed on this, which I posted earlier this week.) Arguing that private selection is "worse" than government censorship is simply incoherent. If government is coercing industry to adopt a scheme, which is the direction we're heading now, then it's time to make the argument that the government pressure amounts to state action, and then eventually challenge it in court.

I'm not sure whether Andrew is going there or not, but some leftists (or "liberals," if you like) oppose rating systems and censorware because they think children have a general right to access information, even if their parents buy the computers. I think this is another incoherent argument that we should be careful not to buy into. It may be a bad idea for parents to install such programs -- as it may be a bad idea to feed Junior poptarts instead of bran cereal -- but children do not have a Constitutional right to have censorware-free computers.

-Declan
Hold your horses, folks.

At 11:02 AM 12/4/97 -0500, Declan McCullagh wrote:
Check out the last paragraph of Andrew's op-ed: it's Larry Lessig's argument, though conveniently unattributed.
For space reasons, the Times cut my attribution to Larry (as he knows and will tell you). The piece was 700 words at 7:00 pm yesterday, 425 words at 7:15. Believe me, I was sorry not to be able to credit my friend, colleague, and former teacher. And incidentally, it was the third-to-last graph, not the last (is this kind of looseness with the facts a coincidence?).
Of course Andrew neglects to say that the CDA was not just civil regulation like many FCC rules: it, and its successor, are criminal laws
Hello? In graph 2: "Communications Decency Act, the law *criminalizing* on-line indecency..."
He also neglects to say that the reason PICS was created is pressure from the Feds.
You're wrong or overstating the case. PICS began as an effort -- rightly enough -- to *respond to* and/or *stave off* laws like the CDA. But did the Feds "pressure" anyone to come up with PICS? No. I just double-checked with someone linked to PICS's founding, who told me: "Nobody in the federal government ever came to the W3C and told them to create PICS." And even if the Feds had "pressured" someone to do so, that wouldn't in any way justify its speech-inhibiting design features. Now, as to whether politicians are pressuring industry to *use* PICS and other total filtering schemes, that's another question.
Try as hard as they may, not even RSACi can throw you in jail
That's irrelevant, Declan. Day-to-day, speech can be inhibited as much by technology as by law. In fact, you're the one who's shown us that so well with your countless posts about the dangers of censorware.
If the market VOLUNTARILY comes up with a rating system, I don't know how you can say that's worse than government censorship and possible jail time.
That's not what I said. I'm not in favor of censorship and I oppose any attempt to *criminalize* 'indecent' speech. But criminalize does not = regulate.
children do not have a Constitutional right to have censorware-free computers.
Really! I seem to recall *you* making the argument that kids have First Amendment rights to access any information, particularly in public facilities like libraries. The 17 1/2-year-old college freshman, perhaps? Did you change your mind? -- Andrew
==============
Opinion: The Danger of Private Cybercops
By ANDREW L. SHAPIRO
At a conference this week on protecting children from the perils of the Internet, consensus emerged on a strategy to keep minors away from cyberporn: let the private sector handle it. Rather than relying on Government regulation, Vice President Al Gore said, parents should look to industry for tools that will let them filter Internet content.
Civil libertarians are largely responsible for the success of this approach. Indeed, they convinced the Supreme Court that it would do less harm to free speech than the Communications Decency Act, the law criminalizing on-line indecency, which the Court struck down in June.
Yet those advocates may now regret what they wished for, because some of their schemes seem to imperil free speech more than the act did.
For example, software that users install to block out certain Internet content often excludes material that isn't indecent. One such program, Cybersitter, prevents users from visiting the site of the National Organization for Women. And the makers of these programs often won't even tell adults what sites have been blacklisted.
Still worse is a protocol known as PICS that changes the Internet's architecture to make it easy to rate and filter content. PICS is theoretically neutral because it allows different groups to apply their own labels, but could hurt the Internet's diversity by requiring everything to be rated. Small, unrated sites would be lost.
Moreover, these technologies enable what might be called total filtering, where objectionable speech of any type can be screened out effortlessly. Benign as this may seem, such filtering might be used not just by individuals but by employers, Internet service providers and foreign governments seeking to restrict information that others receive.
The ground rules for an open society could also be undermined. When total filtering meets information overload, individuals can (and will) screen out undesired interactions, including those crucial to a vibrant political culture -- the on-line equivalents of a civil rights protest or a petition for a reform candidate. In such a filtered society, civil discourse and common understanding will suffer.
This should lead us to think long and hard about the way that technology can be an even more cunning censor than law. That's not to say that Government solutions are problem-free or desirable. But at least when the state goes overboard, speech defenders have the safety valve of a First Amendment lawsuit. This legal recourse is not an option when politicians simply persuade industry and consumers to use speech-inhibiting tools. Who knows, free-speech advocates may find themselves nostalgic for public regulation after all.
Andrew L. Shapiro is a fellow at Harvard Law School's Center for the Internet and Society and at the Twentieth Century Fund.
I think the answer to your question is that most of us find the very idea of filterware distasteful (let alone the generally poor quality of the implementations). However, that's much the same as I find american cars distasteful. If somebody else wants to blow their money on one of those pieces of crap, by all means let them. (I admit, they're slowly getting better.) This leads us to poke fun at the current systems and argue against people actually using them, much the same way a christian friend of mine keeps trying to get me to accept jesus into my heart and love him so that I won't go to hell.

On the other hand, we get really pissed off when somebody tries to force us and our kids to use this crap. It doesn't matter if it is AOL or the FBI: outside coercion is outside coercion.

At 04:48 PM 12/7/97 -0800, Vladimir Z. Nuri wrote:
I still don't understand why it is "censorship" when any company can come up with any software that rates sites according to any scheme, and anyone can choose to use any package, or ignore the software altogether. there is total freedom in all of this.
Declan, why is it that you are now editorializing against an editorial that asks for government standards & laws instead of free market ones? are you starting to finally figure out that private enterprise filtering systems, while having huge aspects that are not all that pleasant, are superior to the alternative? (btw, I don't like the claims of the editorial either, but that has always been my position on this issue-- that private enterprise systems are superior to government censorship)
I agree that PICS was introduced in part to try to come up with a solution to the problem of offensive content that could be presented as an alternative to any government involvement. people on the net want to solve their own problems on the net, without laws, in general.
everyone who continues to rant against filtering companies strikes me as people who are screaming sour grapes. "we don't like the choices these companies have made!!" but just start your own!! the market is deciding what filtering company is doing the best job, mostly regardless of your ranting. and surprise!! guess what!! the market may not actually decide that it even cares whether filtering products are up front about informing what sites they filter.
what, it takes a lot of work to filter sites? well, you're damn right-- doing anything of value requires a lot of work, and the filtering companies are working hard to improve their technology, no thanks to the screechings of a lot of people who feel that they have some better way of judging filtering software than the parents who use it.
the net will continue to support schemes that help separate, segregate, and rate content, and those who reject such ideas as "censorship" are going to be seen as increasingly out-of-touch and clueless about how the technology works.
does anyone claim it is censorship because a service interested in rating "cool sites" does not rate many sites it thinks are not cool? why then is there so much controversy when a *service* designed to rate *sites acceptable to children* does not include certain sites? can anyone tell me the difference? answer: many people wish to be the judge of what children can and cannot see. but ultimately, does anyone other than a parent have the authority to do this? in a free society, which I think we still live in, that is?
if you think you are a better judge of what children should see, create your own service that includes whatever you think is being excluded. the market may support you. or, the market may thumb its nose at you.
(however, postscript to all of the above, I do agree that any government laws making filtering software in some way mandatory is bogus and abhorrent.)
-Colin
At 4:48 PM -0800 12/7/97, Vladimir Z. Nuri wrote:
I still don't understand why it is "censorship" when any company can come up with any software that rates sites according to any scheme, and anyone can choose to use any package, or ignore the software altogether. there is total freedom in all of this.
If that were all that were going on, no one would argue with it. The problem arises when government bodies require use of a particular filter (as in libraries), or require publishers to attach derogatory labels to the information they publish (as in too many trial balloons for me to list). These are abuses of freedom, and if you pay attention, you'll see that such applications of filter and labeling programs receive the lion's share of vocal opposition.

There are a couple of subsidiary problems. One is that current filterware is, not to put too fine a point on it, terrible to the point of constituting fraud on the consumer; another is the issue of minors' rights to access information they need, and the limits that parents and schools must observe in restricting those rights while still fulfilling their responsibilities. Current filterware would still be criticized -- rightly so -- even if it weren't being used for censorship. But the fact is that it is being so used. Someday, perhaps the threat of censorship-via-filters will go away and we can spend less time discussing them.

--
Morning people may be respected, but night people are feared.
At 10:37 PM -0500 12/7/97, Mike Godwin wrote:
At 12:00 PM -0500 12/4/97, Andrew Shapiro wrote:
That's not what I said. I'm not in favor of censorship and I oppose any attempt to *criminalize* 'indecent' speech. But criminalize does not = regulate.
Perhaps it is a flaw in my legal education, but I was always taught that criminal laws were a form of regulation.
What I meant to say, Mike, was that not all regulation, obviously, is criminal. A.
At 11:02 -0500 12/4/97, Declan McCullagh wrote:
Check out the last paragraph of Andrew's op-ed: it's Larry Lessig's argument, though conveniently unattributed.
Let me retract this particular statement. I'm told that Lessig was properly cited then edited out late last night. -Declan
At 12:00 PM -0500 12/4/97, Andrew Shapiro wrote:
You're wrong or overstating the case. PICS began as an effort -- rightly enough -- to *respond to* and/or *stave off* laws like the CDA. But did the Feds "pressure" anyone to come up with PICS? No. I just double-checked with someone linked to PICS's founding, who told me: "Nobody in the federal government ever came to the W3C and told them to create PICS."
I believe the first version of the Exon Amendment was introduced in late summer of 1994. I believe PICS postdates this, but I am not certain.
And even if the Feds had "pressured" someone to do so, that wouldn't in any way justify its speech-inhibiting design features.
Does PICS inhibit speech in e-mail or in Usenet newsgroups or in FTP sites? Isn't PICS just the Web? Conversely, doesn't any version of the CDA inhibit speech in, e.g., e-mail, Usenet newsgroups, and FTP sites? I cannot conceive of any version of the CDA that does not restrict speech more broadly, and have a greater chilling effect, than PICS. Note that this is not a defense of PICS.
That's irrelevant, Declan. Day-to-day, speech can be inhibited as much by technology as by law.
I don't believe PICS, whatever its other flaws, poses any threat of putting speakers of unapproved speech in jail.
That's not what I said. I'm not in favor of censorship and I oppose any attempt to *criminalize* 'indecent' speech. But criminalize does not = regulate.
Perhaps it is a flaw in my legal education, but I was always taught that criminal laws were a form of regulation.
children do not have a Constitutional right to have censorware-free computers.
Really! I seem to recall *you* making the argument that kids have first amendment rights to access any information, particularly in public facilities like libraries.
I cannot speak to what Declan's argument actually was, but I note that these two positions are not, in fact, logically inconsistent.

---Mike

----------------------------------------------------------------------------
We shot a law in _Reno_, just to watch it die.

Mike Godwin, EFF Staff Counsel, is currently on leave from EFF, participating as a Research Fellow at the Freedom Forum Media Studies Center in New York City. He can be contacted at 212-317-6552.
----------------------------------------------------------------------------
participants (6)
- Andrew Shapiro
- Colin A. Reed
- Declan McCullagh
- Jeanne A. E. DeVoto
- Mike Godwin
- Vladimir Z. Nuri