USG pulls 'sensitive' info off net
Must've never heard of caching..

http://www.latimes.com/news/nationworld/nation/la-100301safe.story

Several federal agencies have removed sensitive documents and reports from their Internet sites following the Sept. 11 terrorist attacks, saying they want to keep the information out of the wrong hands.

The Department of Transportation has removed its national mapping system for a variety of pipelines. The Department of Health and Human Services yanked a report on the dangers of chemical plant terrorism. The Environmental Protection Agency pulled information on risk-management programs, which inform communities of dangers from 15,000 chemical plants and other industrial facilities nationwide.

The widespread editing illustrates how swiftly federal agencies have switched gears following the attacks. Although community activists have lobbied for years for more access to records about nuclear plants and other facilities, agencies now fear that such access may put the public at risk.

"Recent events have focused additional security concerns on critical infrastructure systems," said a note posted online by the Office of Pipeline Safety within the Transportation Department. "At this time, [the office] is providing pipeline data to pipeline operators and local, state and federal government officials only."

White House officials say they have not issued a blanket order to federal agencies to remove sensitive documents from government Web sites. "We would only hear about these things if we were asked to advise on them," said E. Floyd Kvamme, co-chairman of the President's Committee of Advisors on Science and Technology.

However, a spokesman for the Nuclear Regulatory Commission said his agency is working closely with the White House and Department of Defense to assure its Web site does not disclose potentially dangerous information.

"We have been reviewing all the information on the Web site with an eye to removing anything that might be helpful to potential terrorists," said NRC spokesman Breck Henderson. For instance, if the site contained the exact geographic coordinates of a nuclear plant, that information would be removed, he said.

"If we're a little overzealous in removing things, if there's something on there you really want, give us a Freedom of Information Act request," Henderson said.

EPA emergency coordinator Jim Makris said he personally made the decision to remove--at least temporarily--information about risk-management plans submitted by industrial facilities as required by 1990 amendments to the Clean Air Act.

The Risk Management Program Web site gave detailed information about 15,000 facilities, including executive summaries, emergency plans, accident histories and chemicals used on site. That data had been on the Internet since late 1999.

"We just wanted to get it out of the way," Makris said. "We have made no decision that it will stay off." The information is still available to emergency managers, firefighters and others who need it, he said.

Web security experts say the steps taken by the agencies are only "half measures," because the material could have been previously downloaded and saved on users' hard drives. In addition, some of the reports are still available in paper form, said Elias Levy, chief technology officer for SecurityFocus, a security information company in San Mateo.

"If someone really wants to get it badly, as they're assuming possibly a terrorist would, they still would be able to get it," Levy said. "You simply have to jump through a lot of hoops. What they're going to end up doing is discouraging the public from obtaining the information, not necessarily discouraging the terrorists from doing so."

The Government Printing Office, which prints most government documents and runs a chain of stores, has not been asked to pull any books or reports, deputy general counsel Drew Spalding said.

But Transportation Department spokesman Lenny Alcivar said reviews similar to the one being conducted by his agency are taking place throughout the federal government.

"This is not meant to restrict information on the part of the public, but more importantly to allow the department and the public the maximum protections against security threats as it can," Alcivar said. "It's important that government, across the board, do all that it can to heighten safety measures while at the same time continue to be as open and responsive to the public as possible."
Inevitable next step: Enterprising cypherpunk registers censoredfedinfo.org, hunts through google's cache, posts everything there, etc.

-Declan

On Wed, Oct 03, 2001 at 06:38:05AM -0700, Khoder bin Hakkin wrote:
Must've never heard of caching..
http://www.latimes.com/news/nationworld/nation/la-100301safe.story
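The cache hunt Declan describes is mechanical enough to script. A minimal sketch in Python, assuming Google's "cache:" query operator (a Google convention that could change or disappear) and a hypothetical target path; the cached copy still has to be scraped out of whatever HTML Google returns:

    import urllib.parse
    import urllib.request

    def fetch_google_cache(url):
        """Ask Google for its cached copy of `url` via the cache: operator."""
        query = urllib.parse.quote("cache:" + url)
        req = urllib.request.Request(
            "http://www.google.com/search?q=" + query,
            # Google tends to reject the default Python User-Agent string.
            headers={"User-Agent": "Mozilla/5.0"},
        )
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode("utf-8", errors="replace")

    # Hypothetical example; the EPA path here is illustrative only.
    page = fetch_google_cache("www.epa.gov/ceppo/")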
Khoder bin Hakkin wrote:
Must've never heard of caching..
http://www.latimes.com/news/nationworld/nation/la-100301safe.story
I was looking up the way cool VASIMR plasma rocket, which would make 90-104 day transfers to Mars possible, but spaceflight.nasa.gov is not reachable. Must be a DNS problem.

This degree of seemingly idiotic paranoia is not healthy. It doesn't inspire confidence, either. The paternalistic (or is it maternalistic?) policy of keeping nobody but the American public in the dark doesn't help; it makes matters worse. And it makes people less apt to pay attention to voices of reason, or to know where to watch for suspicious activity. So? If that's the whole point, fuck that. If not, well, get a fucking grip, America.

jbdigriz
on Wed, Oct 03, 2001 at 11:00:04AM -0400, Declan McCullagh (declan@well.com) wrote:
On Wed, Oct 03, 2001 at 06:38:05AM -0700, Khoder bin Hakkin wrote:
Must've never heard of caching..
http://www.latimes.com/news/nationworld/nation/la-100301safe.story
Inevitable next step: Enterprising cypherpunk registers censoredfedinfo.org, hunts through google's cache, posts everything there, etc.
Note that there are a relatively small number of Googles on the Net.

This point was brought forth pointedly at John Wharton's ee380 seminar at Stanford last spring, during Ronda Hauben's talk "Usenet and the Usenet Archives: The Challenges of Building a Collaborative Technical Community":

http://www.stanford.edu/class/ee380/Abstracts/010523.html

Specifically, the Usenet community became so used to the Deja News archives that many independent archives of specific newsgroups or collections of groups were deactivated, relying instead on Deja.

While comprehensive archives are useful, *single* comprehensive archives present a point of failure and control. The Net would be advised to develop multiple Google alternatives. And I say this as quite the fan of Google....

Peace.

--
Karsten M. Self <kmself@ix.netcom.com>    http://kmself.home.netcom.com/
What part of "Gestalt" don't you understand?         Home of the brave
http://gestalt-system.sourceforge.net/                 Land of the free
Free Dmitry! Boycott Adobe! Repeal the DMCA!  http://www.freesklyarov.org
Geek for Hire                  http://kmself.home.netcom.com/resume.html
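To make the single-point-of-failure argument concrete, here is a sketch of a client that treats no one archive as authoritative: keep a list of independent mirrors and fall down the list when one dies. The mirror URLs are invented placeholders, not real services:

    import urllib.error
    import urllib.request

    # Invented placeholder archives; none of these hosts exist.
    MIRRORS = [
        "http://archive-one.example.org/fetch?url=",
        "http://archive-two.example.net/fetch?url=",
        "http://archive-three.example.com/fetch?url=",
    ]

    def fetch_with_fallback(url, timeout=10):
        """Try each archive in turn, so one dead archive is a non-event."""
        for mirror in MIRRORS:
            try:
                with urllib.request.urlopen(mirror + url, timeout=timeout) as resp:
                    return resp.read()
            except (urllib.error.URLError, OSError):
                continue  # this archive is down or blocking; try the next
        raise RuntimeError("all archives failed for " + url)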
At 12:29 PM 10/3/2001 -0700, Karsten M. Self wrote:
on Wed, Oct 03, 2001 at 11:00:04AM -0400, Declan McCullagh (declan@well.com) wrote:
On Wed, Oct 03, 2001 at 06:38:05AM -0700, Khoder bin Hakkin wrote:
Must've never heard of caching..
http://www.latimes.com/news/nationworld/nation/la-100301safe.story
Inevitable next step: Enterprising cypherpunk registers censoredfedinfo.org, hunts through google's cache, posts everything there, etc.
Note that there are a relatively small number of Googles on the Net.
The trouble with Google and most other spiders is that they cannot access the databases behind the sites. Various industry estimates place the amount of data not accessible to crawlers at up to 500x the HTML content. What's needed are open-access data mining sites using more sophisticated crawlers, like http://telegraph.cs.berkeley.edu/

steve
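The blind spot is structural rather than a matter of crawl depth: a spider only discovers pages reachable by following href links from its seeds, so database-backed pages that exist only as responses to form queries are never enumerated. A toy breadth-first crawler shows why (no robots.txt or rate limiting; a sketch, not a production spider):

    import re
    import urllib.request
    from urllib.parse import urljoin

    def crawl(seed, limit=50):
        """Breadth-first crawl that follows only href links from `seed`."""
        seen, queue = set(), [seed]
        while queue and len(seen) < limit:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    html = resp.read().decode("utf-8", errors="replace")
            except Exception:
                continue  # unreachable or non-text; skip it
            # Only anchor hrefs are queued. <form action=...> targets are
            # ignored, so content gated behind a query form stays invisible.
            for href in re.findall(r'href="([^"#]+)"', html):
                queue.append(urljoin(url, href))
        return seen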
Steve Schear wrote:
... What's needed are open access data mining sites using more sophisticated crawlers like http://telegraph.cs.berkeley.edu/
What, you mean more sites that require Javascript to be turned on, and show a blank page if it isn't? Thanks, got plenty of them already.

SRF  <--- who is even more annoyed at sites that require IE extensions

--
Steve Furlong    Computer Condottiere    Have GNU, Will Travel    617-670-3793

"Good people do not need laws to tell them to act responsibly while bad people will find a way around the laws." -- Plato
On Wed, Oct 03, 2001 at 12:29:09PM -0700, Karsten M. Self wrote:
While comprehensive archives are useful, *single* comprehensive archives present a point of failure and control. The Net would be advised to develop multiple Google alternatives. And I say this as quite the fan of Google....
It might be a nice idea, but it's expensive to implement and keep in operation. This will happen if there's market demand, and there does not appear to be any.

-Declan
Nincompoops! Remove all information of potential value to terrorists and you remove... all information. Time to flood the Feds with FOIA requests.

Marc de Piolenc

Khoder bin Hakkin wrote:
Must've never heard of caching..
http://www.latimes.com/news/nationworld/nation/la-100301safe.story
Several federal agencies have removed sensitive documents and reports from their Internet sites following the Sept. 11 terrorist attacks, saying they want to keep the information out of the wrong hands.
Great! Who'll do it?

Marc de Piolenc

Declan McCullagh wrote:
Inevitable next step: Enterprising cypherpunk registers censoredfedinfo.org, hunts through google's cache, posts everything there, etc.
--
Remember September 11, 2001, but don't forget July 4, 1776.

Rather than make war on the American people and their liberties, ...Congress should be looking for ways to empower them to protect themselves when warranted.

"They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety." - Benjamin Franklin
Must've never heard of caching.. http://www.latimes.com/news/nationworld/nation/la-100301safe.story
Inevitable next step: Enterprising cypherpunk registers censoredfedinfo.org, hunts through google's cache, posts everything there, etc.
Note that there are a relatively small number of Googles on the Net.
The trouble with Google and most other spiders is that they cannot access the DBs behind the sites. Various industry estimates place the amount of data not accessible to crawlers at up to 500x the html content. What's needed are open access data mining sites using more sophisticated crawlers like http://telegraph.cs.berkeley.edu/
More to the point, most spiders don't cache images, only text, so much of the interesting content isn't cached. I'm not sure how many of them cache PDFs; some of those are searchable and indexable, while some are just bitmaps. On the other hand, the Feds generally don't have as much fancy-graphics-design-for-inaccessibility, so more of their text may be cacheable than at typical business sites.

Shutting down web sites with data that terrorists could use has been going on for a few years - apparently many of the haz-mat sites are no longer accessible to the public, including one of the Bay Area sites that shut down a few weeks before we had a major refinery fire.

Yes, there are potential threats to public safety if terrorists can use this data, but there are more serious threats if the public can't use it to determine what's near them, and far more serious threats if fire departments can't access the data conveniently.
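On the PDF point: one quick way to tell a searchable PDF from a scanned-bitmap one is simply to try extracting text from it. A sketch using the third-party pypdf package; the 100-character threshold is an arbitrary guess, not a standard:

    from pypdf import PdfReader  # third-party: pip install pypdf

    def has_extractable_text(path, min_chars=100):
        """True if the PDF yields real text; scanned bitmaps yield almost none."""
        reader = PdfReader(path)
        total = sum(len((page.extract_text() or "").strip())
                    for page in reader.pages)
        return total >= min_chars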
participants (8)
- Bill Stewart
- Declan McCullagh
- F. Marc de Piolenc
- James B. DiGriz
- Karsten M. Self
- Khoder bin Hakkin
- Steve Furlong
- Steve Schear