"network shock-testing"-resistant data serving
Answers anyone? I know wikileaks has a bad name with some, but let's set that aside, since I think it's an easy concept for lay people to understand, and a good frame for discussing this particular network challenge.

Hypothetical:

- I have a large and juicy leaked data set - say a particularly gruesome mil video, highly indicting of the establishment.
- I can upload it piecemeal, from varied and "random" locations, pre-scouted to be sure there are no surveillance cameras monitoring my piecemeal uploads.
- But ultimately the result exists on some server, which must serve up this juicy media-frenzy-fueling firetruck for the govt.
- So downloaders can be "network shock tested" by interrupting the stream (with, say, a forged onion site), and voila - the end user for that particular stream is identified.
- Similarly, with say 100 'simultaneous' downloads from the original/ non-forged onion site, I (a global network viewer) can stop all downloads at once and watch which physical host shows a drop in outgoing load.

So these are two problems - network shock-testing the end-user downloader, and network shock-testing the server uploader (no point having a juicy leak vid if you don't serve it up to people for viewing).

The only potential/ partial solutions I've heard so far:

- pervasive fill traffic
- hybrid/ alternate physical network

Anyone got any other ideas and/ or elaboration on these potential partial solutions?
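To make the second attack concrete, here is a toy simulation of my own (not from any deployed system - all the numbers and function names are made up for illustration): a global observer records each host's outgoing rate before and after halting all downloads, and accuses the host with the largest drop. Constant-rate fill traffic defeats this in the sketch because the padded host's observed load is the same whether or not a real transfer is in progress.

```python
# Toy model of the "network shock test" correlation attack, and why
# constant-rate fill traffic (padding) frustrates it. All rates are
# arbitrary units; this is an illustration, not a real measurement.

import random

def host_rate(is_server, downloads_active, fill_traffic):
    """Observed outgoing rate for one host in one measurement interval."""
    base = random.gauss(1000, 50)   # unrelated background traffic + noise
    if fill_traffic:
        # Padding to a constant rate: observed load is identical
        # whether or not real downloads are in progress.
        return base + 5000
    if is_server and downloads_active:
        return base + 5000          # real transfer load, visible on the wire
    return base

def shock_test(num_hosts, server_idx, fill_traffic):
    """Return the host a global observer would accuse, or None if ambiguous."""
    before = [host_rate(i == server_idx, True, fill_traffic)
              for i in range(num_hosts)]
    # Observer simultaneously interrupts all downloads, then re-measures.
    after = [host_rate(i == server_idx, False, fill_traffic)
             for i in range(num_hosts)]
    drops = [b - a for b, a in zip(before, after)]
    suspect = max(range(num_hosts), key=lambda i: drops[i])
    # Only accuse if the drop clearly exceeds background noise.
    return suspect if drops[suspect] > 1000 else None

random.seed(1)
print(shock_test(100, server_idx=42, fill_traffic=False))  # identifies host 42
print(shock_test(100, server_idx=42, fill_traffic=True))   # None: drop is hidden
```

Of course the sketch glosses over the hard part of "pervasive fill traffic": the padding must be indistinguishable from real transfers and maintained at all times, which is exactly where the bandwidth cost comes from.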
participants (1)
-
Zenaan Harkness