This is a really subtle issue. Much has been written about how to optimize mixing pools. Six to twelve hours is a very long delay for many purposes, and if only a few people use delays that long, those long-delay messages may themselves become objects of particular interest to an observer.
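(For anyone who hasn't looked at one, here is a minimal Python sketch of a pool mix. It is illustrative only, not Mixmaster's or any other remailer's actual algorithm, and the constants are arbitrary: each round the node holds back a minimum pool and forwards only a random fraction of the surplus, which is why an individual message's latency is unpredictable and can stretch to many hours.)

    import random

    MIN_POOL = 10        # messages always held back (arbitrary value)
    SEND_FRACTION = 0.5  # fraction of the surplus sent each round (arbitrary)

    pool = []

    def receive(message):
        pool.append(message)

    def flush_round():
        """Run once per mixing interval; returns the messages to forward now."""
        surplus = max(len(pool) - MIN_POOL, 0)
        n_send = int(surplus * SEND_FRACTION)
        random.shuffle(pool)
        outgoing = pool[:n_send]
        del pool[:n_send]
        return outgoing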
It also seems like a bad idea to put the message-holding function at the sender's end. That makes it easier to identify who has been storing messages for later delivery.
This might be a very simple and interesting service to provide at the end of remailer chains. Exit remailers might have an additional command which would instruct them to hold the message for a given period or until a given time before final delivery.
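(To make the idea concrete, a rough Python sketch of how an exit remailer might honor such a command. The "Hold-For" and "Hold-Until" header names are hypothetical, not existing remailer syntax.)

    import time
    import email

    def delivery_time(raw_message):
        """Return the Unix time at which this message should be delivered."""
        msg = email.message_from_string(raw_message)
        now = time.time()
        if "Hold-For" in msg:            # hypothetical header: delay in seconds
            return now + float(msg["Hold-For"])
        if "Hold-Until" in msg:          # hypothetical header: absolute Unix timestamp
            return float(msg["Hold-Until"])
        return now                       # no hold requested: deliver immediately

    def flush_queue(queue, deliver):
        """queue is a list of (deliver_at, raw_message) pairs; run periodically."""
        now = time.time()
        due = [m for t, m in queue if t <= now]
        queue[:] = [(t, m) for t, m in queue if t > now]
        for m in due:
            deliver(m)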
With Mixmaster I spent a lot of time thinking about message size. If you can recognize a message from its size as it enters and leaves a node, then all the delay and mixing is effectively thwarted.
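(The standard countermeasure is to split and pad everything to a uniform packet size, so length reveals nothing. A rough Python sketch follows; the packet size here is arbitrary and this is not Mixmaster's real packet format, which among other things records the true length inside the encrypted payload so the padding can later be stripped.)

    import os

    PACKET_SIZE = 10240  # arbitrary; not Mixmaster's actual packet size

    def to_fixed_packets(payload):
        """Split a message (bytes) into uniform-length packets, padding the last
        with random bytes, so every packet entering or leaving a node looks alike."""
        packets = []
        for i in range(0, len(payload), PACKET_SIZE):
            chunk = payload[i:i + PACKET_SIZE]
            if len(chunk) < PACKET_SIZE:
                chunk += os.urandom(PACKET_SIZE - len(chunk))
            packets.append(chunk)
        return packets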
-Lance
Possibly; I will defer to those more technically learned.
I'm not a nym server expert, but from my layman's perspective the Pynchon Gate design looks good. It might be totally redundant and unnecessary, but if metadata analysis is the concern, wouldn't such a setup be even more secure if replies were sent at a random time within a given window, unknown to the metadata leeches? The time between sending a message and receiving a reply could, in theory, leak information about the nym holder. E.g. within 6-12 hours of the moment I click "send", or within, say, 12-20 days, etc.
The email message could be coded to send at a random moment within a given time window, like a roulette ball landing, versus, say, polling every 24 hours. This would in theory give out misleading "sent" timestamps. Or would this be unnecessary because traffic from the user to the email distributors is already controlled by the user, who queries at intervals anyway? Is that not metadata that can be tracked?
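(As an illustration only, a Python sketch of the randomized send time described above. The 6-12 hour window is just the example figure from earlier, and none of this is real nym-client code.)

    import random
    import time

    def random_send_time(min_hours=6, max_hours=12):
        """Pick a uniformly random release time inside the window, so the visible
        'sent' timestamp says little about when the user actually hit send."""
        delay = random.uniform(min_hours * 3600, max_hours * 3600)
        return time.time() + delay

    def queue_message(message, outbox):
        outbox.append((random_send_time(), message))

    def flush(outbox, send):
        """Run periodically; release only messages whose random time has passed."""
        now = time.time()
        ready = [m for t, m in outbox if t <= now]
        outbox[:] = [(t, m) for t, m in outbox if t > now]
        for m in ready:
            send(m)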
- J