I'll admit that I took the comment on complexity out of context, but the larger context actually reinforces my point. The more constraints you put on a system, the less likely you are to satisfy all of them, so there is inherent complexity in the sense that the odds of finding an effective solution are significantly reduced. This is why we need to understand a fundamentally different dynamic, one based on discovery or opportunity: you have a solution and then find out what problems it solves. Once you look around, you'll find that this is the norm, not the exception.

We use USB for power because creating a new standard with new requirements would've been too difficult. We may lament Facebook's security model, but we use it because we find value in it and deal with the trust problems, even if we don't deal with them very well. We can't really articulate what we mean by "trust" anyway, and certainly not as a spec.

Perhaps the most Internet-relevant example is defining "quality" outside the network rather than in the network. Instead of having to have a perfect network, we get the kind of quality we want by choosing our own policies, such as "better never than late" or vice versa. This is why the Internet is so many orders of magnitude less expensive than the traditional phone networks. In fact, I argue that tolerance is the key to Moore's Law, as I wrote in http://rmf.vc/BeyondLimits.
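
To make the "better never than late" point concrete, here is a small sketch of my own (purely illustrative, with hypothetical names and numbers) of how the endpoint, not the network, decides what "quality" means. A real-time receiver drops any packet that misses its playout deadline, while a file-transfer receiver accepts the same packet no matter how late it arrives:

    import time

    PLAYOUT_DEADLINE = 0.150  # hypothetical: 150 ms of acceptable delay for live audio

    def realtime_policy(packet, sent_at, now):
        # "Better never than late": a packet that misses its playout
        # deadline is useless for a live call, so drop it and move on.
        if now - sent_at > PLAYOUT_DEADLINE:
            return None       # dropped; the conversation continues
        return packet         # in time; play it

    def file_transfer_policy(packet, sent_at, now):
        # "Better late than never": for a file, every byte matters,
        # so accept the packet however late it shows up.
        return packet

    now = time.time()
    print(realtime_policy(b"audio frame", now - 0.5, now))      # None: too late, dropped
    print(file_transfer_policy(b"file chunk", now - 0.5, now))  # still delivered

The same imperfect network serves both applications; the tolerance, and therefore the "quality", lives entirely at the edges.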