Tim May writes:
Draw this graph I outlined. Think about where the markets are for tools for privacy and untraceability. Realize that many of the "far out" sweet spot applications are not necessarily immoral: think of freedom fighters in communist-controlled regimes, think of distribution of birth control information in Islamic countries, think of Jews hiding their assets in Swiss bank accounts, think of revolutionaries overthrowing bad governments, think of people avoiding unfair or confiscatory taxes, think of people selling their expertise when some guild says they are forbidden to.
It is good to see some frank discussion of morality here. Too often cypherpunks seem to assume that anything that can be done, should be done. However, on closer examination it's not clear that many of the examples above satisfy both financial and moral constraints.

"Freedom fighters in communist-controlled regimes." How much money do they have? More importantly, how much are they willing and able to spend on anonymity/privacy/black-market technologies? These guys aren't rolling in dough.

"Revolutionaries overthrowing bad governments." The main revolutionaries who will be willing to pay money are those who expect to get rich from their revolution. These are the ones who want to throw out the tyrants so they can set themselves up as new tyrants. It is people like this who would be the best customers of cypherpunk technology. You're not making the world a better place by giving them tools.

"Distribution of birth control information in Islamic countries." Again, selling to Planned Parenthood is not a business plan which will make anyone rich.

"Jews hiding their assets in Swiss bank accounts." Financial privacy is in fact potentially big business, but let's face it: most of the customers today are not Jews fearing confiscation by anti-Semitic governments. That's not in the cards. Most of the money will be tainted, and even if it is largely drug money and you don't think drugs should be illegal, much drug money is dirty even by libertarian standards. It is used for bribes, for coercion, even for murder. Facilitating such activities does not help to make drugs legal; it just gives murdering drug lords more wealth and power and provides justification for increased military funding to fight the drug war.

"People avoiding unfair or confiscatory taxes." This is a good one: lots of customers, plenty of money, few moral problems.
Even if you support some government programs, it will take a long time before enough people adopt privacy protection tools for them to have a significant impact on government tax revenues. The big problem here is coming up with a technology that can do the job.

"People selling their expertise when some guild says they are forbidden to." Morally this one seems OK. In a net already filled with bogus medical and legal advice, it can't make things much worse. On the other hand, it's not clear that the existing prohibitions are hurting anyone's bottom line. How much can you really expect to make by selling forbidden advice? It's not clear that there is much of a market for this technology, though possibly someone could find a killer app here.

The conclusion is that you need to add a third axis to Tim's graph: morality, in addition to value and cost. Many of the most lucrative potential uses of anonymity technologies are morally questionable. If you add this additional filter, you are forced to focus on just a few application areas (with the additional complication that few people will agree on morality, and that morality and legality often have little overlap).