NoScript is 10 years old!

dan at geer.org
Sat May 16 07:05:55 PDT 2015


..snip..

> The problem, ultimately, is features. And it will always be features.

That is correct as an observation.  I don't like it, but the
world does not care whether I like it or not; I am 1/7000000000
regardless of my skill, taste, or persuasiveness.  And so it has
always been (cf. "bread and circuses").

However, what I object to is the tendency of features to destroy
functionality by way of collateral damage, viz., for platforms to
be constructed to deliver features and only to deliver features.
That is what "freedom to tinker" fears.  That is what risk is
all about, risk being solely a consequence of that upon which
you depend.  That is why I've all but stopped buying new things
(computers, cars, appliances, etc.) -- their orientation around
features reduces my ability to configure, to repair, nay even to
understand what is going on inside, to say nothing of it being
legally questionable whether I even own them despite having paid
my money for them.[*]  (Even were I willing to run JavaScript,
my old computers can no longer handle its burgeoning demands --
JavaScript has clearly become the technologic embodiment of
"When rape is inevitable, relax and enjoy it.")

Big data, especially of the so-called deep learning kind, is
of a parallel sort.  Where data science spreads, a massive
increase in tailorability to conditions follows.  Even if
Moore's Law remains forever valid, there will never be enough
computing; hence data-driven algorithms must favor efficiency
above all else.  Yet the more efficient the algorithm, the less
interrogatable it is, which is to say that the more optimized
the algorithm is, the harder it is to know what it is really
doing.

The more desirable some particular automation is judged to be,
the more data (which is to say foodstuffs) it is given.  The
more data it is given, the more its data utilization efficiency
matters.  The more its data utilization efficiency matters, the
more its algorithms will evolve to opaque operation.  Above some
threshold of dependence on such an algorithm in practice, there
can be no going back.  As such, if data science wishes to be
genuinely useful, preserving algorithm interrogatability despite
efficiency-seeking, self-driven evolution is the research-grade
problem now on the table.  If science does not pick this up,
then Lessig's characterization of code as law is fulfilled.

In short, features drive.  They drive because of democratic
principles evidenced by immensely rapid uptake.  They rely upon a
user base that is forever "barefoot and pregnant."  And it is
increasingly difficult to opt out of features without opting out
of society altogether.  As there is zero difference between
"personalization" and "targeting" beyond the intent of the
algorithm, those who don't accept features will be adjudged
anomalous, and we already treat anomaly detection as the
sine qua non of public safety.
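
One more toy sketch, same caveat (illustrative only, every name
hypothetical), to show how small that difference is:
"personalization" and "targeting" can be the very same ranking
code, distinguished by nothing but the caller's intent.

# Illustrative sketch only; all names are hypothetical.
def rank(profile, items):
    """Score items against a user profile; identical math
    whichever way the result is used."""
    def fit(item):
        return sum(profile.get(k, 0) * v
                   for k, v in item["features"].items())
    return sorted(items, key=fit, reverse=True)

profile = {"privacy": 0.9, "crypto": 0.8}
items = [
    {"name": "mixnet article",
     "features": {"privacy": 1.0, "crypto": 0.5}},
    {"name": "sports scores",
     "features": {"sports": 1.0}},
]

personalized_feed = rank(profile, items)  # benign intent: show what fits
targeting_list    = rank(profile, items)  # same output, used to single the user out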

--dan

[*] http://www.wired.com/2015/04/dmca-ownership-john-deere/



