Power in the Age of the Feudal Internet - Bruce Schneier

Rayzer rayzer@riseup.net
Thu Jul 21 20:22:36 PDT 2016


I was probably wrong in my statement that he doesn't have the technical
capability to examine code the way ioerror does.

"He holds an MS degree in Computer Science from American University and
a BS degree in Physics from the University of Rochester."

Undated discussion paper from "MIND #6: Internet and Security":
http://en.collaboratory.de/w/Power_in_the_Age_of_the_Feudal_Internet

Power in the Age of the Feudal Internet

Bruce Schneier, Cryptographer and Computer Security Specialist and
Author of Liars and Outliers: Enabling the Trust That Society Needs to Thrive

We’re in the middle of an epic battle for power in cyberspace. On one
side are the nimble, unorganized, distributed powers such as dissident
groups, criminals, and hackers. On the other side are the traditional,
organized, institutional powers such as governments and large
multinational corporations. During its early days, the Internet gave
coordination and efficiency to the powerless. It made them powerful and
seemingly unbeatable. But now the more traditional institutional powers
are winning, and winning big. How these two fare long-term, and what
becomes of the majority of us who don’t fall into either group, is an
open question – and one vitally important to the future of the Internet.

In its early days, there was a lot of talk about the “natural laws of
the Internet” and how it would empower the masses, upend traditional
power blocs, and spread freedom throughout the world. The international
nature of the Internet made a mockery of national laws. Anonymity was
easy. Censorship was impossible. Police were clueless about cybercrime.
And bigger changes were inevitable. Digital cash would undermine
national sovereignty. Citizen journalism would undermine the media,
corporate PR, and political parties. Easy copying would destroy the
traditional movie and music industries. Web marketing would allow even
the smallest companies to compete against corporate giants. It really
would be a new world order.

Some of this did come to pass. The entertainment industries have been
transformed and are now more open to outsiders. Broadcast media has
changed, and some of the most influential people in the media have come
from the blogging world. There are new ways to run elections and
organize politically. Facebook and Twitter really did help topple
governments. But that was just one side of the Internet’s disruptive
character. Today, traditional corporate and government power is
ascendant, and stronger than ever.

On the corporate side, power is consolidating around both vendor-managed
user devices and large personal-data aggregators. It’s a result of two
current trends in computing. First, the rise of cloud computing means
that we no longer have control of our data. Our e-mail, photos,
calendar, address book, messages, and documents are on servers belonging
to Google, Apple, Microsoft, Facebook, and so on. And second, the rise
of vendor-managed platforms means that we no longer have control of our
computing devices. We’re increasingly accessing our data using iPhones,
iPads, Android phones, Kindles, Chromebooks, and so on. Even Windows 8
and Apple’s Mountain Lion are heading in the direction of less user control.

I have previously called this model of computing feudal. Users pledge
allegiance to more powerful companies who, in turn, promise to protect
them from both sysadmin duties and security threats. It’s a metaphor
that’s rich in history and in fiction, and a model that’s increasingly
permeating computing today.

Feudal security consolidates power in the hands of the few. These
companies act in their own self-interest. They use their relationship
with us to increase their profits, sometimes at our expense. They act
arbitrarily. They make mistakes. They’re deliberately changing social
norms. Medieval feudalism gave the lords vast powers over the landless
peasants; we’re seeing the same thing on the Internet.

It’s not all bad, of course. Medieval feudalism was a response to a
dangerous world, and depended on hierarchical relationships with
obligations in both directions. We – especially those of us who are not
technical – like the convenience, redundancy, portability, automation,
and shareability of vendor-managed devices. We like cloud backup. We
like automatic updates. We like it that Facebook just works – from any
device, anywhere.

Government power is also increasing on the Internet. Long gone are the
days of an Internet without borders, and governments are better able to
use the four technologies of social control: surveillance, censorship,
propaganda, and use control. There’s a growing “cyber sovereignty”
movement that totalitarian governments are embracing to give them more
control – a change the US opposes, because it has substantial control
under the current system. And the cyberwar arms race is in full swing,
further consolidating government power.

In many cases, the interests of corporate and government power are
aligning. Both corporations and governments want ubiquitous
surveillance, and the NSA is using Google, Facebook, Verizon, and others
to get access to data it couldn’t otherwise obtain. The entertainment
industry is looking to governments to enforce its antiquated business
models. Commercial security equipment from companies like Blue Coat and
Sophos is
being used by oppressive governments to surveil and censor their
citizens. The same facial recognition technology that Disney uses in its
theme parks also identifies protesters in China and Occupy Wall Street
activists in New York.

What happened? How, in those early Internet years, did we get the future
so wrong?

The truth is that technology magnifies power in general, but the rates
of adoption are different. The unorganized, the distributed, the
marginal, the dissidents, the powerless, the criminal: they can make use
of new technologies faster. And when those groups discovered the
Internet, suddenly they had power. But when the already powerful big
institutions finally figured out how to harness the Internet for their
needs, they had more power to magnify. That’s the difference: the
distributed were more nimble and were quicker to make use of their new
power, while the institutional were slower but were able to use their
power more effectively. So while the Syrian dissidents used Facebook to
organize, the Syrian government used Facebook to identify dissidents.

All isn’t lost for distributed power, though. For institutional power
the Internet is a change in degree, but for distributed power it’s a
change of kind. The Internet gives decentralized groups – for the first
time – access to coordination. This can be incredibly empowering, as we
saw in the SOPA/PIPA debate, Gezi, and Brazil. It can invert power
dynamics, even in the presence of surveillance, censorship, and use control.

There’s another more subtle trend, one I discuss in my book Liars and
Outliers. If you think of security as an arms race between attackers and
defenders, technological advances – firearms, fingerprint
identification, lockpicks, the radio – give one side or the other a
temporary advantage. But most of the time, a new technology benefits the
attackers first.

We saw this in the early days of the Internet. As soon as the Internet
started being used for commerce, a new breed of cybercriminal emerged,
immediately able to take advantage of the new technology. It took police
a decade to catch up. And we saw it on social media, as political
dissidents made quicker use of its organizational powers before
totalitarian regimes were able to use it effectively as a surveillance
and propaganda tool. The distributed are not hindered by bureaucracy,
and sometimes not by laws or ethics. They can evolve faster.

This delay is what I call a “security gap”. It’s greater when there’s
more technology, and in times of rapid technological change. And since
our world is one in which there’s more technology than ever before, and
a greater rate of technological change than ever before, we should
expect to see a greater security gap than ever before. In other words,
there will be an ever-longer window in which the nimble distributed
power can make use of new technologies before the slow institutional
power learns to use them more effectively.

It’s quick vs. strong. To return to medieval metaphors, you can think of
a nimble distributed power – whether marginal, dissident, or criminal –
as Robin Hood. And you can think of ponderous institutional power – both
government and corporate – as the Sheriff of Nottingham.

So who wins? Which type of power dominates in the coming decades?

Right now, it looks like institutional power. Ubiquitous surveillance
means that it’s easier for the government to round up dissidents than it
is for the dissidents to anonymously organize. Data monitoring means it
is easier for the Great Firewall of China to block data than it is to
circumvent it. And as easy as it is to circumvent copy protection
schemes, most users can’t do it.

This is largely because leveraging power on the Internet requires
technical expertise, and most distributed power groups don’t have that
expertise. Those with sufficient technical ability will be able to stay
ahead of institutional power. Whether it’s setting up your own e-mail
server, effectively using encryption and anonymity tools, or breaking
copy protection, there will always be technologies that are one step
ahead of institutional power. This is why cybercrime is still pervasive,
even as institutional power increases, and why organizations like
Anonymous are still a social and political force. If technology
continues to advance – and there’s no reason to believe it won’t – there
will always be a security gap in which technically savvy Robin Hoods can
operate.

My main concern is for the rest of us: everyone in the middle. These are
people who don’t have the technical ability to evade either the large
governments and corporations that are controlling our Internet use, or
the criminal and hacker groups who prey on us. These are the people who
accept the default configuration options, arbitrary terms of service,
NSA-installed back doors, and the occasional complete loss of their
data. In the feudal world, these are the hapless peasants. And it’s even
worse when the feudal lords – or any powers – fight each other.

As anyone watching Game of Thrones knows, peasants get trampled when
powers fight: when Facebook, Google, Apple, and Amazon fight it out in
the market; when the US, EU, China, and Russia fight it out in
geopolitics; or when it’s the US vs. the terrorists or China vs. its
dissidents. The abuse will only get worse as technology continues to
advance. In the battle between institutional power and distributed
power, more technology means more damage. Cybercriminals can rob more
people more quickly than criminals who have to physically visit everyone
they rob. Digital pirates can make more copies of more things much more
quickly than their analog forebears. And 3D printers mean that the data
use restriction debate now involves guns, not movies. It’s the same
problem as the “weapons of mass destruction” fear: terrorists with
nuclear or biological weapons can do a lot more damage than terrorists
with conventional explosives.

It’s a numbers game. Very broadly, assume there’s a particular crime
rate society is willing to tolerate. With historically inefficient
criminals, we were willing to live with some percentage of criminals in
our society. As technology makes each individual criminal more powerful,
the percentage we can tolerate decreases. This is essentially the
“weapons of mass destruction” debate: as the amount of damage each
individual terrorist can do increases, we need to do increasingly more
to prevent even a single terrorist success.
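
To make that arithmetic concrete, here is a minimal back-of-the-envelope
sketch, in Python, of the numbers game described above. It is not from
the essay; the harm budget and per-attacker damage figures are invented
purely for illustration. The point is simply that, with a fixed amount
of total harm society will tolerate, the tolerable number of attackers
falls in direct proportion to the damage each attacker can do.

    # Back-of-the-envelope model of the "numbers game" above.
    # Assumption: society tolerates a fixed total amount of harm per
    # year; the budget and per-attacker figures are invented for
    # illustration only.

    TOLERABLE_TOTAL_HARM = 1_000_000  # arbitrary damage units per year

    def tolerable_attackers(harm_per_attacker: float) -> float:
        """Number of attackers society can absorb before total harm
        exceeds its fixed budget."""
        return TOLERABLE_TOTAL_HARM / harm_per_attacker

    # As technology raises the damage one attacker can do, the number
    # of attackers we can live with drops proportionally.
    for harm in (10, 1_000, 100_000):
        print(f"harm per attacker: {harm:>7,} -> "
              f"tolerable attackers: {tolerable_attackers(harm):,.0f}")

Running it prints 100,000, then 1,000, then 10 tolerable attackers: a
ten-thousand-fold increase in per-attacker damage shrinks the tolerable
population of attackers by the same factor, which is why the rhetoric
shifts from crime rates to preventing even a single success.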

The more destabilizing the technologies, the greater the rhetoric of
fear, and the stronger institutional power will get. This means even
more repressive security measures, even if the security gap means that
such measures are increasingly ineffective. And it will squeeze the
peasants in the middle even more.

Without the protection of feudal lords, we’re subject to abuse by
criminals and other feudal lords, and often we have no option but to
align with someone. But both these corporations and the
government – and sometimes the two in cahoots – are using their power to
their own advantage, trampling on our rights in the process. And without
the technical savvy to become Robin Hoods ourselves, we have no recourse
but to submit to whatever institutional power wants.

So what happens as technology increases? Is a police state the only
effective way to control distributed power and keep our society safe? Or
do the fringe elements inevitably destroy society as technology
increases their power? Probably neither doomsday scenario will come to
pass, but figuring out a stable middle ground is hard. These questions
are complicated, and dependent on future technological advances that we
cannot predict. But they are primarily political questions, and any
solutions will be political.

In the short term, we need more transparency and oversight. The more we
know of what institutional powers are doing, the more we can trust that
they are not abusing their authority. We have long known this to be true
in government, but we have increasingly ignored it in our fear of
terrorism and other modern threats. This is also true for corporate
power. Unfortunately, market dynamics will not necessarily force
corporations to be transparent; we need laws to do that. The same is
true for decentralized power; transparency is how we will differentiate
political dissidents from criminal organizations.

Oversight is also critically important, and is another long-understood
mechanism for checking power. This can be a combination of things:
courts that act as third-party advocates for the rule of law rather than
rubber-stamp organizations, legislatures that understand the
technologies and how they affect power balances, and vibrant
public-sector press and watchdog groups that analyze and debate the
actions of those wielding power.

Transparency and oversight give us the confidence to trust institutional
powers to fight the bad side of distributed power, while still allowing
the good side to flourish. For if we are going to entrust our security
to institutional powers, we need to know they will act in our interests
and not abuse that power. Otherwise, democracy fails.

In the longer term, we need to work to reduce power differences. The key
to all of this is access to data. On the Internet, data is power. To the
extent the powerless have access to it, they gain in power. To the
extent that the already powerful have access to it, they further
consolidate their power. As we look to reduce power imbalances, we
have to look at data: data privacy for individuals, mandatory disclosure
laws for corporations, and open government laws.

Medieval feudalism evolved into a more balanced relationship in which
lords had responsibilities as well as rights. Today’s Internet feudalism
is both ad-hoc and one-sided. Those in power have a lot of rights, but
increasingly few responsibilities or limits. We need to rebalance this
relationship. In medieval Europe, the rise of the centralized state and
the rule of law provided the stability that feudalism lacked. The Magna
Carta first forced responsibilities on governments and put humans on the
long road toward government by the people and for the people. In
addition to again reining in government power, we need similar
restrictions on corporate power: a new Magna Carta focused on the
institutions that abuse power in the 21st century.

Today’s Internet is a fortuitous accident: a combination of an initial
lack of commercial interests, government benign neglect, military
requirements for survivability and resilience, and computer engineers
building open systems that worked simply and easily. Corporations have
turned the Internet into an enormous revenue generator, and they’re not
going to back down easily. Neither will governments, which have
harnessed the Internet for political control.

We’re at the beginning of some critical debates about the future of the
Internet: the proper role of law enforcement, the character of
ubiquitous surveillance, the collection and retention of our entire
life’s history, how automatic algorithms should judge us, government
control over the Internet, cyberwar rules of engagement, national
sovereignty on the Internet, limitations on the power of corporations
over our data, the ramifications of information consumerism, and so on.

This won’t be an easy period for us as we try to work these issues out.
Historically, no shift in power has ever been easy. Corporations have
turned our personal data into an enormous revenue generator, and they’re
not going to back down. Neither will governments, which have harnessed
that same data for their own purposes. But we have a duty to tackle this
problem.

Data is the pollution problem of the information age. All computer
processes produce it. It stays around. How we deal with it – how we
reuse and recycle it, who has access to it, how we dispose of it, and
what laws regulate it – is central to how the information age functions.
And I believe that just as we look back at the early decades of the
industrial age and wonder how society could ignore pollution in its rush
to build an industrial world, our grandchildren will look
back at us during these early decades of the information age and judge
us on how we dealt with the rebalancing of power resulting from all this
new data.

I can’t tell you what the result will be. These are all complicated
issues, and require meaningful debate, international cooperation, and
innovative solutions. We need to decide on the proper balance between
institutional and decentralized power, and how to build tools that
amplify what is good in each while suppressing the bad.

-------

