an ominous comment

Zenaan Harkness zen at freedbms.net
Mon Jul 20 16:56:30 PDT 2015


On 7/20/15, Stephen D. Williams <sdw at lig.net> wrote:
> On the other hand, life is a balance.

True. I'm thinking of individuals here.

> I probably shouldn't have tried to
> make the point here, but it is something a security
> professional should understand well: The right amount of security
> should be moderated by the tradeoff of costs vs. overhead vs.
> maximizing benefit vs. minimizing loss.

Corporations are bound by their economic imperative to make such
trade-offs. This is the heart of their sociopathic nature. This is the
part of corporations/companies which needs, somehow, to change in
order to get this world on a better track.

...
> It is terrible that some companies have been too eager to share information.
>  They may or may not have believed whatever safeguards
> were in place, or not cared, etc.  I'm sure a high pressure meeting with an
> FBI crew who are strongly playing the terrorism angle is
> persuasive, as it should be, up to a point.

Here's the kind of talk that looks like a hole freshly dug.

Perhaps if there is an actual existential threat to someone's life or
some building (let's please stop using the T word), then "high
pressure persuasion" would be adequate grounds for a court order
anyway. As it should be - up to the point of a subpoena, summons
and/or order to perform or act - to handle the actual problem.

You seem, though, to be normalising behaviours, approaches and "high
pressure persuasion" tactics by government departments, in a
generalised way. You might not intend the things you imply/say, but
don't be surprised when such positions are mocked or ridiculed. Don't
take such blowback personally though - it's the "normalisation of bad"
and the "plainly wrong/evil" which is being attacked for the bullshit
it is.


> And companies holding your data
> can actually look at that data for business purposes,

Perhaps try something like this instead: "And for-profit, therefore
sociopathic-by-nature, companies do massively collect your metadata
AND your personal information, with or without your consent, and are
well documented - by leaks and reporting - to use and abuse all your
data both within and beyond the law, beyond your expectations, and
beyond what many people consider ethical."

See what we did there? We made it personal, giving the uninitiated a
slight hope of realising something they did not realise before. We
highlighted some foundations (for-profit being inherently
sociopathic). We reminded the reader that their consent is often not
obtained (yes, we can argue about implied consent; the point is we're
edumacating). We made the assertion that companies actually abuse all
that data (whatever "abuse" might mean), just in case someone missed
the memo.

With all this, we are also implying that this abuse is wrong.

Your version sounds like you are -trying- to normalise the wrong,
justify the bad, and 'accept the new messed up world order as best we
can'. We hear enough of that from others. And I say NO to that abuse!
Give me justification for abuse at your peril!


> although how they use it is somewhat bounded by privacy laws (however
> incomplete), not making private things public, unfair business
> practices, etc.  My point was that the existence of large, valuable services
> that depend on a lot of trust is, or should be to a

"should be" trustworthy?

They're companies. You've missed the bloody memo. And a very bloody
memo the corporate record is, across decades and industries!

> sane entity, an even stronger incentive to behave than the patchwork
> of laws.

You're not grokking the incentive. It's profit. And it's more than an
incentive: profit is the foundational, constitutional imperative of
companies (funny, that).

This is why companies can NOT be trusted. You seem to be missing this
basic point. Do you own a company?


> Past oversharing, then embarrassment and public
> abuse, coupled with product impacts as they lose sensitive customers, has
> almost certainly caused a cleanup of those attitudes.  I'd
> be interested in the actual policy right now, although I doubt they are
> going to be too explicit.  I suspect that it also varies
> heavily by corporate culture.

Some companies start with good policy and a good public stance - most
significantly in this conversation, Google itself: "don't be evil".
They don't say that any more. They can't. Did you ever wonder why they
stopped saying that?


> Every day, you are somewhat at the mercy of dozens and perhaps thousands
> of people who could cause you pain, suffering, or death if
> they were so inclined.  There are many in the government, schools, employer
> personnel departments, medical and insurance companies,
> etc.  The people driving around you, stopped at a light while you cross the
> street, making your food, they all have access and the
> ability to inflict misery on you.  You have to trust someone to some extent.

Trust is a relevant foundation of community/society, sure.

But now you've segued into the personal. Which is a good place at
times, an effective place. It's more tangible for people.

But here we were talking about companies. I would ordinarily presume
your trust formula is different for companies than it is for actual,
you know, humans.

I suggest not conflating corporate rights and corporate trust with
human rights and human trust. Not particularly useful in our context.


> The question is who you trust, how incentivized they
> and the people / organization around them protects you, whether wrongs will
> be limited, corrected, and righted or not.

A rational approach is warranted for sure.

Companies, and in most cases the humans working for them, are
predominantly incentivized by money. Yesterday I read an article on
the Great Wall of China. Incredible vision, so many centuries of
building. But when the time came that it was 'needed' - with only so
many sentries, spread so far apart, and paid so little - when the
marauding Mongols wanted in to do some marauding, they just bribed a
sentry or two. Apparently the same happened with the Europeans in more
recent times. So: the people were incentivized; the wall was not
secure. The biggest security theater.

I think the Great Wall may have been useful psychologically though...
to encourage a mindset of unity in the people within.


> For a long time, as a contractor at the peak of their heyday, I had access
> to AOL's entire user database, complete with name,
> address, full credit card info, phone numbers, etc.  I could have also
> snooped on their Buddylists, their person-to-person video
> (Instant Images), and a lot more.  There was zero chance that I would abuse
> any of that.

Your ethics are admirable. I share your personal intentions. I don't
trust companies though, except to plunder markets for the maximum
profit possible.

Zenaan

> sdw
>
> On 7/20/15 2:07 PM, Juan wrote:
>>
>> 	cypherpunk :
>>
>> 	https://www.wikileaks.org/Op-ed-Google-and-the-NSA-Who-s.html
>>
>> 	"Google and the NSA: Who’s holding the ‘shit-bag’ now?"
>>
>>
>> 	Not-cypherpunk-at-all :
>>
>>
>>> 2015-07-19 2:22 GMT+09:00 Stephen D. Williams <sdw at lig.net>:
>>>
>>> I feel perfectly confident that Google is going to protect their
>>> billions in income and valuation by being very careful with
>>> avoiding abusing their data or users in any strong sense.
