Fwd: Call for input to President's Commission on Enhancing Cybersecurity

Steve Kinney admin at pilobilus.net
Tue Jul 19 19:22:52 PDT 2016


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

Maybe I will finally qualify for the title Statist Pig with this post.
 One can only hope.

Contrary to the original query's request that replies not consist of
"don't do dumb things," I have composed a very elaborate text that
actually says, "don't do dumb things."

:o)

On 07/19/2016 10:13 PM, Steve Kinney wrote:
> On 07/16/2016 06:22 PM, Joy wrote:
>> - - - Begin forwarded message - - -
> 
>> Date: July 15, 2016 at 3:21:32 PM EDT
>> From: Herb Lin <herblin at stanford.edu>
>> To: "'David Farber (dave at farber.net)'" <dave at farber.net>,
>>   ip <ip at listbox.com>
>> Subject: Call for input to President's Commission on Enhancing
>>   Cybersecurity - bridging the trust gap between the IT community
>>   and the US government
> 
>> You may know that President Obama has established a commission
>> to consider how to strengthen cybersecurity in both the public
>> and private sectors while protecting privacy, ensuring public
>> safety and economic and national security, fostering discovery
>> and development of new technical solutions, and bolstering
>> partnerships between Federal, State, and local government and the
>> private sector in the development, promotion, and use of
>> cybersecurity technologies, policies, and best practices.
> 
> The mission defined above is much more ambitious than it may
> initially appear, because direct conflicts of interest are hard
> wired into it. The "cybersecurity" buzzword embraces a spectrum of
> practical security contexts, from protecting consumer financial
> credentials through shielding "secret" government databases from
> unauthorized access, to preventing malicious alteration of the
> firmware that runs our civil and industrial infrastructure.
> Privacy is not encompassed by the term "cybersecurity" as it is
> intended and understood by those who presently use it in a
> national policy context - but this can and should change.
> 
> Network security addresses practical concerns of privacy, utility, 
> reliability and cost effectiveness as well as countermeasures to 
> stereotypical hacker threats.  The express inclusion of privacy 
> protection in its brief directs the Commission to deliver 
> recommendations directly counter to the interests of private 
> enterprises and government departments which presently collect, 
> analyze, and transfer or act on "private" information about 
> individuals and groups.  If economic security is taken to include 
> protecting the revenue streams of dominant U.S. IT vendors and
> their associated armies of specialized workers in the field,
> either "cybersecurity" or economic security must be sacrificed.  If
> national security is taken to include protecting intelligence
> service access to surveillance and sabotage targets via widely
> distributed security defects in IT products and services, either
> "cybersecurity" or national security must be sacrificed.
> 
> A security model cannot be "just a little bit pregnant."  Every 
> variance or exception that permits violations of any system's 
> specified security protocol creates new vulnerabilities that 
> compromise the security of that system, usually in subtle as well
> as obvious ways. Security threats are both external and internal to
> the enterprise, and include hackers who want to break in for fun
> and/or profit, but also:  Enterprise IT consumers who make
> non-negotiable demands for features and functions that create
> security vulnerabilities; senior executives whose golfing buddies
> know more about network security than the enterprise's entire IT
> staff; IT vendors who are free to hide deficiencies and
> misrepresent their wares under immunity from prosecution or civil
> liability; academics and consultants whose personal fortunes rise
> and fall with the value of vendor-specific credentials; and
> certified technical workforces whose educational and occupational
> background is restricted to a vendor specific context, and includes
> mandatory training as outside sales reps for those same vendors.
> Add to this the massive political influence of dominant U.S. IT
> vendors' senior executives and major shareholders, and our picture
> of an industry hard wired for security failure is complete.  The
> expected end result of the complex of counter-security factors
> outlined above would be smoking rubble, and that is an apt
> description of prevailing network security conditions.
> 
> Pervasive "cybersecurity" failures have prompted the Executive
> branch to prepare for intervention across both government and
> private sector domains; in itself this is evidence that a deep
> systemic disorder harmful to the National Interest has been
> recognized and acknowledged. Developing a functional model that
> explains why an emergency exists is the first step toward reliably
> and durably ending it.  The inclusive nature of the Commission's
> mandate requires it to address "cybersecurity" in a holistic
> manner.  The systemic disorders listed above are inherent in the
> present economic and political relationships of parties whose
> inputs control IT security across all domains.
> 
> Effective solutions will be called "radical" and rightly so, as
> one must change the underlying economic and political relationships
> that drive the ongoing failure of "cybersecurity" to achieve
> meaningful results.  Bolting layers of external reinforcement onto
> a broken machine does not fix the machine; it only prolongs its
> ability to produce broken outputs.  Like an urban renewal project,
> implementing an effective national "cybersecurity" strategy begins
> with a wrecking ball.  If this is not an acceptable option,
> "enhanced cybersecurity" is not a possible outcome.
> 
>> Recognizing that trust is hard to build and easy to destroy (and
>> a variety of things have occurred over the last 20 years to do
>> the latter), one issue that has come up is the 
>> enormous gap of trust between the U.S. government and the 
>> information technology (IT) community, from which many IPers are 
>> drawn.  This rift is not helpful to either side, and I'd like to 
>> solicit input from the IP community about what you think the 
>> government can do or refrain from doing to help bridge that gap.
> 
> In the present context, trust has two distinct and nearly opposite 
> definitions:  In a political context, trust means confidence 
> cultivated to further a collaborative and/or manipulative agenda.
> In a network security context, trust is a controlled asset whose
> role is minimized on every front and excluded where and as
> possible:  A trusted actor or system is one that can break your
> security model.
> 
> A competent IT security strategy compartmentalizes, simplifies and 
> hardens the handling of protected assets.  Tools must be selected
> and protocols designed on a case by case basis to enable a given 
> enterprise or department's necessary functions while minimizing 
> exposure of its assets to hostile actors.  Trust is rationed, and 
> dispensed only where and as the benefits of trust outweigh the
> risks.
> 
> As a simple example illustrating the role of trust in
> "cybersecurity," all major web browsers automatically download and
> execute software as directed by any website their users visit,
> without the user's knowledge or express consent.  Large families of
> critical security vulnerabilities grow from this promiscuous trust
> model.  Many botnets propagate themselves via this vector, which
> has also enabled targeted attacks compromising "secured" assets
> affecting major corporations and government agencies.  Browser
> makers build automatic execution of third-party software into their
> browsers because both end users and major commercial website
> operators demand it.  Vendor efforts to mitigate this critical
> security threat by 'sanitizing and sandboxing' incoming executable
> code can reduce but not reliably prevent high impact security
> incidents arising from a fundamentally insecure trust model.
> 
> Informed end users can install tools like NoScript which prevent
> the browser from downloading and executing software without the
> user's express consent.  Individual websites can be whitelisted by
> the user, where and as the benefits of automatically executing
> arbitrary software from a given site are believed to outweigh the
> risks of doing so.  In practice this trivially simple trust-based
> security measure has proven itself orders of magnitude more
> effective than a promiscuous trust model "mitigated" by complex,
> failure-prone defenses against hostile code.
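> 
> A minimal sketch of that default-deny decision, in Python and purely
> for illustration (the whitelist contents and function name here are
> invented, not taken from NoScript itself):
> 
>     # Illustrative default-deny trust check: run third-party code
>     # only from origins the user has explicitly whitelisted.
>     TRUSTED_ORIGINS = {"intranet.example.org"}  # hypothetical list
> 
>     def may_execute(script_origin: str) -> bool:
>         # Deny by default; allow only explicit trust decisions.
>         return script_origin in TRUSTED_ORIGINS
> 
>     for origin in ("intranet.example.org", "ads.tracker.example"):
>         verdict = "execute" if may_execute(origin) else "block"
>         print(origin, "->", verdict)
> 
> Nothing runs unless the user has made a deliberate trust decision;
> everything else is refused by default.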
> 
> Bridging the trust gap between the IT community and the US
> government is already a done deal, because there has never been
> one.  The U.S. government funded and directed the creation of the
> IT industry.  As indicated above, the existing bridges enabling
> public/private partnership in IT enterprises create promiscuous
> trust relationships in the face of conflicts of interest, perverse
> incentives and institutional inertia all working against the
> objectives of "cybersecurity."  Rather than reinforcing them, these
> bridges must be locked down or removed as the first step toward
> enhancing "cybersecurity."
> 
>> 1 - Your best examples of things the government (and what part
>> of the US government) has done to alienate the IT community 
>> specifically. (Or, at the very least, show how the examples you 
>> provide connect to the interests of the IT community.)
> 
> The U.S. government has not alienated the IT community:  It has 
> shielded this community from liability for fraudulent performance 
> claims, fed it billions of dollars of annual revenue, and given 
> Fortune 500 IT corporations nearly full control of government
> policy affecting those same corporations.  The intimate partnership
> of IT vendors and government decision makers has, however,
> alienated a large segment of the public at large.  With regard to
> privacy concerns, IT vendors are correctly perceived as the
> government's partners in domestic mass surveillance.  The interests
> of the IT community are directly served by the government's nearly
> absolute tolerance for commercial mass surveillance, inherently
> insecure products and protocols, forced obsolescence strategies and
> abusive marketing practices - all of which are routinely
> implemented by major IT vendors to reduce costs and/or enhance
> revenues.
> 
> The current condition of gross insecurity across private and State 
> owned IT assets is a product of the dominant role of vendors who
> are richly rewarded for exploiting the technological ignorance of
> private and public sector decision makers.  The cumulative cost of
> unstable, insecure IT infrastructure supplied and serviced by
> parasitic vendors greatly exceeds the short term costs of
> replacement with stable, security oriented infrastructure; but
> perverse incentives and conflicts of interest assure that no such
> course can be taken absent dynamic and determined public sector
> leadership.
> 
>> 2 - Things that the U.S. government could realistically do in
>> the short and medium term (i.e., 0-10 year time frame) that would
>> help bridge the trust gap.  If your answer is "Don't do dumb
>> things!", it would be better and more useful to provide
>> *examples* of what not to do.
> 
> Revoke software vendors' blanket immunity from prosecution for 
> consumer fraud and from liability for damage caused by failure to 
> control product defects.  Where there is no accountability, there
> is no motivation to spend money on security and no rational basis
> for consumer trust.  The infallible invisible hand of the Free
> Market cannot produce security, quality or innovation where the
> State grants special immunity from prosecution and civil liability
> to privileged parties.
> 
> Mandate security evaluations based on performance and design
> metrics for all software (and firmware) purchased for use by
> government agencies and departments.  These evaluations must
> include examination of the specific product offerings under
> consideration, and the bidder's historical security track record
> across all products.  Total cost of ownership calculations for IT
> assets must include estimated costs of potential security failures,
> and projected costs of recovery from same, proportionally adjusted
> to reflect the relative security performance of each competing
> bidder's products.  This could be facilitated by the establishment
> of a transparent, accountable Federal activity that collects
> relevant data and produces reports in a standardized format
> consistent with government procurement processes.
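> 
> One hypothetical way to express that proportional adjustment, as a
> Python sketch (the figures, weighting and function name are invented
> for illustration, not a prescribed formula):
> 
>     # Illustrative only: fold expected breach costs, weighted by a
>     # bidder's historical incident rate relative to an industry
>     # baseline, into total cost of ownership.
>     def adjusted_tco(acquisition, operations,
>                      expected_breach_cost,
>                      incident_rate, baseline_rate):
>         risk = incident_rate / baseline_rate
>         return acquisition + operations + expected_breach_cost * risk
> 
>     # Invented example: vendor B is cheaper up front but has twice
>     # the baseline incident rate, so its adjusted TCO is higher.
>     print(adjusted_tco(1_000_000, 400_000, 2_000_000, 0.01, 0.02))
>     print(adjusted_tco(800_000, 350_000, 2_000_000, 0.04, 0.02))
> 
> The point is that a bidder's security history can reverse a ranking
> based on sticker price alone.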
> 
> Mandate reporting of security incidents by every government
> activity, and every commercial enterprise with a State or Federal
> tax ID, where financial losses and costs of remediation and
> recovery from the incident exceed $5,000.00.  Require reporting of
> the category of failure, specific software tools that presented the
> vulnerabilities exploited, direct losses incurred, and the costs of
> remedial and recovery measures taken.  Specify that aggregate data
> from these reports be made available to the public on at least a
> quarterly basis.
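> 
> A standardized incident report of the kind described might carry
> fields along these lines; this is a sketch only, and the field names
> and threshold check are illustrative rather than a proposed schema:
> 
>     from dataclasses import dataclass
> 
>     REPORTING_THRESHOLD_USD = 5_000  # threshold proposed above
> 
>     @dataclass
>     class IncidentReport:
>         # Category of failure, e.g. "phishing", "unpatched service"
>         category_of_failure: str
>         # Specific software tools presenting the exploited flaws
>         software_involved: list[str]
>         direct_losses_usd: float
>         remediation_cost_usd: float
>         recovery_cost_usd: float
> 
>         def must_be_reported(self) -> bool:
>             total = (self.direct_losses_usd
>                      + self.remediation_cost_usd
>                      + self.recovery_cost_usd)
>             return total > REPORTING_THRESHOLD_USD
> 
> Quarterly aggregates of such records would supply the public
> reporting described above.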
> 
> Direct the Federal Communications Commission to conduct and
> annually review studies on the privacy impacts, positive and
> negative, of deployed and proposed network communication protocols
> and Standards, publicly report their findings, and solicit public
> comments in a transparent process.  Mandate that all reports
> reference IETF RFC 6973, Privacy Considerations, as guidance in
> identifying, naming and evaluating adverse and beneficial privacy
> impacts of deployed and proposed network communication protocols
> and architectures.
> 
>> 3 - Things that the U.S. government could realistically do in
>> the longer term to do the same.
> 
> See above.  A durable commitment of all necessary resources to
> assure that the measures suggested in response to query 2 are
> effectively implemented would create and sustain rational,
> constrained trust relationships affecting all those aspects of
> "cybersecurity" which are properly the government's business.
> 
> The requirement that recommendations be "realistic" is
> regrettable. "Practicable" would have been better language.  A
> realistic proposal might be considered one that will not provoke
> a do-or-die defense of the status quo from dominant IT vendors,
> U.S. intelligence activities, and others whose bread and butter is
> "cyber insecurity."
> 
> A practicable proposal would be one that is within the scope of
> public policy authorities and industry capabilities:  Vendors who
> assert that requirements are "impossible" or simply refuse to
> comply will be replaced by vendors who are ready to step forward
> and meet any challenges presented.  Solutions to many of today's
> most serious and widespread network security failures are already
> available as off-the-shelf products from vendors with excellent
> security track records. The proposals presented under query 3 above
> may not be considered reasonable by dominant industry stakeholders,
> but they are practicable, and these or materially similar policy
> initiatives are necessary if the President is serious about getting
> the results he has asked for.
> 
> Steve Kinney
> 
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v2.0.22 (GNU/Linux)

iQEcBAEBAgAGBQJXjuB8AAoJEECU6c5Xzmuq3a0H/irlhwCoGeBc9QFjnIT3OvPg
AFDw/rw+NLwI7GrJyMyr+Sz4gxsC3CXGy5UuasLHtzDlN7nKN2kzvAYl3lj2TihD
it8aoQ5C2oK5hGu/Vz12hjuH9DJxOHr1ctlACpyTBeIw5MwLJFap/MMi8Q76z/ZA
7X4tQLGeMCkQeHXS3wSmYTcMv1/Zg+LaYiTJ/Jnuy9hMlGDiv7mTGOEWsrPKZ4bT
t1h+cFRKK+yPD99mYT3qc68jFlVGEas76IhQFNbXwUl8/N+eKRvXTJJpiWZtWbRN
Xm8rZWr5aLnI2RNMiIPfn2YzhUQimQCVE1HnDn6yHMQg98DYzlqRYsBMWvTNIqM=
=Mo7H
-----END PGP SIGNATURE-----


