Malicious, targeted, OS updates. How likely do you think it is?
A few days ago, I was thinking about ways to compromise even the most secure systems, and I came across a fairly obvious one: operating system updates. I admit that I am not up to date on the latest security research, so please excuse me if this has been discussed before or is 'common knowledge'.

What's stopping the FBI or another US law enforcement agency from compelling a US-based operating system vendor, let's say Red Hat, to deliver a specialized update to a single user that would give the agency privileged and maybe even undetectable access to a target system? Since Red Hat has root on our systems, they could install whatever they want and most users wouldn't notice. For a company like Red Hat it would be trivial, since they know who you are: you are tied to your Red Hat subscription. But this is by no means limited to them. Microsoft could do this too with a little more work.

What are your thoughts? Am I crazy? Is this a 'well, we KNOW THAT already' moment that I am just catching up on?

Thanks!
Anthony

--
Skype: cajuntechie
XMPP/Jabber: papillion@dukgo.com
PGP Key: 0xCC9D1E072AC97369
Validate My Key: https://keybase.io/cajuntechie
Other Info: http://www.cajuntechie.org/p/my-pgp-key.html
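To make the 'targeted' part of that threat concrete: package signature checks only prove that the vendor signed what you received, not that you received the same bits as everyone else. Below is a rough, hypothetical Python sketch of one crude cross-check - comparing your copy against independent mirrors. The mirror URLs, package name, and paths are all invented for illustration; this is not any vendor's real tooling.

import hashlib
import urllib.request

# Hypothetical mirror list - these URLs are invented for illustration.
MIRRORS = [
    "https://mirror-a.example.org/updates/",
    "https://mirror-b.example.net/updates/",
]

def sha256_of(data):
    return hashlib.sha256(data).hexdigest()

def fetch(url):
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read()

def cross_check(package_name, local_path):
    """True if every mirror serves a byte-identical copy of the package we
    were given; False suggests we may have received a targeted build."""
    with open(local_path, "rb") as f:
        local_digest = sha256_of(f.read())
    return all(
        sha256_of(fetch(base + package_name)) == local_digest
        for base in MIRRORS
    )

if __name__ == "__main__":
    # Illustrative names only.
    ok = cross_check("example-package-1.0-1.x86_64.rpm", "/tmp/example.rpm")
    print("matches public mirrors" if ok else "WARNING: differs from mirrors")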
Use FreeBSD, build from source ;) -- John
On 01/18/2017 02:30 PM, John Newman wrote:
Use FreeBSD, build from source ;)
Security regression paradox: What's to prevent whoever might have replaced the binary in the repo - or replaced it in transit to you - from also rigging the source? So you have to audit the source. And the compiler that makes the source useable might have already been compromised, so audit its source and then... oops, compile the audited compiler using a potentially compromised compiler on a potentially compromised OS.

This problem is no reason to just give up, but it does transform the security picture from a purely imaginary secure vs. insecure binary state to an ecosystem of context-dependent compromise solutions.

The cost of an "acceptable" security result depends on this question: What is it worth to an adversary to break your security model, vs. how much is preventing compromise of that asset worth to you? If an adversary spends less to successfully attack an asset than they gain by doing so, the adversary wins. If you spend more to successfully defend an asset than that asset is worth to you, you lose.

This context provides a rational basis for allocating resources to security, but alas, it rules out absolute values and one-size-fits-all solutions: Who are your potential adversaries, what motivates them, and what resources are available to them? Who benefits from your security strategy, and what are they willing and able to pay - in additional work, constraints on their behavior, and cash money - to secure the assets in question? A security model that does not take these factors into account is a snake oil security model, regardless of the quality of the tools used.
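Put as a toy calculation, the break-even rule above looks like this (Python, with numbers invented purely for illustration):

def attack_is_rational(attack_cost, attacker_gain):
    # The adversary wins if breaking your model costs less than it earns them.
    return attack_cost < attacker_gain

def defence_is_rational(defence_cost, asset_value_to_you):
    # You lose if defending the asset costs more than the asset is worth to you.
    return defence_cost <= asset_value_to_you

# Invented numbers: a $5,000 attack against data the adversary values at
# $50,000 is rational for them; spending $100,000 to protect data worth
# $20,000 to you is not rational for you, even if the defence would hold.
print(attack_is_rational(5_000, 50_000))      # True
print(defence_is_rational(100_000, 20_000))   # False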
On Jan 18, 2017, at 4:17 PM, Steve Kinney <admin@pilobilus.net> wrote:
On 01/18/2017 02:30 PM, John Newman wrote:
Use FreeBSD, build from source ;)
Security regression paradox: What's to prevent whoever might have replaced the binary in the repo - or replaced it in transit to you - from also rigging the source? So you have to audit the source. And the compiler that makes the source useable might have already been compromised, so audit its source and then... oops, compile the audited compiler using a potentially compromised compiler on a potentially compromised OS.
lol i know, it becomes increasingly apparent how impossible a full audit of all the hardware and software that led to the software that is running your computer would be, even with a totally open source OS ;) Still, gotta take what you can get i guess..
On 1/19/2017 8:59 AM, John Newman wrote:
lol i know, it becomes increasingly apparent how impossible a full audit of all the hardware and software that led to the software that is running your computer would be, even with a totally open source OS ;)
Well, of course, there is FORTH, the world's smallest operating system, development environment, assembler, compiler, and metacompiler. You start by programming the bootstrap in binary, and the bootstrap then assembles and compiles an increasingly powerful assembler, compiler, and metacompiler.

Of course, FORTH relies heavily on the developer to do tasks that are more suited to the operating system or the compiler, making it of limited value for programs larger than 32K or so and disks larger than ten megabytes or so, though Forth can theoretically address disks as large as two hundred and fifty six megabytes. Also, file deletion in Forth is not really practical. But if you have enough Forth-related stuff that you need to worry about deletion on a ten megabyte disk, your Forth environment is already too big and complex.

But it would not be too difficult to have Forth compile a Common Lisp interpreter, the Common Lisp interpreter generate a C compiler, the C compiler generate a Common Lisp compiler and a C++ compiler, and the C++ compiler and the Forth compiler generate an actually useful operating system, and there you are.
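For anyone who has not seen a Forth inner loop, here is a toy sketch of that bootstrap idea - written in Python rather than Forth, and invented purely for illustration: a tiny kernel of primitive words plus a colon-definition mechanism, so the system extends itself out of words it already knows.

stack = []

# Primitive words implemented directly in the host language.
def prim_add():
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

def prim_mul():
    b, a = stack.pop(), stack.pop()
    stack.append(a * b)

def prim_dup():
    stack.append(stack[-1])

def prim_print():
    print(stack.pop())

# The dictionary: primitives are host functions, colon-defined words are
# lists of other word names - the metacompiler idea in miniature.
words = {"+": prim_add, "*": prim_mul, "dup": prim_dup, ".": prim_print}

def execute(word):
    if callable(word):              # primitive
        word()
    else:                           # defined word: run its body
        for tok in word:
            run_token(tok)

def run_token(tok):
    if tok in words:
        execute(words[tok])
    else:
        stack.append(int(tok))      # anything unknown is a number literal

def interpret(source):
    tokens = iter(source.split())
    for tok in tokens:
        if tok == ":":              # start a colon definition
            name = next(tokens)
            body = []
            for t in tokens:
                if t == ";":
                    break
                body.append(t)
            words[name] = body
        else:
            run_token(tok)

# The system grows out of its own earlier layers: "square" is built from
# dup and *, and "cube" is built from square.
interpret(": square dup * ;  : cube dup square * ;  5 cube .")   # prints 125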
participants (5)

- Anthony Papillion
- James A. Donald
- John Newman
- stef
- Steve Kinney