What advantage does Signal protocol have over basic public key encryption?

Karl gmkarl at gmail.com
Sun Jan 31 20:38:25 PST 2021


Hi, David.

Still crazy over here.

I disagree with your view of handling technically powerful
surveillance and compromise ("feds").  It is my belief that many other
people are interested in handling this better, too, rather than
discarding it, especially with politics so polarised in the USA.

On 1/31/21, David Barrett <dbarrett at expensify.com> wrote:
> That.  In the real world, we can't all hand build and personally operate
> our own billion dollar fab to ensure atomic-level security of our entire
> vertical supply chain.  And even if you could... who's to say the Feds

No, but the closer we come to doing that, the more potential issues
are handled.  I don't know a lot about threat modeling, but I do
understand that it is not black and white; it is a spectrum of gray
areas with complex patterns.

> don't sneak in and swap your device with a perfect duplicate when you
> aren't looking?  Ultimately if you are trying to protect yourself from the
> combined might of, oh, 8 billion other people, you're going to have a tough
> time of it.  I'm not building for that use case (nor is anyone else).  I'm

People are indeed building for this use case.  You see concern for
nation-state adversaries more and more present in open source
software.

> https://www.nitrokey.com/news/2020/nitropad-secure-laptop-unique-tamper-detection
>
>
> How do you know they aren't an NSA front?  Ultimately, you can't.  At some
> point you've got no choice but to trust someone.

This wasn't the message of mine you were replying to, but in my
opinion the fewer people you need to trust, the easier it is to verify
their trustworthiness.
Everyone is an NSA front so long as groups like the NSA can surveil
and pressure them.

- The NitroPad is marketed to investigative journalists.  Even
acknowledging that investigative journalists _exist_ is a point of trust.
- The NitroPad supports Qubes.  Qubes is a community-built operating
system that raised the public norms of security.
- The NitroPad uses coreboot, an open-source bootloader the community
has been clamoring to see more widely used.  An open-source bootloader
makes it much riskier to plant backdoors.

I'm sure somebody with more familiarity with nitrokey could enumerate
many more points of trustworthiness compared to what you normally see.

> It would make sense to contribute or work with a project like Signal rather
>> than making a new messenger
>
>
> Well my job is to secure the privacy of Expensify's millions of users, not
> just shut down shop and tell them to use Signal (especially since Signal
> doesn't offer the wide range of preaccounting functionality we do).

That wasn't what I said, but I'm quite happy you contributed to
SQLite.  Signal also offers standalone libraries in Java, C, and
JavaScript.  See near the bottom of https://signal.org/docs/ .  Signal
is also one messenger in a huge ecosystem of messengers, any of which
may have already done some of what you are trying to do.

Of course, I don't really understand the role of messaging in
Expensify well enough to have ground to stand on here.

> The only reasonable way to sell something on an app store is to distribute
>> a binary.  Meanwhile with the source available, people can build their
>> own
>> clients, and share them via other channels.
>
>
> I totally agree, no real world system can grow if it presumes everyone
> builds their own binaries (and presumably inspects the source code
> personally to make sure nothing was slipped in via Github when pulling down
> the repo).  My only point is real-world systems do not exist in a vacuum:
> the only way to realistically build a secure communication system used by
> billions is to rely upon trusting _the very people you don't want to
> monitor you_ to allow you to do so without interference.  It's a harsh,
> brutal reality, but there it is.

I think of things like app stores as more a way to get the app known
and used.  They are also only one avenue of compromise.  It only takes
one targeted person comparing their downloaded binary against a
known-good build to identify a compromise, and if they have a lawyer
friend, things improve from there.

In communities where people get messed with, often there is a techy
person who can build things from source when relevant.

>> I visited expensify.cash but didn't notice an obvious link to the source
>> code.  It can be hard for me to see things, though.
>
> Ah, sorry!  It's a new project so we're mostly focused on curating a small
> set of dedicated contributors before going too wide.  But you can see it
> here: https://github.com/Expensify/Expensify.cash  -- Sign up at
> https://Expensify.cash and enter your Github handle, and we'll invite you
> to the various testflights and such.  We're also hiring freelancers to help
> out -- here are a bunch of open jobs, and we're adding more all the time:
>
> https://www.upwork.com/search/jobs/?q=Expensify%20React%20Native

Thanks for your welcoming reply.

> Thank you so much for your open source work.  Please work with existing
>> open source projects when you can both benefit from the work, so the
>> community can grow.
>
>
> I 100% agree.  We're major sponsors of SQLite, and have also open-sourced
> our BedrockDB.com, which was running a blockchain in production before it
> was cool: https://bedrockdb.com/blockchain.html
>
>
> Here is information on signal's reproducible builds:
>> https://github.com/signalapp/Signal-Android/tree/master/reproducible-builds
>> You actually can verify that the app from the play store is the one you
>> have the source to.
>
>
> Whoa, that's neat!  But doesn't change my point: unless everyone is doing
> this -- and doing it for every single upgrade -- it doesn't really matter.
> This is a neat parlor trick to create a sense of trust, but I think it's a

It's not really a parlor trick.  Developers have been working very,
very hard to make reproducible builds a reality.  Signal's approach is
still nascent.  Verification could be made very easy; the work just
hasn't gone in yet.

Additionally, it really does help even when only a single empowered
person does this.  With secure messaging, they can share proof of the
verification with everybody else.
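As a rough illustration of why a naive hash check fails and the
comparison has to set signing data aside: two reproducible builds of
the same code are byte-identical except for the signature, so the
comparison skips it.  This is a toy sketch in Python (file names and
"signature" contents are invented; Signal's real process uses its own
apkdiff tooling):

```python
import hashlib
import io
import zipfile

def content_digest(apk_bytes: bytes) -> str:
    """Digest of an APK's entries, skipping META-INF/ signing data,
    so builds signed with different keys can still be compared."""
    digest = hashlib.sha256()
    with zipfile.ZipFile(io.BytesIO(apk_bytes)) as z:
        for name in sorted(z.namelist()):
            if name.startswith("META-INF/"):
                continue  # signatures legitimately differ between builders
            digest.update(name.encode())
            digest.update(z.read(name))
    return digest.hexdigest()

def make_apk(signature: bytes, code: bytes = b"identical compiled code") -> bytes:
    """Toy stand-in for an APK: a zip holding code plus signing data."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr("classes.dex", code)
        z.writestr("META-INF/CERT.RSA", signature)
    return buf.getvalue()

store_apk = make_apk(b"google play signing key")
local_apk = make_apk(b"my own debug key")
print(content_digest(store_apk) == content_digest(local_apk))  # same code -> True
```

If the downloaded build's code differed from the local build at all,
the digests would diverge, and that mismatch is the proof one verifier
can pass along.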

> kind of disingenuous performance of security theatre: that's like shining a
> spotlight on Trump's 20 mile border wall while ignoring that it's a very
> incomplete protection.  Verifying the APK that you are getting from Google
> Play never actually makes sense:
>
> 1. If the feds *have* identified your device AND you are a sufficiently
> interesting person of interest that they would force Google to ship you a
> compromised APK -- they could also just force Google to ship you a
> compromised, invisibly-installed system update.  Verifying the APK doesn't
> prove anything.

I thought system updates were distributed by your phone's vendor, as
opposed to Google.  I could be wrong.

Many people do not install system updates, or they run their own ROMs.
Those ROMs can themselves be verified, too.

> 2. If the feds *have not* identified your device, or you are NOT
> sufficiently interesting to warrant being compromised, then verifying the
> APK won't ever find anything.

There are far more avenues than just 'feds'.  Your build machine could
have malware that gets into the APK, or somebody could falsify your
upload.  Google's servers could be compromised.  You could be
personally coerced or bribed into spreading a compromised APK.
Reproducible builds offer trust and transparency: if something does go
wrong, people can see clearly what happened.

> Whether you are or are not a target, if you are installing Signal via
> Google Play, you sorta have no choice but to assume that Signal and Google
> haven't been compromised: if you believe that, then there's no need to
> verify.  But if you do NOT believe that, then verifying it isn't nearly
> enough.  (Especially when no amount of securing your own device will secure
> the device of the person you are talking to -- who if you are a target, is
> probably also a target, or at least you shouldn't assume they aren't.)

There is a huge, extensive system and network path between your source
code and the device of an end user.  Any point on that path can be a
point of compromise, given all the vulnerabilities we have seen over
the years.
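The standard partial mitigation for tampering along that path is
comparing the downloaded artifact against a checksum obtained over a
separate, trusted channel (signed release notes, a key-verified
message, etc.).  A minimal sketch, with invented data:

```python
import hashlib

def verify_artifact(data: bytes, published_sha256: str) -> bool:
    """Compare a downloaded artifact against a checksum published out
    of band; a mismatch flags tampering anywhere along the path."""
    return hashlib.sha256(data).hexdigest() == published_sha256

release = b"bytes of a release tarball"
published = hashlib.sha256(release).hexdigest()  # as if from the trusted channel
print(verify_artifact(release, published))         # untouched download -> True
print(verify_artifact(release + b"!", published))  # tampered in transit -> False
```

This only relocates trust, of course: the check is worth exactly as
much as the channel the checksum arrived over.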

I don't know a lot about threat modeling but I don't assume that the
feds are the only people doing these compromises.  I imagine anyone
with a lot of money could find somebody to do this.

Re insecure devices, secure messaging is kind of silly if the person
you want to be private from is remotely observing what your screen is
displaying.  Everything is a puzzle with many pieces.  We're
responsible for our pieces of that puzzle.

