Tox.im

Markus Ottela oottela at cs.helsinki.fi
Tue Feb 3 14:54:07 PST 2015


On 03.02.2015 23:06, rysiek wrote:
> On Tuesday, 3 February 2015, at 21:52:34, you wrote:
>
>
> True. But the state of affairs right now is that people are massively using
> Skype. So even not-so-well implemented free-software crypto peer-to-peer
> audio-video and IM app is a step up (as long as it's not being sold as an
> end-all-problems-heal-your-dog panacea).
>
> And I would not call Tox snakeoil mainly because snakeoil salesmen *ignore*
> criticism and *willfully and knowingly* sell bullshit; Tox is at least
> *trying* to get things working and properly implemented, as far as I can see.
>
> So there's a huge difference in (perceived? apparent? true?) intentions.
They are ignoring the criticism: they should be warning users about 
constant issues in endpoint security. Subrosa, Ricochet, TextSecure, 
Cryptocat and Threema have all included a threat model/warning; Tox 
should do so too. Notifying users about risks is what keeps them safe, 
not moving to slightly more secure products they assume are 
impenetrable. Consciously ignoring this on the developers' part equals 
selling "bullshit".

Let's assume they put the warning on the web page. Now every user who 
reads the security warning begins to think "OK, so given my contacts, 
reputation and opsec, my private key is compromised with probability P. 
Am I still going to write this, or do I upgrade my tools? Am I under 
constant monitoring? Do I need to regenerate my keys?" At the beginning 
of Citizenfour, Snowden warns Poitras about private keys, even though 
PGP encrypts the private key at rest. After that, Laura bought an 
airgapped machine and created a new PGP keypair.

>> I'm not sure it's convenient that users have to re-add their contacts
>> every time the DH signing key needs to be refreshed. It's sort of a good
>> thing that users are immediately using the public signing key (Tox ID), but
>> the issue is, while the Tox ID doesn't have to be secret, it must be
>> authentic: so users unaware of this can be subjected to MITM attack.
> Yes. But now we're discussing the proto and the implementation, so I assume we
> moved forward from the "is it snakeoil" question. At least I hope so.
>
Again, security is a process, not a product: unless the implementation 
of the crypto is secure and users know how to use it, a properly written 
Salsa20 implementation isn't going to do much good. Writing a good 
manual is the responsibility of the developer.
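
To make that concrete, here is a minimal sketch (using the PyNaCl 
bindings for illustration, not Tox's actual C code) of the NaCl 
crypto_box construction Tox builds on. The primitive itself is the 
solved part; what the manual has to teach is that the peer's public key 
(the Tox ID) must be obtained authentically, e.g. verified out of band, 
and that the secret key sits unprotected on the endpoint:

# Sketch with PyNaCl; Tox itself uses the C NaCl/libsodium crypto_box
# (Curve25519 + XSalsa20 + Poly1305).
from nacl.public import PrivateKey, Box

alice_sk = PrivateKey.generate()      # long-term secret key on the endpoint
bob_sk   = PrivateKey.generate()
alice_pk = alice_sk.public_key        # roughly what a Tox ID carries
bob_pk   = bob_sk.public_key          # must be authentic, or MITM is trivial

box = Box(alice_sk, bob_pk)           # the part the library already solves
ciphertext = box.encrypt(b"hello")    # random nonce generated and prepended

print(Box(bob_sk, alice_pk).decrypt(ciphertext))

If an attacker can substitute bob_pk in transit, the math above happily 
encrypts to the attacker; if they can read alice_sk off the disk, the 
cipher doesn't matter at all. Both are documentation and endpoint 
problems, not Salsa20 problems.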

>>>> *7. Neglects general sad state of host security *
>>> Well, yes, and my beef with Tox is also that the private keys do not
>>> require a passpharse to unlock. So that's a no-no in my book.
>> This only changes the type of attack: a keylogger has to be used alongside
>> the private key exfiltration tool.
> "Using seatbelts only means that the type of the car accident has to change:
>   faster and with flying debris."
>
> I'll take the seatbelts, though. I'm fine with making the attacker spend a bit
> more time and resources if they want to get me. There are no bulletproof
> solutions anyway.
Here's a Metasploit payload, Meterpreter. How hard do you think it is 
for me to automate the two Armitage GUI functionalities of browsing 
files and logging keystrokes once I buy a 0-day from Vupen with tax money?

https://4.bp.blogspot.com/-9SL6twrYlLg/UcKHmH8QkyI/AAAAAAAAALg/GogP6DN4KIs/s1600/35.JPG

Now think about Fox Acid: Metasploit with a budget. Then think of 
things like Quantuminsert that automate this process on a mass scale. 
Your seat belt is a bad analogy. There are no bulletproof solutions, but 
there are better ones.
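
For what the seatbelt would look like, here is a hedged sketch of 
passphrase-protecting a secret key at rest, written with PyNaCl's 
Argon2id KDF and SecretBox purely as an illustration of the idea, not 
anything Tox ships. As said above, it only changes the attack (a 
keylogger is needed on top of the exfiltration tool), but it does raise 
the cost:

# Hypothetical key-wrapping sketch, not Tox code: encrypt the secret key
# under a passphrase-derived key before writing it to disk.
import nacl.pwhash, nacl.secret, nacl.utils

SALT_LEN = nacl.pwhash.argon2id.SALTBYTES

def wrap_key(secret_key: bytes, passphrase: bytes) -> bytes:
    salt = nacl.utils.random(SALT_LEN)
    kek = nacl.pwhash.argon2id.kdf(nacl.secret.SecretBox.KEY_SIZE,
                                   passphrase, salt)
    # Store the salt next to the ciphertext so the key can be unwrapped.
    return salt + nacl.secret.SecretBox(kek).encrypt(secret_key)

def unwrap_key(blob: bytes, passphrase: bytes) -> bytes:
    salt, ct = blob[:SALT_LEN], blob[SALT_LEN:]
    kek = nacl.pwhash.argon2id.kdf(nacl.secret.SecretBox.KEY_SIZE,
                                   passphrase, salt)
    return nacl.secret.SecretBox(kek).decrypt(ct)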

>>> Still, this doesn't look like snakeoil; rather like a good idea with
>>> not-so- stellar execution, which *might* get better.
>>>
>>> Am I missing anything?
>> I would argue the current OTR/PGP/ZRTP implementation has limited
>> lifespan regardless of execution, given the fact intelligence community
>> is expanding end-point exploitation to mass surveillance levels:
>> methodology is changing, not the scale:
>> https://www.youtube.com/watch?v=FScSpFZjFf0&t=37m35s
> And the point here is... what exactly? "Don't use encryption, because it
> *might* be broken one day?"
No, the point here is: don't put the TCB (trusted computing base) on a 
computer that does networking. Why are you putting emphasis on the word 
'might' when Snowden says the NSA bypasses encryption *every day*?
https://www.youtube.com/watch?v=YxPKoXTKDc8#t=48m53s


> And that changes... what exactly? This affects *any and all* desktop-usable
> security solutions, so let's just assume that this is the baseline we have to
> work with and assess the solutions on their own merits, eh?
No, let's not assume. I have a small desk, but it still has room for 
the three laptops in a configuration that does not have this issue.

The community has already accepted host security as part of the snake 
oil check. What on earth is the check doing there if we should accept OS 
vulnerabilities as a "baseline"? If the product isn't going to address 
it, it had better at least not neglect it; Tox doesn't do even that.

I'm not trying to hijack this Tox discussion to say TFC is the solution. 
I'm trying to say it's pointless to create anything secure on a setup 
whose features are limited (/rigged) to begin with. That's why the 
smartphone is part of the snake oil checklist. The very first item says 
the product has to be FOSS: without a free OS, no encryption software 
stands a chance. Without endpoint security, it's the same. The community 
is already praising $1,300 Novena laptops; I'm saying we can achieve 
higher security with a set of three $200 COTS laptops and a few extra 
components.
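
To make the three-laptop point concrete, a rough sketch of the 
split-TCB idea (an illustration of the concept, not TFC's actual code; 
the serial device path and baud rate are assumptions): the machine 
holding the keys only ever transmits ciphertext over a hardware-enforced 
one-way serial link, so the networked middle machine never sees keys and 
has no return path into the TCB.

# Conceptual sketch of the transmit-only side; assumes pyserial and a
# data diode on /dev/ttyUSB0 that makes the link physically one-way.
import serial
from nacl.secret import SecretBox

def send_one_way(key: bytes, plaintext: bytes, port: str = "/dev/ttyUSB0"):
    ciphertext = SecretBox(key).encrypt(plaintext)  # keys never leave this box
    link = serial.Serial(port, baudrate=9600)       # TX wire only, RX unconnected
    link.write(bytes(ciphertext))                   # nothing is ever read back
    link.close()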

