[liberationtech] Foxacid payload

Jonathan Wilkes jancsika at yahoo.com
Thu Jul 17 14:10:22 PDT 2014


On 07/17/2014 04:11 PM, coderman wrote:
> On Thu, Jul 17, 2014 at 12:41 PM, Andy Isaacson <adi at hexapodia.org> wrote:
>> ...
>>> this is exactly why some who have received these payloads are sitting
>>> on them, rather than disclosing.
>> Hmmm, that seems pretty antisocial and shortsighted.  While the pool of
>> bugs is large, it is finite.
> consider, having recv a payload:
> - who do i trust to evaluate / dissect?

This is the reason I asked on this list rather than attempting, for 
example, to capture the payload myself.  If competent security 
specialists do the baiting and the evaluating/dissection, it's a non-issue.

> - would it be better to fix a class of bugs than a specific instance?

Well, if you mitigate privately, how do you go about luring additional 
0days out of FOXACID's queue?  And even then, how do you know you've 
reached the end of the queue, or gotten deep enough into it that 
releasing bug reports won't burn potentially valuable information?
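
To make "luring" concrete, here is a minimal sketch of what I have in 
mind, with two loud caveats: the landing URL is a placeholder, and the 
premise that the server tiers its exploits by the apparent patch level 
of the visiting client is an inference from the published documents, 
not something I can confirm.

    # Hypothetical sketch: revisit a suspected exploit server with
    # varied client fingerprints and archive each response for offline
    # dissection.  SUSPECT_URL is a placeholder, not a real endpoint.
    import requests

    SUSPECT_URL = "http://example.invalid/landing"

    USER_AGENTS = [
        # deliberately stale fingerprint (an easy-looking target)
        "Mozilla/5.0 (Windows NT 5.1; rv:17.0) Gecko/20100101 Firefox/17.0",
        # current, fully patched fingerprint
        "Mozilla/5.0 (Windows NT 6.1; rv:30.0) Gecko/20100101 Firefox/30.0",
    ]

    for i, ua in enumerate(USER_AGENTS):
        resp = requests.get(SUSPECT_URL, headers={"User-Agent": ua},
                            timeout=10)
        fname = "payload_%d.bin" % i
        with open(fname, "wb") as f:
            f.write(resp.content)  # saved for later dissection
        print("%s -> %d bytes (%s)" % (ua, len(resp.content), fname))

If the archived responses differ across fingerprints, that is weak 
evidence of tiering; it still tells you nothing about how deep the 
queue goes, which is exactly my point.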

But if you mean instead that effort should be focused on redesigning 
browser security to address the broader class of exploits that FOXACID 
delivers, how does keeping a payload private/unpatched further that goal?

> - what information would be lost if burned?
>
> looking over the JTRIG and TAO catalogs you see how pervasive
> vulnerabilities are in all our systems.  there is a process at work
> churning out weaponized exploits.

That's a great reason to focus more effort on browser security (and 
security in general), on finding sustainable and diverse models of 
funding for such efforts, etc.  But I fail to see how that is relevant 
to fixing a bug related to a specific payload.

However, your comments do lead me to believe some private mitigation 
must be happening as we speak.  There must be some valuable information 
being gleaned by not releasing details about the payloads.  It's just 
difficult to surmise what that would be, given that the attackers know 
the cat is out of the bag and can probably guess that people are 
privately inspecting the payloads they are sending out.
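
If I had to guess at the form that gleaning takes (and this is purely a 
guess; see also the "honey tokens" point in the recap below), it would 
be signalling: seed bait material with a unique, never-published URL 
and watch for it to be fetched.  A minimal sketch, with every path and 
port made up for illustration:

    # Hypothetical honeytoken listener: any request to the canary path
    # signals that something opened or acted on the bait material.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from datetime import datetime

    # Unique per bait document; never published or linked anywhere.
    CANARY_PATH = "/c/7f3a9d2e"

    class CanaryHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == CANARY_PATH:
                with open("canary.log", "a") as log:
                    log.write("%s hit from %s UA=%r\n" % (
                        datetime.utcnow().isoformat(),
                        self.client_address[0],
                        self.headers.get("User-Agent")))
            self.send_response(204)  # return nothing either way
            self.end_headers()

    HTTPServer(("0.0.0.0", 8080), CanaryHandler).serve_forever()

The value of a token like that evaporates the moment the other side 
suspects it exists, which is consistent with people keeping quiet.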

-Jonathan

>
> the only exception might be a heartbleed type bug, where the
> vulnerability is pervasive and the risk high. (this is also the type
> of bug least likely to be used capriciously against targets. you would
> need to be high value to get a high value exploit sent your direction.)
>
>
>
>> Get bugs fixed and get developers to write
>> fewer bugs going forward, and we'll rapidly deplete the pool of 0day and
>> drive up the cost of FOXACID style deployments.
> i was young once, too.  ;)
>
> in all seriousness, we don't know how to build systems resistant to
> the attacks of collaborating nation states.  you can dodge some
> attacks, some of the time, at best.
> research continues...
>
>
>
>> Forcing deployments to move to more interesting bugs will also give
>> insight into IAs' exploit sourcing methodologies.
> this is absolutely true and useful,
>   and does not require making specific exploits public.
>
>
>
>> Hasn't someone already created an open "FOXACID observatory" tracking
>> potential deployments of this and similar APT exploit deployments?
> various security companies or research groups drop docs for publicity
> or other self interest.  the work some NGOs are doing to protect human
> rights workers in foreign lands may touch on it now and then. (burning
> HackingTeam's tech, for example)
>
>
>
>>> it is more useful to mitigate privately, and observe how/when an
>>> exploit is used, than burn it publicly for zero effective security
>>> improvement.
>> That seems unlikely to be correct even in the medium term.
> let's have a chat at a conference over strong drinks and plausible
> deniability...
>
>
>
> to recap:
>
> - if you want to thwart FOXACID type attacks there are ways to do it
> without knowing specific payloads. (architectural and broad
> techniques, not fingerprints on binaries or call graphs)
>
> - burning specific payloads does nothing to protect against the
> threat, as they are trivially replaced. (this is a continuous process
> with enviable resources)
>
> - all options for disseminating caught exploits involve trust in third
> parties which you may not give. (sorry, i'm a skeptic by nature and
> nurture :)
>
> - private use for signalling/discovery (see honey tokens, etc.) is
> useful as a testing / integrity check in some ways, given no other
> more useful options.
