Black ops: how HBGary wrote backdoors for the government

Eugen Leitl eugen at leitl.org
Sat Feb 19 11:02:21 PST 2011


http://arstechnica.com/tech-policy/news/2011/02/black-ops-how-hbgary-wrote-backdoors-and-rootkits-for-the-government.ars

Black ops: how HBGary wrote backdoors for the government

By Nate Anderson

On November 16, 2009, Greg Hoglund, a cofounder of computer security firm
HBGary, sent an e-mail to two colleagues. The message came with an
attachment, a Microsoft Word file called AL_QAEDA.doc, which had been further
compressed and password protected for safety. Its contents were dangerous.

The attached document, which is in English, begins: "LESSON SIXTEEN:
ASSASSINATIONS USING POISONS AND COLD STEEL (UK/BM-154 TRANSLATION)."

It purports to be an Al-Qaeda document on dispatching one's enemies with
knives (try "the area directly above the genitals"), with ropes ("Chokingb&
there is no other area besides the neck"), with blunt objects ("Top of the
stomach, with the end of the stick."), and with hands ("Poking the fingers
into one or both eyes and gouging them.").

But the poison recipes, for ricin and other assorted horrific bioweapons, are
the main draw. One, purposefully made from a specific combination of spoiled
food, requires "about two spoonfuls of fresh excrement." The document praises
the effectiveness of the resulting poison: "During the time of the destroyer,
Jamal Abdul Nasser, someone who was being severely tortured in prison (he had
no connection with Islam), ate some feces after losing sanity from the
severity of the torture. A few hours after he ate the feces, he was found
dead."

The purported Al-Qaeda document

According to Hoglund, the recipes came with a side dish, a specially crafted
piece of malware meant to infect Al-Qaeda computers. Is the US government in
the position of deploying the hacker's darkest tools: rootkits, computer
viruses, trojan horses, and the like? Of course it is, and Hoglund was
well-positioned to know just how common the practice had become. Indeed, he
and his company helped to develop these electronic weapons.

Thanks to a cache of HBGary e-mails leaked by the hacker collective
Anonymous, we have at least a small glimpse through a dirty window into the
process by which tax dollars enter the military-industrial complex and emerge
as malware.

Task B

In 2009, HBGary had partnered with the Advanced Information Systems group of
defense contractor General Dynamics to work on a project euphemistically
known as "Task B." The team had a simple mission: slip a piece of stealth
software onto a target laptop without the owner's knowledge.

HBGary white paper on exploiting software

They focused on ports, a laptop's interfaces to the world around it, including
the familiar USB port, the less-common PCMCIA Type II card slot, the smaller
ExpressCard slot, WiFi, and Firewire. No laptop would have all of these, but
most recent machines would have at least two.

The HBGary engineering team broke this list down into three categories. First
came the "direct access" ports that provided "uninhibited electronic direct
memory access." PCMCIA, ExpressCard, and Firewire all allowed external
devices (say, a custom piece of hardware delivered by a field operative) to
interact directly with the laptop with a minimum amount of fuss. The direct
memory access provided by the controllers for these ports means that devices
in them can write directly to the computer's memory without intervention from
the main CPU and with little restriction from the operating system. If you
want to overwrite key parts of the operating system to sneak in a bit of your
own code, this is the easiest way to go.

The second and third categories, ports that needed "trust relationships" or
relied on "buffer overflows," included USB and wireless networking. These
required more work to access, especially if one wanted to do so without
alerting a user; Windows in particular is notorious for the number of prompts
it throws when USB devices are inserted or removed. A cheerful note about
"Searching for device driver for NSA_keylogger_rootkit_tango" had to be
avoided.

So HBGary wanted to go the direct access route, characterizing it as the "low
hanging fruit" with the lowest risk. General Dynamics wanted HBGary to
investigate the USB route as well (the ports are more common, but an attack
has to trick the operating system into doing its bidding somehow, commonly
through a buffer overflow).

The team had two spy movie scenarios in which its work might be used,
scenarios drafted to help the team think through its approach:

1) Man leaves laptop locked while quickly going to the bathroom. A device can
then be inserted and then removed without touching the laptop itself except
at the target port. (i.e. one can't touch the mouse, keyboard, insert a CD,
etc.)

2) Woman shuts down her laptop and goes home. One then can insert a device
into the target port and assume she will not see it when she returns the next
day. One can then remove the device at a later time after she boots up the
machine. 

Why would the unnamed client for Task B (which a later e-mail makes clear was
for a government agency) want such a tool? Imagine you want access to the
computer network used in a foreign government ministry, or in a nuclear lab.
Such a facility can be tough to crack over the Internet; indeed, the most
secure facilities would have no such external access. And getting an agent
inside the facility to work mischief is very risky, if it's even possible at
all.

But say a scientist from the facility uses a memory stick to carry data home
at night, and that he plugs the memory stick into his laptop on occasion. You
can now get a piece of custom spyware into the facility by putting a copy on
the memory stick, if you can first get access to the laptop. So you tail the
scientist and follow him from his home one day to a local coffee shop. He
steps away to order another drink, to go to the bathroom, or to talk on his
cell phone, and the tail walks past his table and sticks an
all-but-undetectable bit of hardware in his laptop's ExpressCard slot.
Suddenly, you have a vector that points all the way from a local coffee shop
to the interior of a secure government facility.

The software exploit code actually delivered onto the laptop was not HBGary's
concern; it needed only to provide a route through the computer's front door.
But it had some constraints. First, the laptop owner should still be able to
use the port so as not to draw attention to the inserted hardware. This is
quite obviously tricky, but one could imagine a tiny ExpressCard device that
slid down into the slot but could in turn accept another ExpressCard device
on its exterior-facing side. This sort of parallel plugging might well go
unnoticed by a user with no reason to suspect it.

HBGary's computer infiltration code then had to avoid the computer's own
electronic defenses. The code should "not be detectable" by virus scanners or
operating system port scans, and it should clean up after itself to eliminate
all traces of entry.

Greg Hoglund was confident that he could deliver at least two laptop-access
techniques in less than a kilobyte of memory each. As the author of books
like Exploiting Software: How to Break Code, Rootkits: Subverting the Windows
Kernel, and Exploiting Online Games: Cheating Massively Distributed Systems,
he knew his way around the deepest recesses of Windows in particular.

Hoglund's special interest was in all-but-undetectable computer "rootkits,"
programs that provide privileged access to a computer's innermost workings
while cloaking themselves even from standard operating system functions. A
good rootkit can be almost impossible to remove from a running machine, if you
could even find it in the first place.

Just a demo

Some of this work was clearly for demonstration purposes, and much of it was
probably never deployed in the field. For instance, HBGary began $50,000 of
work for General Dynamics on "Task C" in June 2009, creating a piece of
malware that infiltrated Windows machines running Microsoft Outlook.

The target user would preview a specially crafted e-mail message in Outlook
that took advantage of an Outlook preview pane vulnerability to execute a bit
of code in the background. This code would install a kernel driver, one
operating at the lowest and most trusted level of the operating system, that
could send traffic over the computer's serial port. (The point of this
exercise was never spelled out, though the use of serial ports rather than
network ports suggests that cutting-edge desktop PCs were not the target.)


HBGary's expertise

Once installed, the malware could execute external commands, such as sending
specific files over the serial port, deleting files on the machine, or
causing the infamous Windows "blue screen of death." In addition, the code
should be able to pop open the computer's CD tray and blink the lights on its
attached keyboards, another reminder that Task C was, at this stage, merely
for a demo.

General Dynamics would presumably try to interest customers in the product,
but it's not clear from the e-mails at HBGary whether this was ever
successful. Even with unique access to the innermost workings of a security
firm, much remains opaque; the real conversations took place face-to-face or
on secure phone lines, not through e-mail, so the glimpses we have here are
fragmentary at best. This care taken to avoid sending sensitive information
via unencrypted e-mail stands in stark contrast with the careless approach to
security that enabled the hacks in the first place.

But that doesn't mean specific information is hard to come by, such as the
fact that rootkits can be purchased for $60,000.

Step right up!

Other tools were in use and were sought out by government agencies. An
internal HBGary e-mail from early 2010 asks, "What are the license costs for
HBGary rk [rootkit] platform if they want to use it on guardian for afisr
[Air Force Intelligence, Surveillance, and Reconnaissance]?"

The reply indicates that HBGary has several tools on offer. "Are you asking
about the rootkit for XP (kernel driver that hides in plain sight and is a
keylogger that exfiltrates data) or are you asking about 12 Monkeys? We've
sold licenses of the 1st one for $60k. We haven't set a price on 12 Monkeys,
but can."

The company had been developing rootkits for years. Indeed, it had even
developed a private Microsoft Word document outlining its basic rootkit
features, features which customers could have (confirming the e-mail listed
above) for $60,000.


Description of the basic rootkit platform

That money bought you the rootkit source code, which was undetectable by most
rootkit scanners or firewall products when it was tested against them in
2008. Only one product from Trend Micro noticed the rootkit installation, and
even that alert was probably not enough to warn a user. As the HBGary rootkit
document notes, "This was a low level alert. TrendMicro assaults the user
with so many of these alerts in every day use, therefore most users will
quickly learn to ignore or even turn off such alerts."

When installed in a target machine, the rootkit could record every keystroke
that a user typed, linking it up to a Web browser history. This made it easy
to see usernames, passwords, and other data being entered into websites; all
of this information could be silently "exfiltrated" right through even the
pickiest personal firewall.

But if a target watched its outgoing traffic and noted repeated contacts
with, say, a US Air Force server, suspicions might be aroused. The rootkit
could therefore connect instead to a "dead drop," a totally anonymous server
with no apparent connection to the agency using the rootkit, where the
target's keyboard activity could be retrieved at a later time.

But by 2009, the existing generic HBGary rootkit package was a bit long in
the tooth. Hoglund, the rootkit expert, apparently had much bigger plans for
a next-gen product called "12 monkeys."

12 Monkeys

The 12 Monkeys rootkit was also developed under contract for General Dynamics; as
one HBGary e-mail noted, the development work could interfere with Task B,
but "if we succeed, we stand to make a great deal of profit on this."

On April 14, 2009, Hoglund outlined his plans for the new super-rootkit for
Windows XP, which was "unique in that the rootkit is not associated with any
identifiable or enumerable object. This rootkit has no file, named data
structure, device driver, process, thread, or module associated with it."

How could Hoglund make such a claim? Security tools generally work by
scanning a computer for particular objects: pieces of data that the operating
system uses to keep track of processes, threads, network connections, and so
on. 12 Monkeys simply had nothing to find. "Since no object is associated
with the objectless rootkit, detection will be very difficult for a security
scanner," he wrote. In addition, the rootkit would encrypt itself to cloak
itself further, and hop around in the computer's memory to make it even
harder to find.
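
To see why "no object" matters, consider the cross-view check that many
anti-rootkit scanners rely on: gather the same information two different
ways and flag discrepancies, since a hidden process normally shows up in one
view but not the other. Below is a minimal illustrative sketch in Python
(assuming the third-party psutil package and a Linux host); it is not
HBGary's or any vendor's code, and real tools take the low-level view from
kernel memory rather than /proc. An objectless rootkit defeats this style of
check by leaving nothing for either view to enumerate.

    # Cross-view detection sketch: compare two views of the process list and
    # flag anything visible in one but missing from the other.
    # For illustration only; on Linux both views ultimately derive from /proc,
    # whereas a real scanner would read kernel structures directly.
    import os
    import psutil

    def pids_from_api():
        """Process IDs as reported by the standard process-listing API."""
        return set(psutil.pids())

    def pids_from_procfs():
        """Process IDs read directly from the /proc filesystem."""
        return {int(name) for name in os.listdir("/proc") if name.isdigit()}

    if __name__ == "__main__":
        api_view = pids_from_api()
        low_view = pids_from_procfs()
        # Anything in the low-level view but absent from the API view is
        # suspicious (allowing for processes that exited between snapshots).
        suspicious = low_view - api_view
        print(f"API view: {len(api_view)} pids, /proc view: {len(low_view)} pids")
        for pid in sorted(suspicious):
            print(f"possible hidden process: pid {pid}")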

As for getting the data off a target machine and back to the rootkit's buyer,
Hoglund had a clever idea: he disguised the outgoing traffic by sending it
only when other outbound Web traffic was being sent. Whenever a user sat down
at a compromised machine and started surfing the Web, their machine would
slip in some extra outgoing data "disguised as ad-clicks" that would contain
a log of all their keystrokes.

While the basic rootkit went for $60,000, HBGary hoped to sell 12 Monkeys for
much more: "around $240k."

0-day

The goal of this sort of work is always to create something undetectable, and
there's no better way to be undetectable than by taking advantage of a
security hole that no one else has ever found. Once vulnerabilities are
disclosed, vendors like Microsoft race to patch them, and they increasingly
push those patches to customers via the Internet. Among hackers, then, the
most prized exploits are "0-day" exploits: exploits for holes for which no
patch yet exists.

HBGary kept a stockpile of 0-day exploits. A slide from one of the company's
internal presentations showed that the company had 0-day exploits for which
no patch yet existed, but these 0-day exploits had not yet even been
published. No one knew about them.

The company had exploits "on the shelf" for Windows 2000, Flash, Java, and
more; because they were 0-day attacks, any computer around the world running
these pieces of software could be infiltrated.

One of the unpublished Windows 2000 exploits, for instance, can deliver a
"payload" of any size onto the target machine using a heap exploit. "The
payload has virtually no restrictions" on what it can do, a document notes,
because the exploit secures SYSTEM level access to the operating system, "the
highest user-mode operating system defined level" available.

These exploits were sold to customers. One email, with the subject "Juicy
Fruit," contains the following list of software:

VMware ESX and ESXi *

Win2K3 Terminal Services

Win2K3 MSRPC

Solaris 10 RPC

Adobe Flash *

Sun Java *

Win2k Professional & Server

XRK Rootkit and Keylogger *

Rootkit 2009 *

The e-mail talks only about "tools," not about 0-day exploits, though that
appears to be what was at issue; the list of software here matches HBGary's
own list of its 0-day exploits. And the asterisk beside some of the names
"means the tool has been sold to another customer on a non-exclusive basis
and can be sold again."


HBGary's 0-day exploits

References to Juicy Fruit abound in the leaked e-mails. My colleague Peter
Bright and I have spent days poring through the tens of thousands of
messages; we believe that "Juicy Fruit" is a generic name for a usable 0-day
exploit, and that interest in this Juicy Fruit was high.

"[Name] is interested in the Juicy Fruit you told him about yesterday," one
e-mail reads. "Next step is I need to give [name] a write up describing it."
That writeup includes the target software, the level of access gained, the
max payload size, and "what does the victim see or experience."

Aaron Barr, who in late 2009 was brought on board to launch the separate
company HBGary Federal (and who provoked this entire incident by trying to
unmask Anonymous), wrote in one e-mail, "We need to provide info on 12
monkeys and related JF [Juicy Fruit] asap," apparently in reference to
exploits that could be used to infect a system with 12 Monkeys.

HBGary also provided some Juicy Fruit to Xetron, a unit of the massive
defense contractor Northrop Grumman that specialized in, among other things,
"computer assault." Barr wanted to "provide Xetron with some JF code to be
used for demonstrations to their end customers," one e-mail noted. "Those
demonstrations could lead to JF sales or ongoing services work. There is
significant revenue potential doing testing of JF code acquired elsewhere or
adding features for mission specific uses."

As the deal was being worked out, HBGary worked up an agreement to "provide
object code and source code for this specific Juicy Fruit" to Xetron, though
Xetron could not sell the code without paying HBGary. The code included with
this agreement was an "Adobe Macromedia Flash Player Remote Access Tool," the
"HBGary Rootkit Keylogger Platform," and a "Software Integration Toolkit
Module."

The question of who might be interested in these tools largely remains an
unknown, though Barr did request information on HBGary's Juicy Fruit just
after asking for contacts at SOCOM, the US Special Operations Command.

But HBGary Federal had ideas that went far beyond government rootkits and
encompassed all facets of information warfare. Including, naturally,
cartoons. And Second Life.

Psyops

In mid-2010, HBGary Federal put together a PSYOP (psychological operations)
proposal for SOCOM, which had issued a general call for new tools and
techniques. In the document, the new HBGary Federal team talked up their past
experience as creators of "multiple products briefed to POTUS [President of
the United States], the NSC [National Security Council], and Congressional
Intelligence committees, as well as senior intelligence and military
leaders."

The document focused on cartoons and the Second Life virtual world. "HBGary
personnel have experience creating political cartoons that leverage current
events to seize the target audience's attention and propagate the desired
messages and themes," said the document, noting that security-cleared
cartoonists and 3D modelers had already been lined up to do the work if the
government wanted some help.


Cartoon example of Ahmadinejad with a puppet ayatollah 

The cartooning process "starts with gathering customer requirements such as
the target audience, high level messages and themes, intended publication
mediums ... Through brainstorming sessions, we develop concept ideas. Approved
concepts are rough sketched in pencil. Approved sketches are developed into a
detailed, color end product that is suitable for publishing in a variety of
mediums."

A sample cartoon, of Iranian President Ahmadinejad manipulating a puppet
Ayatollah, was helpfully included.

The document then went on to explain how the US government could use a
virtual world such as Second Life to propagate specific messages. HBGary
could localize the Second Life client, translating its menu options and
keyboard shortcuts into local dialects, and this localized client could
report "valuable usage metrics, enabling detailed measures of effects." If
you want to know whether your message is getting out, just look at the
statistics of how many people play the game and for how long.

As for the messages themselves, those would appear within the Second Life
world. "HBGary can develop an in-world advertising company, securing small
plots of virtual land in attractive locations, which can be used to promote
themes using billboards, autonomous virtual robots, audio, video, and 3D
presentations," said the document.

They could even make a little money while they were at it, by creating
"original marketable products to generate self-sustaining revenue within the
virtual space as well as promote targeted messaging."

We found no evidence that SOCOM adopted the proposal.

But HBGary Federal's real interest had become social media like Facebook and
Twitter, and how they could be used to explore and then penetrate secretive
networks. And that was exactly what the Air Force wanted to do.

Fake Facebook friends


In June 2010, the government was expressing real interest in social networks.
The Air Force issued a public request for "persona management software,"
which might sound boring until you realize that the government essentially
wanted the ability to have one agent run multiple social media accounts at
once.  It wanted 50 software licenses, each of which could support 10
personas, "replete with background, history, supporting details, and cyber
presences that are technically, culturally and geographically consistent."

The software would allow these 50 cyberwarriors to peer at their monitors all
day and manipulate these 10 accounts easily, all "without fear of being
discovered by sophisticated adversaries." The personas would appear to come
from all over the world, the better to infiltrate jihadist websites and
social networks, or perhaps to show up on Facebook groups and influence
public opinion in pro-US directions.

As the cyberwarriors worked away controlling their 10 personas, their
computers would helpfully provide "real-time local information" so that they
could play their roles convincingly.

In addition, the Air Force wanted a secure virtual private network that could
mask the IP addresses behind all of this persona traffic. Every day, each
user would get a random IP address to help hide "the existence of the
operation." The network would further mask this persona work by "traffic
mixing, blending the user's traffic with traffic from multitudes of users
from outside the organization. This traffic blending provides excellent cover
and powerful deniability."

This sort of work most interested HBGary Federal's Aaron Barr, who was
carving out a niche for himself as a social media expert. Throughout late
2010 and early 2011, he spent large chunks of his time attempting to use
Facebook, Twitter, and Internet chat to map the network of Exelon nuclear
plant workers in the US and to research the members of Anonymous. As money
for his company dried up and government contracts proved hard to come by,
Barr turned his social media ideas on pro-union forces, getting involved in a
now-controversial project with two other security firms.

But e-mails make clear that he mostly wanted to sell this sort of capability
to the government. "We have other customers, mostly on offense, that are
interested in Social Media for other things," he wrote in August 2010. "The
social media stuff seems like low hanging fruit."

How does one use social media and fake "personas" to do anything of value? An
e-mail from Barr on August 22 makes his thinking clear. Barr ponders "the
best way to go about establishing a persona to reach an objective (in this
case ft. belvoir/INSCOM/1st IO)."

The Army's Fort Belvoir, like any secretive institution, might be more easily
penetrated by pretending to be an old friend of a current employee. "Make
your profile swim in a large sea," Barr wrote. "Pick a big city, big high
school, big company. Work your way up and in. Recreate your history. Start by
friending high school people. In my case I am in the army so after you have
amassed enough friends from high school, then start friending military folks
outside of your location, something that matches the area your in, bootcamp,
etc. Lastly start to friend people from the base, but start low and work your
way up. So far so good."

Once the persona had this network of friends, "I will start doing things
tricky. Try to manipulate conversations, insert communication streams, etc,"
said Barr. This sort of social media targeting could also be used to send
your new "friend" documents or files that come complete with malware (such as
the Al-Qaeda poison document discussed above), or to direct them to
specially crafted websites designed to elicit some specific piece of
information: directed attacks known as "spear phishing."

But concerns arose about obtaining and using social media data, in part
because sites like Facebook restricted the "scraping" of their user data. An
employee from the link analysis firm Palantir wrote Barr at the end of
August, asking, "Is the idea that we'd want to ingest all of Facebook's data,
or just a targeted subset for a few users of interest?"

The more data that was grabbed from Facebook, the more chance a problem could
arise. The Palantir employee noted that a researcher had used similar tools
to violate Facebook's acceptable use policy on data scraping, "resulting in a
lawsuit when he crawled most of Facebook's social graph to build some
statistics. I'd be worried about doing the same. (I'd ask him for his
Facebook data, he's a fan of Palantir, but he's already deleted it.)"

Still, the potential usefulness of sites like Facebook was just too powerful
to ignore, acceptable use policy or not.

Feeling twitchy

While Barr fell increasingly in love with his social media sleuthing, Hoglund
still liked researching his rootkits. In September, the two teamed up for a
proposal to DARPA, the Defense Advanced Research Projects Agency that had
been instrumental in creating the Internet back in the 1960s.

DARPA didn't want incrementalism. It wanted breakthroughs (one of its most
recent projects is the "100-Year Starship Study"), and Barr and Hoglund
teamed up for a proposal to help the agency on its Cyber Insider Threat
(CINDER) program. CINDER was an expensive effort to find new ways to watch
employees with access to sensitive information and root out double agents or
disgruntled workers who might leak classified information.

So Barr and Hoglund drafted a plan to create something like a lie detector,
except that it would look for signs of "paranoia" instead.

"Like a lie detector detects physical changes in the body based on
sensitivities to specific questions, we believe there are physical changes in
the body that are represented in observable behavioral changes when
committing actions someone knows is wrong," said the proposal. "Our solution
is to develop a paranoia-meter to measure these observables."

The idea was to take an HBGary rootkit like 12 Monkeys and install it on user
machines in such a way that users could not remove it and might not even be
aware of its presence. The rootkit would log user keystrokes, of course, but
it would also take "as many behavioral measurements as possible" in order to
look for suspicious activity that might indicate wrongdoing.

What sort of measurements? The rootkit would monitor "keystrokes, mouse
movements, and visual cues through the system camera. We believe that during
particularly risky activities we will see more erratic mouse movements and
keystrokes as well as physical observations such as surveying surroundings,
shifting more frequently, etc."

The rootkit would also keep an eye on what files were being accessed, what
e-mails were being written, and what instant messages were being sent. If
necessary, the software could record a video of the user's computer screen
activity and send all this information to a central monitoring office. There,
software would try to pick out employees exhibiting signs of paranoia, who
could then be scrutinized more closely.

Huge and obvious challenges presented themselves. As the proposal noted:

Detecting insider threat actions is highly challenging and will require a
sophisticated monitoring, baselining, analysis, and alerting capability.
Human actions and organizational operations are complex. You might think you
can just look for people that are trying to gain access to information
outside of their program area of expertise. Yet there are legitimate reasons
for accessing this information. In many cases the activity you might call
suspicious can also be legitimate. Some people are more or less inquisitive
and will have different levels of activity in accessing information outside
their specific organization. Some of the behaviors on systems vary widely
depending on function. Software developer behavior will be very different
than an HR person or senior manager. All of these factors need to be taken
into account when developing detection capabilities for suspicious activity.
We cannot focus on just [whether] a particular action is potentially
suspicious. Instead we must quantify the legitimate reasons for the activity
and whether this person has a baseline, position, attributes, and history to
support the activity.
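
The "baselining" and "alerting" the proposal leans on is, at bottom,
ordinary per-user statistics: compare today's activity to that user's own
history rather than to an absolute threshold. A minimal sketch in Python
follows; the activity counts and the three-sigma threshold are invented for
illustration and are not taken from the proposal.

    # Flag a user's activity only relative to that user's own history,
    # since absolute counts mean different things for different roles.
    from statistics import mean, stdev

    def is_anomalous(history, today, threshold=3.0):
        """True if today's count sits more than `threshold` standard
        deviations above this user's historical mean."""
        if len(history) < 2:
            return False          # not enough history to form a baseline
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return today > mu
        return (today - mu) / sigma > threshold

    # A developer who routinely touches many files is not flagged, while the
    # same absolute count stands out for a user with a much lower baseline.
    print(is_anomalous([120, 130, 110, 125], 140))   # False
    print(is_anomalous([3, 5, 4, 6], 140))           # True

That per-user baseline is the proposal's own point: a count that is routine
for a software developer would be flagged for someone whose history is much
lower.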

DARPA apparently chose not to fund the plan.

Grey areas

The ideas got ever more grandiose. Analyzing malware, HBGary's main focus,
wasn't enough to keep up with the hackers; Hoglund had a plan to get a leg up
on the competition by getting even closer to malware authors. He floated an
idea to sniff Russian GSM cell phone signals in order to eavesdrop on
hackers' voice calls and text messages.

"GSM is easily sniffed," he wrote to Barr. "There is a SHIELD system for this
that not only intercepts GSM 5.1 but can also track the exact physical
location of a phone. Just to see what's on the market, check [redacted] ...
these have to be purchased overseas obviously."

The note concluded: "Home alone on Sunday, so I just sit here and sharpen the
knife."

Barr, always enthusiastic for these kinds of ideas, loved this one. He wanted
to map out everything that would be required for such an operation, including
"personas, sink holes, honey nets, soft and hard assetsb& We would want at
least one burn persona. We would want to sketch out a script to meet specific
objectives.

And, he noted, "We will likely ride in some grey areas."

Back to basics

In January 2011, Barr had moved on to his research into Anonymous, research
that would eventually do his company in. Over at HBGary, Hoglund continued
his pursuit of next-gen rootkits. He had hit on a new approach that he called
"Magenta."

This would be a "new breed of Windows-based rootkit," said a Magenta planning
document, one that HBGary called a "multi-context rootkit."



The Magenta software would be written in low-level assembly language, one
step up from the ones and zeroes of the binary code with which computers do
their calculating. It would inject itself into the Windows kernel, and then
inject itself further into an active process; only from there would the main
body of the rootkit execute.

Magenta would also inject itself routinely into different processes, jumping
around inside the computer's memory to avoid detection. Its
command-and-control instructions, telling the rootkit exactly what to do and
where to send the information, wouldn't come from some remote Internet server
but from the host computer's own memory, where the control instructions had
been separately injected.

"This is ideal because itbs trivial to remotely seed C&C messages into any
networked Windows host," noted Hoglund, "even if the host in question has
full Windows firewalling enabled."

Nothing like Magenta existed (not publicly, at least), and Hoglund was sure
that he could squeeze the rootkit code into less than 4KB of memory and make
it "almost impossible to remove from a live running system." Once running,
all of the Magenta files on disk could be deleted. Even the best anti-rootkit
tools, those that monitored physical memory for signs of such activity,
"would only be of limited use since by the time the responder tried to verify
his results Magenta will have already moved to a new location & context."

Hoglund wanted to build Magenta in two parts: first, a prototype for Windows
XP with Service Pack 3, an old operating system but still widely installed.
Second, if the prototype generated interest, HBGary could port the rootkit
"to all current flavors of Microsoft Windows."

Shortly thereafter, Anonymous broke into HBGary Federal's website, cracked
Barr's hashed password using rainbow tables, and found themselves in a
curious position; Barr was also the administrator for the entire e-mail
system, so they were able to grab e-mail from multiple accounts, including
Hoglund's.

A world awash in rootkits

The leaked e-mails provide a tantalizing glimpse of life behind the security
curtain. HBGary and HBGary Federal were small players in this space; indeed,
HBGary appears to have made much of its cash with more traditional projects,
like selling anti-malware defense tools to corporations and scanning their
networks for signs of infection.

If rootkits, paranoia monitors, cartoons, and fake Facebook personas were
being proposed and developed here, one can only imagine the sorts of
classified projects underway throughout the entire defense and security
industry.

Whether these programs are good or bad depends upon how they are used. Just
as Hoglund's rootkit expertise meant that he could both author rootkits and
detect them, 0-day exploits and rootkits in government hands can be turned to
many uses. The FBI has had malware like CIPAV (the Computer and Internet
Protocol Address Verifier) for several years, and it's clear from the HBGary
e-mail leak that the military is in wide possession of rootkits and other
malware of its own. The Stuxnet virus, widely believed to have at least
damaged Iranian nuclear centrifuge operations, is thought to have originated
with the US or Israeli governments, for instance.

But the e-mails also remind us how much of this work is carried out privately
and beyond the control of government agencies. We found no evidence that
HBGary sold malware to nongovernment entities intent on hacking, though the
company did have plans to repurpose its DARPA rootkit idea for corporate
surveillance work. ("HBGary plans to transition technology into commercial
products," it told DARPA.)

And another document, listing HBGary's work over the last few years, included
this entry: "HBGary had multiple contracts with a consumer software company
to add stealth capability to their host agent."

The actions of HBGary Federal's Aaron Barr also serve as a good reminder
that, when they're searching for work, private security companies are more
than happy to switch from military to corporate clients, and they bring some
of the same tools to bear.

When asked to investigate pro-union websites and WikiLeaks, Barr turned
immediately to his social media toolkit and was ready to deploy personas,
Facebook scraping, link analysis, and fake websites; he also suggested
computer attacks on WikiLeaks infrastructure and that pressure be brought
upon journalists like Glenn Greenwald.

His compatriots at Palantir and Berico showed, in their many e-mails, few if
any qualms about turning their national security techniques upon private
dissenting voices. Barr's ideas showed up in Palantir-branded PowerPoints and
Berico-branded "scope of work" documents. "Reconnaissance cells" were
proposed, network attacks were acceptable, "target dossiers" on "adversaries"
would be compiled, and "complex information campaigns" involving fake
personas were on the table.

Critics like Glenn Greenwald contend that this nexus of private and public
security power is a dangerous mix. "The real issue highlighted by this
episode is just how lawless and unrestrained is the unified axis of
government and corporate power," he wrote last week.

Especially (though by no means only) in the worlds of the Surveillance and
National Security State, the powers of the state have become largely
privatized. There is very little separation between government power and
corporate power. Those who wield the latter intrinsically wield the former.

The revolving door between the highest levels of government and corporate
offices rotates so fast and continuously that it has basically flown off its
track and no longer provides even the minimal barrier it once did. It's not
merely that corporate power is unrestrained; it's worse than that:
corporations actively exploit the power of the state to further entrench and
enhance their power.

Even if you don't share this view, the e-mails provide a fascinating glimpse
into the origins of government-controlled malware. Given the number of
rootkits apparently being developed for government use, one wonders just how
many machines around the globe could respond to orders from the US military.
Or the Chinese military. Or the Russian military.

While hackers get most of the attention for their rootkits and botnets and
malware, state actors use the same tools to play a different game, the Great
Game, and it could be coming soon to a computer near you.





