Leaked NSA documents indicate the agency uses Google's advertising cookies to track targets for offensive hacking.

Mathew J. Schwartz, Contributor

December 11, 2013

Image credit: Flickr user ssoosay (http://www.flickr.com/photos/ssoosay/5762345557/).

Leaked National Security Agency slides reveal that the intelligence agency has been using Google's tracking cookies to identify targets for offensive NSA hacking operations.

So said a Washington Post report, published Tuesday, which is based on documents leaked by NSA whistleblower Edward Snowden.

According to the documents, both the NSA and its British counterpart, GCHQ, have been using cookies -- which are tracking files placed on users' systems by websites and advertising networks -- to help them track web users that they've previously seen. "The intelligence agencies have found particular use for a part of a Google-specific tracking mechanism known as the 'PREF' cookie," according to the Post report. "These cookies typically don't contain personal information, such as someone's name or e-mail address, but they do contain numeric codes that enable websites to uniquely identify a person's browser."
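
To make the mechanism concrete, here is a minimal Python sketch of how an ad server hands a browser a long-lived cookie carrying a unique numeric code and then recognizes that code on later requests. The cookie name, domain, and ID format below are made up for illustration; Google's actual PREF layout differs, but the principle is the same.

```python
# Hypothetical tracking cookie: a random numeric code the browser replays on every visit.
import secrets
from http.cookies import SimpleCookie

def issue_tracking_cookie() -> SimpleCookie:
    """Build a Set-Cookie header that assigns a random unique ID to this browser."""
    cookie = SimpleCookie()
    cookie["PREF"] = f"{secrets.randbits(64):016x}"       # the unique numeric code
    cookie["PREF"]["domain"] = ".ads.example"             # hypothetical ad-network domain
    cookie["PREF"]["path"] = "/"
    cookie["PREF"]["max-age"] = str(2 * 365 * 24 * 3600)  # long-lived: roughly two years
    return cookie

def read_tracking_id(cookie_header: str) -> str:
    """Recover the identifier the browser echoes back on every later request."""
    jar = SimpleCookie()
    jar.load(cookie_header)
    return jar["PREF"].value

issued = issue_tracking_cookie()
print(issued.output())                        # the Set-Cookie response header
replayed = f"PREF={issued['PREF'].value}"     # what the browser sends on later visits
print("Same browser seen again:", read_tracking_id(replayed))
```

Because that value is sent in the clear on every request to the ad network, anyone who can watch the traffic -- not just the advertiser -- can use it to recognize the same browser again.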

[Do you always feel like someone is watching you? See 6 Tips To Secure Webcams, Stop Keyloggers.]

According to the NSA documents, these cookies are used to "enable remote exploitation" of targets. While those remote-exploitation techniques aren't detailed, previously documented NSA slides have suggested that the agency has compromised 50,000 PCs to date with malware, installed in part by exploiting known or zero-day vulnerabilities on targeted systems.

For many Internet users, advertising networks were already something of a Faustian bargain: Being tracked by advertisers gives users access to content for which they might otherwise have to pay. In other words, tracking fuels advertising, which then pays many website operators' salaries.

But the NSA revelations add a new twist: By allowing advertising networks to track them, people are giving any intelligence agency -- including not just the NSA and GCHQ, but theoretically anyone who's able to grab those cookies -- the ability to track their PCs, and thus make it easier to trace data flowing to or from their systems.

"On a macro level, 'we need to track everyone everywhere for advertising' translates into 'the government being able to track everyone everywhere,'" Chris Hoofnagle, a lecturer in residence at the University of California, Berkeley, law school," told the Washington Post. "It's hard to avoid."

One potential fix, however, would be simple: Stop tracking users or employing unique IDs. "The easiest way to protect users against this threat is to refrain from tracking," said Ed Felten, a computer scientist at Princeton University, in a blog post. "But if sites are going to track users, this can be done in ways that avoid surveillance."

For example, sites could switch to using HTTPS for all cookie-related activities. "This ensures that the unique ID that is transmitted is protected by encryption in a way that doesn't leak to an eavesdropper any information about which connections are to the same user," Felten said. "Implementing HTTPS on a larger site is not as easy as it should be, but it seems to be the price of surveillance-proof tracking."
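
Under the assumption that the site terminates TLS itself, the change Felten describes amounts to serving pages over HTTPS and marking the identifier cookie so it is never sent over plain HTTP. A minimal sketch using Python's standard library, with a hypothetical cookie name and value:

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["uid"] = "4f2a9c0d13b7e851"              # hypothetical unique identifier
cookie["uid"]["secure"] = True                  # only ever transmitted over HTTPS
cookie["uid"]["httponly"] = True                # not readable by page scripts
cookie["uid"]["path"] = "/"
cookie["uid"]["max-age"] = str(365 * 24 * 3600)

# Prints a Set-Cookie header carrying the Secure and HttpOnly attributes; the site
# must also serve its pages and ad calls over HTTPS for the ID to stay encrypted in transit.
print(cookie.output())
```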

Another fix would be to store all unique identifiers on the client, rather than continually broadcasting them to advertisers' tracking servers. Required information could then be disclosed -- in encrypted form -- to advertising networks, but only when a user allows it. "This requires more aggressive reengineering of an ad or analytics service, but it provides additional benefits to the user in terms of privacy and transparency," he said.
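
Here is a rough sketch of that idea: the identifier stays on the client and is never attached to ordinary requests, and it is revealed -- encrypted -- only when the user opts in. The class and method names are invented, and Fernet from the third-party cryptography package merely stands in for whatever encryption scheme a real service would use.

```python
import secrets
from typing import Optional

from cryptography.fernet import Fernet  # third-party: pip install cryptography

class ClientHeldIdentifier:
    """A unique ID that lives only on the client until the user consents to share it."""

    def __init__(self) -> None:
        self._uid = f"{secrets.randbits(64):016x}"   # never leaves the client by default

    def ordinary_request_headers(self) -> dict:
        """Day-to-day requests carry no identifier for an eavesdropper to collect."""
        return {"User-Agent": "example-browser/1.0"}

    def disclose(self, service_key: bytes, user_consents: bool) -> Optional[bytes]:
        """Reveal the ID in encrypted form, and only with explicit consent."""
        if not user_consents:
            return None
        return Fernet(service_key).encrypt(self._uid.encode())

service_key = Fernet.generate_key()
client = ClientHeldIdentifier()
print(client.ordinary_request_headers())              # nothing identifying on the wire
blob = client.disclose(service_key, user_consents=True)
print(Fernet(service_key).decrypt(blob).decode())      # the service's view, after consent
```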

But what won't work, Felten cautioned, is attempting to encrypt current tracking cookies, since every encrypted cookie could still serve as a unique identifier. Likewise, switching to an alternate tracking mechanism, such as browser fingerprinting techniques, would still enable the NSA to grab unique IDs for systems.
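
A quick illustration of why: transforming a stable cookie value with a fixed function (SHA-256 hashing stands in here for deterministic encryption, and the ID is made up) simply yields another stable value, so the same browser remains recognizable to anyone who observes its traffic.

```python
import hashlib

tracking_id = "4f2a9c0d13b7e851"                          # hypothetical original cookie value
monday = hashlib.sha256(tracking_id.encode()).hexdigest()
friday = hashlib.sha256(tracking_id.encode()).hexdigest()
print(monday == friday)   # True: still one unique, persistent identifier per browser
```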


Finally, the Do Not Track (DNT) standard might help users block some types of cookies that the NSA can grab, since it would require advertisers to honor the preference of any browser with the DNT flag enabled. To date, however, the online advertising industry has managed to scuttle that effort. Furthermore, while DNT would block third-party tracking cookies, it wouldn't prohibit sites such as Google from using first-party cookies for their own customers or users.
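
The DNT mechanism itself is trivial: the browser adds a "DNT: 1" request header, and a cooperating server is supposed to skip setting its tracking cookie when it sees the flag. A hypothetical sketch of a server honoring it:

```python
def build_response_headers(request_headers: dict) -> dict:
    """Set a tracking cookie only when the browser has not sent a Do Not Track signal."""
    headers = {"Content-Type": "text/html"}
    if request_headers.get("DNT") != "1":
        # No opt-out signal, so tag the browser with a (hypothetical) unique ID.
        headers["Set-Cookie"] = "uid=4f2a9c0d13b7e851; Max-Age=31536000; Path=/"
    return headers

print(build_response_headers({"DNT": "1"}))   # no tracking cookie set
print(build_response_headers({}))             # cookie set as usual
```

Nothing forces a server to honor the flag, however, which is why the stalled standard offers only partial protection.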

Just how is the NSA obtaining Google cookies? That's not clear, although according to the leaked documents, the NSA's Special Source Operations (SSO) division has been collecting them. That division is responsible both for tapping Internet backbones and for working with businesses to obtain information of interest to the agency. One of the SSO's tools is a Foreign Intelligence Surveillance Act order, which legally requires a business to comply with the NSA's request and to not reveal that request.

However this information is obtained, according to a leaked slide, the SSO then shares "logins, cookies, and GooglePREFID" with the agency's Tailored Access Operations, which handles offensive hacking operations. Some information is also shared with Britain's GCHQ intelligence agency.

The NSA has declined to directly address these cookie-grabbing revelations. "As we've said before, NSA, within its lawful mission to collect foreign intelligence to protect the United States, uses intelligence tools to understand the intent of foreign adversaries and prevent them from bringing harm to innocent Americans," according to a statement released by the agency.

But this isn't the first sign that the NSA is interested in third-party cookies. A leaked NSA presentation -- titled "Tor Stinks" (published by the Guardian in October) -- revealed that the agency was using cookies for Google's DoubleClick.net third-party advertising service to help it identify users who were tapping Tor in an attempt to anonymize their communications and web browsing habits.


Mathew Schwartz is a freelance writer, editor, and photographer, as well as the InformationWeek information security reporter.

