Date: 13 Jan 2019 | Author: Graham Penrose

The Threat to Encrypted Communications from the “Ghost” & “Virtual Crocodile Clips”

After seeing the outrage that resulted from security agencies’ calls to implement mandated encryption backdoors, GCHQ have had a roundtable and come up with what they consider to be a more palatable and reasonable proposal, which they are calling “exceptional access”.

The new proposal is based on six “principles”, and they are:

  1. Privacy and security protections are critical to public confidence. Therefore, we will only seek exceptional access to data where there’s a legitimate need, that access is the least intrusive way of proceeding and there is appropriate legal authorisation.
  2. Investigative tradecraft has to evolve with technology.
  3. Even when we have a legitimate need, we can’t expect 100 percent access 100 percent of the time.
  4. Targeted exceptional access capabilities should not give governments unfettered access to user data.
  5. Any exceptional access solution should not fundamentally change the trust relationship between a service provider and its users.
  6. Transparency is essential.

The authors of these proposals are Ian Levy, technical director of the National Cyber Security Centre (a part of GCHQ), and Crispin Robinson, technical director for cryptanalysis at GCHQ.

Judge Them on their Track Record

What is difficult to accept from these people is the gap between their proposals and their track record, and the contradiction in the statement “Transparency is essential”. Never mind that contradiction: the very essence of the Investigatory Powers Act in the UK and the FISA / FISC apparatus in the USA makes transparency, as they call it, impossible to audit.

Also, do we trust the providers of E2E products? Has everyone forgotten how corporations complied silently, and in many cases willingly and without coercion, with the NSA PRISM program? And for those of you who are not aware, note that the raw data from the PRISM collection effort was sent by the NSA to GCHQ for processing under the XKeyscore program, because UK law was more forgiving of privacy invasions and mass surveillance than US law, and that was BEFORE the Investigatory Powers Act of 2016.

For those who need a refresher on the list of collaborators with the largest single global mass surveillance program ever exposed, they included Microsoft, Yahoo, Google, Facebook, PalTalk, YouTube, AOL, Skype and Apple.

All did so COVERTLY until they were exposed by Snowden. That is not to mention the telecommunications companies and carriers who also covertly assisted the mass surveillance efforts of the NSA and their GCHQ partners.

How E2E Works

In order to understand the implications of what Levy and Robinson are proposing, it is important to understand the way E2E conversations work. In simple terms, E2E works as follows:

In any E2E system each party encrypts messages, audio or video data in a protected tunnel from one device to the other. In doing so the users do not have to trust any third-party infrastructure, whether that is a telecommunications provider, the physical telephone lines, or routing servers, all the way to the undersea cables that make up the backbone of the global communications infrastructure. What users do have to trust is that the app providers have not embedded a backdoor or exposed their communications by retaining hidden access to encryption keys.

Many exchanges of data in the E2E environment are done in the context of one-to-one communications, but of course there is always the option of chatting in groups or conference calling. Each party is still encrypting the data they send to the other participants.

The encryption algorithms and the way keys are handled vary from implementation to implementation, but the basic concept is universal.
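To make that universal concept concrete, here is a minimal sketch in Python using the widely available cryptography package: each device holds its own key pair, the two sides agree on a shared secret, and the message is encrypted on-device so that any intermediary only ever sees ciphertext. This is an illustration of the principle only, not how any particular app such as WhatsApp or Signal is actually built; real systems layer ratcheting, authentication and key verification on top of this.

```python
# Minimal sketch of the E2E principle (illustration only, not a real protocol).
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
import os

# Each device generates its own key pair; only the public half ever leaves it.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Both sides compute the same shared secret from their own private key and the
# other party's public key; no server or carrier ever sees the secret itself.
alice_shared = alice_priv.exchange(bob_priv.public_key())
bob_shared = bob_priv.exchange(alice_priv.public_key())
assert alice_shared == bob_shared

# Derive a symmetric message key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"e2e-demo").derive(alice_shared)

# Alice encrypts on her device; only Bob's device can decrypt.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"hello Bob", None)
print(ChaCha20Poly1305(key).decrypt(nonce, ciphertext, None))  # b'hello Bob'
```

The thing to notice is that the security of the whole exchange rests on each party having the other’s genuine public key, which is exactly the part the next section is about.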

“… one of the most challenging problems in encrypted messaging systems is getting the key you need to actually perform the encryption. This problem, which is generally known as key distribution, is an age-old concern in the field of computer security. There are many ways for it to go wrong. In the olden days, we used to ask users to manage and exchange their own keys, and then select which users they wanted to encrypt to. This was terrible and everyone hated it.

Modern E2E systems have become popular largely because they hide all of this detail from their users. This comes at the cost of some extra provider-operated infrastructure. An “identity service”, which is a cluster of servers running in a data center. These servers perform many tasks, but most notably: they act as a directory for looking up the encryption key of the person you’re talking to.

If that service misfires and gives you the wrong key, the best ciphers in the world won’t help you. You’ll just be encrypting to the wrong person. These identity services do more than look up keys. In at least some group messaging systems like WhatsApp and iMessage, they also control the membership of group conversations.

In poorly-designed systems, the server can add and remove users from a group conversation at will, even if none of the participants have requested this. It’s as though you’re having a conversation in a very private room — but the door is unlocked and the building manager controls who can enter and join you.

Most E2E systems have basic countermeasures against bad behavior by the identity service. For example, client applications will typically alert you when a new user joins your group chat, or when someone adds a new device to your iMessage account. Similarly, both WhatsApp and Signal expose “safety numbers” that allow participants to verify that they received the right cryptographic keys, which offers a check against dishonest providers. But these countermeasures are not perfect, and not every service offers them.” (Extract from Matthew Green, Cryptographer and Professor at Johns Hopkins University)
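As a rough illustration of the safety-number idea Green describes, the hypothetical Python sketch below derives a comparable fingerprint from the two public keys each client believes it is using. The format is invented for this example and is far simpler than Signal’s real fingerprint scheme, but it shows why a key silently substituted by a dishonest identity service produces numbers that no longer match when the two users compare them out of band.

```python
# Toy "safety number": both clients hash the public keys they think are in use
# and the users compare the resulting digits over another channel.
import hashlib

def safety_number(pub_key_a: bytes, pub_key_b: bytes) -> str:
    # Sort the keys so both sides compute the identical value.
    material = b"".join(sorted([pub_key_a, pub_key_b]))
    digest = hashlib.sha256(material).hexdigest()
    # Render as groups of digits, roughly like the numbers users compare.
    digits = str(int(digest, 16))[:30]
    return " ".join(digits[i:i + 5] for i in range(0, 30, 5))

# Suppose the identity service hands Alice a substitute key in place of Bob's.
alice_view = safety_number(b"pk-alice", b"pk-ghost")  # Alice got the wrong key
bob_view = safety_number(b"pk-alice", b"pk-bob")      # Bob has the honest view
print(alice_view == bob_view)  # False -- the substitution is detectable
```

The countermeasure only works if the clients actually display these values and the users actually compare them, which is why proposals that quietly suppress such notifications matter so much.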

How GCHQ Proposes to Backdoor E2E in an “Ethical” and “Responsible” Fashion

Essentially what the authors of this proposal on behalf of GCHQ want to do is this:

“They’re talking about adding a “feature” that would require the user’s device to selectively lie about whether it’s even employing end-to-end encryption, or whether it’s leaking the conversation content to a third (secret) party. Is the security code displayed by your device a mathematical representation of the two keys involved, or is it a straight-up lie?

Furthermore, what’s to guarantee that the method used by governments to insert the “ghost” key into a conversation without alerting the users won’t be exploited by bad actors?

Despite the GCHQ authors’ claim, the ghost will require vendors to disable the very features that give our communications systems their security guarantees in a way that fundamentally changes the trust relationship between a service provider and its users.

Software and hardware companies will never be able to convincingly claim that they are being honest about what their applications and tools are doing, and users will have no good reason to believe them if they try. And, as we’ve already seen, GCHQ will not be the only agency in the world demanding such extraordinary access to billions of users’ software.

Australia was quick to follow the UK’s lead, and we can expect to see similar demands, from Brazil and the European Union to Russia and China. (Note that this proposal would be unconstitutional were it proposed in the United States, which has strong protections against governments forcing actors to speak or lie on its behalf.)

The “ghost” proposal violates the six “principles” in other ways, too. Instead of asking investigative tradecraft to evolve with technology, it’s asking technology to build investigative tradecraft in from the ground floor. Instead of targeted exceptional access, it’s asking companies to put a dormant wiretap in every single user’s pocket, just waiting to be activated.

We must reject GCHQ’s newest “ghost” proposal for what it is: a mandated encryption backdoor that weakens the security properties of encrypted messaging systems and fundamentally compromises user trust. GCHQ needs to give up the ghost. It’s just another word for an encryption backdoor.” (Extract from Nate Cardozo, Senior Information Security Counsel at EFF)
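To see why the “dormant wiretap” criticism bites, consider the following toy Python model, with entirely invented names, of a client that trusts whatever membership list the provider’s identity service hands back. If the service quietly appends a “ghost” key and the client suppresses the usual membership notification, the sender’s own device ends up producing a copy of every message for the extra recipient.

```python
# Toy model of the "ghost" concern: the client encrypts a copy of each message
# for every member the provider's identity service lists. Names and functions
# are invented for illustration; no real protocol is implemented here.
from dataclasses import dataclass

@dataclass
class Member:
    name: str
    public_key: str  # stand-in for a real key

def server_membership(chat_id: str, inject_ghost: bool) -> list[Member]:
    members = [Member("alice", "pk-alice"), Member("bob", "pk-bob")]
    if inject_ghost:
        # The provider silently appends a law-enforcement key. If the client
        # never surfaces membership changes, no user is any the wiser.
        members.append(Member("ghost", "pk-lawful-intercept"))
    return members

def client_send(message: str, members: list[Member]) -> dict[str, str]:
    # The client dutifully "encrypts" a copy for every key it was handed.
    return {m.name: f"enc({message}, {m.public_key})" for m in members}

honest = client_send("meet at 6", server_membership("chat-1", inject_ghost=False))
ghosted = client_send("meet at 6", server_membership("chat-1", inject_ghost=True))
print(set(ghosted) - set(honest))  # {'ghost'} -- a recipient the users never see
```

The point of the sketch is that nothing cryptographic is broken: the interception lives entirely in the membership and notification layer that the provider controls, which is exactly the layer the GCHQ proposal would have vendors modify.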

Related Reading

GCHQ details how law enforcement could be silently injected into communications

Written by Ian Levy, Technical Director of the National Cyber Security Centre, and Crispin Robinson, Technical Director for Cryptanalysis at GCHQ, the essay claims that end-to-end encryption remains, but with an extra “end” added for law enforcement. “It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call,” the pair said.

GCHQ’s not-so-smart idea to spy on encrypted messaging apps is branded ‘absolute madness’

Law enforcement and intelligence agencies have long wanted access to encrypted communications, but have faced strong opposition to breaking the encryption for fears that it would put everyone’s communications at risk, rather than the terror suspects or criminals that the police primarily want to target. In this case, two people using an end-to-end encrypted messaging app would be joined by a third, invisible person — the government — which could listen in at will.

GCHQ pushes for ‘virtual crocodile clips’ on chat apps – the ability to silently slip into private encrypted comms

They promise to get back to a time when the authorities only use their exceptional powers in limited cases, where a degree of accountability is written into spying programs, and they promise a more open discussion about what spy agencies are allowed to do and how they do it. But the demand for backdoors is still there, this time couched in terms of “virtual crocodile clips” on modern telephone lines, namely the encrypted chat and call apps that have become ubiquitous on smartphones.

What if Responsible Encryption Back-Doors Were Possible?

Let us now posit the existence of a responsible exceptional access technology, one that secures and protects the privacy of data with encryption, but also provides law enforcement authorities with access to that data. “Responsible” here describes a technology that achieves the desired effect of providing designated authorities with controlled access to data without creating undue risks of data being released to unauthorized parties. It should be noted that data breaches are all too frequent today and that complexity is regarded as the enemy of security. Thus, despite the dearth of proposals to provide responsible access and the expert analyses that enumerate reasons why it is likely unattainable, let us assume that such technology is possible. The next step is to consider the consequences of mandating its use. Even if we could build it, the question remains of whether we should build it.

Principles for a More Informed Exceptional Access Debate

Unfortunately, it’s the details that are missing from the discussion around lawful access to commodity end-to-end encrypted services and devices (often called the “going dark” problem). Without details, the problem is debated as a purely academic abstraction concerning security, liberty, and the role of government. There is a better way that doesn’t involve, on one side, various governments, and on the other side lawyers, philosophers, and vendors’ PR departments continuing to shout at each other. If we can get all parties to look at some actual detail, some practices and proposals—without asking anyone to compromise on things they fundamentally believe in—we might get somewhere.

On Ghost Users and Messaging Backdoors

The past few years have been an amazing time for the deployment of encryption. In ten years, encrypted web connections have gone from a novelty into a requirement for running a modern website. Smartphone manufacturers deployed default storage encryption to billions of phones. End-to-end encrypted messaging and phone calls are now deployed to billions of users. While this progress is exciting to cryptographers and privacy advocates, not everyone sees it this way. A few countries, like the U.K. and Australia, have passed laws in an attempt to gain access to this data, and at least one U.S. proposal has made it to Congress. The Department of Justice recently added its own branding to the mix, asking tech companies to deploy “responsible encryption”. What, exactly, is “responsible encryption”? Well, that’s a bit of a problem. Nobody on the government’s side of the debate has really been willing to get very specific about that. In fact, a recent speech by U.S. Deputy Attorney General Rod Rosenstein implored cryptographers to go figure it out.

With this as background, a recent article by GCHQ’s Ian Levy and Crispin Robinson reads like a breath of fresh air. Unlike their American colleagues, the British folks at GCHQ — essentially, the U.K.’s equivalent of NSA — seem eager to engage with the technical community and to put forward serious ideas. Indeed, Levy and Robinson make a concrete proposal in the article above: they offer a new solution designed to surveil both encrypted messaging and phone calls. In this post I’m going to talk about that proposal as fairly as I can — given that I only have a high-level understanding of the idea. Then I’ll discuss what I think could go wrong.

Give Up the Ghost: A Backdoor by Another Name

Government Communications Headquarters (GCHQ), the UK’s counterpart to the National Security Agency (NSA), has fired the latest shot in the crypto wars. In a post to Lawfare titled Principles for a More Informed Exceptional Access Debate, two of Britain’s top spooks introduced what they’re framing as a kinder, gentler approach to compromising the encryption that keeps us safe online. This new proposal from GCHQ—which we’ve heard rumors of for nearly a year—eschews one discredited method for breaking encryption (key escrow) and instead adopts a novel approach referred to as the “ghost.”

But let’s be clear: regardless of what they’re calling it, GCHQ’s “ghost” is still a mandated encryption backdoor with all the security and privacy risks that come with it.

Blog Post Image Courtesy of Electronic Frontier Foundation
