
A common saying in the cryptography and security community is that when you provide a back door to law enforcement, you also provide a back door for hackers.

I have been examining the implementation of Lawful Interception in 4G and the proposed implementation in 5G, and to me it looks secure. The only way for a hacker to gain information they shouldn't have would be to know the private key of the base station.

If we assume that the private key of the base station is secure, what could a hacker do that they could not have done without Lawful Interception being implemented?
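To make the discussion concrete, here is a minimal sketch of the kind of key-escrow design being debated. This is **not** the actual 3GPP Lawful Interception architecture; the hybrid-encryption layout, the single escrow keypair, and the function names are all assumptions for illustration. It uses the third-party `cryptography` package.

```python
# Toy model of "lawful interception as key escrow" -- NOT the real 3GPP design.
# Requires: pip install cryptography
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# One escrow keypair; the private half is what law enforcement would hold.
li_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
li_public = li_private.public_key()

def encrypt_session(plaintext: bytes):
    """Encrypt one session, and also wrap its key for the escrow holder."""
    session_key = Fernet.generate_key()                 # fresh key per session
    ciphertext = Fernet(session_key).encrypt(plaintext)
    wrapped_key = li_public.encrypt(session_key, OAEP)  # the "back door"
    return ciphertext, wrapped_key

def lawful_intercept(ciphertext: bytes, wrapped_key: bytes) -> bytes:
    """Whoever holds li_private can read ANY session -- past or future."""
    session_key = li_private.decrypt(wrapped_key, OAEP)
    return Fernet(session_key).decrypt(ciphertext)

ct, wk = encrypt_session(b"call content and metadata")
assert lawful_intercept(ct, wk) == b"call content and metadata"
```

In this simplified model, the answers below all hinge on one question: how well is the equivalent of `li_private` handled in practice?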

schroeder
finks
  • If we assume that everything is secure and everyone working in law enforcement is trustworthy, then nothing. What exactly do you mean by "the private key of the base station is secure"? It's cryptographically secure? It's securely stored? It's impossible for anyone to copy it? ... – Josef May 10 '19 at 08:26
  • what if law enforcement gets hacked? – Pizza lord May 10 '19 at 08:27
  • [History proves you wrong](https://en.wikipedia.org/wiki/Greek_wiretapping_case_2004%E2%80%9305). Similar "features" have been abused in the past. – vidarlo May 10 '19 at 08:37
  • What if your ex-husband works at the police, or the neighbor that you're having a dispute with? Can the police be bribed or extorted because of debts? etc. – pipe May 10 '19 at 13:43
  • What if your ex-husband works at the cell tower company, or the neighbor that you're having a dispute with? Can the cell tower worker be bribed or extorted because of debts? etc. (Thanks, pipe!) – longneck May 10 '19 at 16:48
  • Note that social engineering is a subset of hacking. – Roman Odaisky May 10 '19 at 22:04
  • -1 for a bad distinction at odds with security. From the perspective of anyone whose privacy is compromised, there is no distinction between an attacker who has the backing of the law/the state, and one who does not, except that the victim *has some recourse* against the latter and not against the former. – R.. GitHub STOP HELPING ICE May 10 '19 at 23:15
  • I'd worry much more about **both** base stations and infrastructure as well as cellphones being supplied by an officially employee-owned company that is not just believed to work closely with one particular democratic people's government, but was also founded and is owned by the People's Liberation Army. That's not taking into account that _this same company_ is going to supply WLAN for all high-speed trains in one major western country. Feel like having paranoia? Start there, not with law enforcement. – Damon May 11 '19 at 11:25
  • The [Greek wiretapping case](https://en.wikipedia.org/wiki/Greek_wiretapping_case_2004%E2%80%9305) proves that private keys are no protection against tampering. [Stuxnet](https://en.wikipedia.org/wiki/Stuxnet) showed just how *easy* it can be to install wiretapping malware. And both of these events took place over 10 years ago. What if a certain modern base station vendor is as sloppy with base stations as they are with their smart phones? – Panagiotis Kanavos May 13 '19 at 07:51

3 Answers


Without access to the key, the problem for attackers is the same as if there were no backdoor key at all: they would have to break the encryption itself.

But ...

> If we assume that the private key of the base station is secure

Your base assumption is the one that needs to be challenged. The very fact that there is a key is the problem:

  • key handling
  • key misuse
  • key leakage
  • key strength
  • key protection

Each of these elements needs to be secured for the key to be secure. There are a lot of moving parts, a lot of ways for people to introduce weaknesses, and a lot of ways for malicious actors to manipulate controls to their advantage.

Even if we perfectly trusted all law enforcement never to be malicious (a sensitive topic on its own, and of course impossible), there are still plenty of ways for weaknesses to creep in or for trusted people to be manipulated.

Once the door is there, it will become the intense focus of anyone with the time, resources, and desire to gain access. How resilient will those with legitimate access be against such an onslaught? And how faithfully will they follow the established procedures even without external pressure?

Once you cut a hole in a wall, it becomes a point of weakness. The strongest lock will not compensate for hinges that can be broken.
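A toy illustration of that blast radius, under hedged assumptions (the XOR "cipher", the wrap construction, and the session count are chosen purely for brevity, not realism): with only per-session keys, every recorded session is an independent cryptographic problem; add one escrow key over them all, and a single leak retroactively opens everything.

```python
# Blast-radius sketch: per-session keys versus one escrow key over them all.
# Toy one-time-pad "crypto"; the escrow model is an assumption for illustration.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """One-time-pad style XOR; data must not be longer than key."""
    return bytes(d ^ k for d, k in zip(data, key))

ESCROW_KEY = secrets.token_bytes(32)   # the single "lawful access" secret

ciphertexts, wrapped_keys = [], []
for i in range(10_000):                # 10,000 recorded sessions
    session_key = secrets.token_bytes(32)              # fresh key per session
    ciphertexts.append(xor(f"session {i}".encode(), session_key))
    wrapped_keys.append(xor(session_key, ESCROW_KEY))  # the escrow "door"

# Without ESCROW_KEY, each session is an independent 256-bit problem.
# With one leak, everything ever recorded opens at once:
recovered = [xor(ct, xor(wk, ESCROW_KEY))
             for ct, wk in zip(ciphertexts, wrapped_keys)]
assert recovered[42] == b"session 42"
```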

schroeder
  • In addition, as demonstrated in Greece, the functionality provided may be abused by malware installed on the equipment. Not providing the functionality will make an attack harder. – vidarlo May 10 '19 at 08:57
  • The problem is that lawmakers have such an intense desire to be able to spy on people. They have no understanding of (or completely disregard) the resulting overall weakness of the system. **Any backdoor - including a "lawful" backdoor - can and at some point will be abused!** – May 10 '19 at 10:39
  • Quite. Social engineering of officials is always an issue in a setup like this, too. In the UK there is a mechanism where people have their number omitted from public directories. Ten years ago, through using a corrupt phone company official, a hitman tracked down someone in witness protection. It cost them <$200, if I remember the case right, possibly less than their train fare. Presumably similar access will be possible to criminals with this new functionality. At some point someone who controls key access will be bought. – Dannie May 10 '19 at 12:53
  • Note that one of the big "key" problems is related to the _number_ of keys. Because of the way cellphones are supposed to work, you want them to automatically connect to any trusted tower, whether it's down the street or the destination country of your international trip. An attacker only has to compromise _one_.... – Clockwork-Muse May 10 '19 at 21:30
  • @MechMK1 By the same logic, **any 'frontdoor' can and at some point will be abused!** Of course adding an additional way to access a system will weaken it compared to having less or no access (for anyone) at all. It's easy to say that less access is safer, but that's a completely meaningless statement. As security experts, our role is not to build the safest systems; it's to build as-safe-as-possible systems given the requirements. – David Mulder May 11 '19 at 09:48
  • @DavidMulder I think you're getting hung up on the "back door" analogy. It's really more like a skeleton key. Back door doesn't just mean "another door", but a way of bypassing the security measures on the front door. I have a passphrase that lets me decrypt my encrypted data. Bob has his own passphrase that lets him decrypt his encrypted data. My front door requires entering my pass phrase to access my data. Bob's front door requires him entering his pass phrase to access his data. The government has a back door that doesn't require anyone's pass phrase, and lets them access EVERYONE's data. – barbecue May 11 '19 at 12:45
  • @barbecue Except that's not the case, the government has a passphrase which allows access to both pieces of data. Or if my minimal understanding is correct: The government has many pieces of data (one per base station) which allows access. It's not bypassing, it's just another door/key. The entire point of my comment was that the government having access is going to be as secure or insecure as any other access. Just because it's another party doesn't make it inherently good, bad, more or less secure. – David Mulder May 11 '19 at 13:37
  • @DavidMulder Even if that is true, it's still the same point. If I get access to the master key for an apartment building, that lets me open every "front door" to every apartment. Does that mean my master key is a "front door" and not a "back door"? Back door is a bad analogy, because it suggests a second entrance equivalent to the main one, but that's not what we are talking about here. We're talking about a master key. – barbecue May 11 '19 at 14:12
  • The fact that a separate key is required for each base station doesn't change the issue. For this back door to be equal in risk to the regular front door, the government would need to have one employee for each cellular customer who has access only to the key needed to access that customer's data, and those employees would need to be unable to exchange information with each other. We're talking about an adversarial relationship, where an owner wants to PREVENT access, and an adversary wants to ignore the owner's wishes and bypass their protections. – barbecue May 11 '19 at 14:17
  • @barbecue But the base you should reason from is: we have three parties who we want to have access by design to a piece of data. We are **not** talking about three people, so where you got the 1:1 employee:customer thing from I have no idea. This third party will likely want to achieve two things: define which employees can use this access, and log who uses it. That's frankly a classic problem which normal security-conscious businesses have to solve all the time. And something that's far lower risk than the first two parties (as with those, the devices used cannot be controlled). – David Mulder May 11 '19 at 14:27
  • Point is: designing such a system in a secure manner is absolutely possible. "Should we?" is the question, not "Can we?", and when the cryptographic community (a community that rightly cares about privacy) brings false arguments to the table ("all backdoors are insecure"), this just leads to a falsifiable narrative instead of discussions about the real question. – David Mulder May 11 '19 at 14:29
  • Let us [continue this discussion in chat](https://chat.stackexchange.com/rooms/93522/discussion-between-barbecue-and-david-mulder). – barbecue May 11 '19 at 17:39

While I agree that every point of schroeder's response is true, there are two deeper issues that make such a backdoor so much more dangerous than the current security model. Right now, if you install an encryption key on a system, that key only controls your system and can only be accessed by the people you trust to access your system.

Breaking into any system is a question of economics to any would-be attacker. Let's say, for example, that your system is a database of 5,000 clients with 5 users who can access it from a single network. A hacker has very few possible access points to try to exploit, and the odds of them finding a misconfiguration are relatively small; so they need to ask themselves whether they can spend little enough time and money getting into your system to make those 5,000 client records worth the investment. If the network was set up by a mostly competent person, the answer is probably no.

A national back door could be just as well encrypted, but it would expose hundreds of millions of devices to thousands of law-enforcement agents. Hundreds of separate networks will be set up by sysadmins of varying skill levels. Many of the police officers you trust to access the system will not be properly trained in cyber security. In this case, a hacker only needs to target one of the many, many people, devices, or networks that form these weak links. This makes intruding on the system several orders of magnitude easier, while making the payoff several orders of magnitude greater.
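Back-of-the-envelope arithmetic makes the asymmetry visible. Only the 5,000-records/5-users scenario comes from this answer; the subscriber count, the number of people with access, and the 1% per-entry-point exploitability below are invented figures, not measurements.

```python
# Attacker economics: chance of finding a way in, and the payoff if you do.
# All figures are illustrative assumptions, not measured data.
p = 0.01  # assumed chance that any single entry point is exploitable

def p_at_least_one(n: int) -> float:
    """Probability that at least one of n entry points is exploitable."""
    return 1 - (1 - p) ** n

targets = [
    ("single company", 5_000, 5),               # from the answer above
    ("national backdoor", 300_000_000, 5_000),  # assumed subscribers / agents
]
for name, records, entry_points in targets:
    print(f"{name}: P(a way in) = {p_at_least_one(entry_points):.0%}, "
          f"payoff = {records:,} records")
# single company: P(a way in) = 5%, payoff = 5,000 records
# national backdoor: P(a way in) = 100%, payoff = 300,000,000 records
```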

In fact, the payoff is so much greater that it would be worth it for many hackers and national governments to go through the process of joining law enforcement just to gain access to this system.

Nosajimiki

If there's a backdoor, it will be abused. The question is when it will be abused, not if.

There are too many actors that could compromise such a system, and no easy way to plug the holes. If a private key leaks, it's done. It's cheaper for everyone involved to ignore the leak until a high-profile case blows up in the press. And changing every key on every base station requires a lot of work.
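As a rough sense of scale (the base-station count, rekey time, and team count below are all assumed figures, not data from any real operator):

```python
# Why "just rotate the keys" is painful at national scale.
# All figures are illustrative assumptions.
base_stations = 60_000     # assumed count for a mid-size country
minutes_per_rekey = 30     # assumed: generate, provision, verify, audit
teams = 50                 # assumed parallel operations teams

total_hours = base_stations * minutes_per_rekey / 60
days = total_hours / teams / 8     # 8-hour working days per team
print(f"{total_hours:,.0f} staff-hours, about {days:.0f} working days "
      f"with {teams} teams")      # 30,000 staff-hours, about 75 working days
```

And for that whole rotation window, the leaked key keeps working.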

If the feature exists, no amount of red tape will protect it. People can be bought. People commit software to the wrong repository, send email to the wrong address, copy keys to USB drives and lose them. And it takes only one leak.

And stealing the key is not the only option: convincing an authorized operator, or infecting their computer, are possible routes too.

ThoriumBR
  • But the same is true for "frontdoors", an intended backdoor isn't a backdoor at all. It's not bypassing normal authentication and it's not secret. It's just another door into the system. Is having two doors less secure than one? Yes, of course. Does that mean it will be abused? Just as likely as any other door. Of course the risk assessment of a base station being compromised and a phone being compromised are different, but likely that's reflected also in the difficulty (getting a user to install malware on a phone is a lot easier). (Not saying it's a good idea or not, just shallow answer) – David Mulder May 11 '19 at 09:52
  • @DavidMulder as [the Greek hacking scandal](https://en.wikipedia.org/wiki/Greek_wiretapping_case_2004%E2%80%9305) showed though, the "backdoor" isn't as well tested as the front door for the simple reason it's not used as much. In fact, that's how it was detected - enabling the backdoor caused dropped SMS messages. The Greek PM complained to Vodafone leading to the hack's detection. – Panagiotis Kanavos May 13 '19 at 07:42
  • @DavidMulder The hack [wasn't simple](https://web.archive.org/web/20070704193410/http://www.spectrum.ieee.org/jul07/5280) and probably involved former Ericsson employees. That doesn't mean it wasn't significant though. Whoever wants to hack base stations (nation or mob) probably has enough means to find, bribe or extort the necessary people. Remember Stuxnet too - why bribe the base station technician when you can install malware on their machines? What if the *vendor* is as sloppy with the base stations as they are with their smartphone security? (I do have *that* vendor in mind) – Panagiotis Kanavos May 13 '19 at 07:54
  • @PanagiotisKanavos Totally agree that all access points to a system need to be *seriously* tested and often they are not, but that's not a principal problem. – David Mulder May 13 '19 at 08:22