-1

Most of the issues raised by Apple and others about giving the FBI access to the terrorist's iPhone seem to stem from a concern about reducing the overall security for regular users.

Does this necessarily have to be the case though? Would there be safe ways to allow the FBI to access encrypted devices, but nobody else?

I mean in general, not just for this specific case.

Surely it would be desirable for the FBI to be able to recover whatever data they need to do their job effectively.

Personally I don't have any concerns about the FBI having access to my phones and computers. I really don't like the idea that criminals can conceal their data so easily, and have it completely beyond the reach of law enforcement.

user1751825
  • 905
  • 4
  • 10
  • 3
    The question about whether this would affect overall security is also asked in this Information Security question on Stack: http://security.stackexchange.com/questions/114897/apples-open-letter-they-cant-or-wont-backdoor-ios/114954#114954 – elika kohen Feb 19 '16 at 00:44
  • I'm going to make a very general statement to avoid getting political, so don't take offense, but... those "criminals" you don't want to be able to keep their data from law enforcement? You're one of them. (And so am I, and so is every other American adult.) At least you're in good company, I guess. In keeping with trying to avoid getting political, I will cite the example of a federal law that was mercifully struck down for being so broadly worded as to make it a federal criminal offense to browse the web or perform anything personal while "on company time." – HopelessN00b Feb 19 '16 at 07:37

4 Answers

5

The only answer: NO

And there is a reason for that: the code exists. If you program a backdoor, that code exists somewhere in the world, and thus it can be exploited.

A security system is only as strong as the weakest part of the whole system.

This means that if a backdoor is placed into the device, the device is only as secure as the backdoor, and if that backdoor gets reverse engineered or someone finds an exploit, NO DEVICE IS SAFE.

So no, they shouldn't do this. It would completely defeat the security of all users, not just the ones whose devices are seized for legal reasons.
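To see how little it takes, here is a deliberately naive sketch of what a shipped backdoor amounts to. All names and values are made up and resemble no real product: the point is that the backdoor is just code and data present on every device, so anyone who reverse engineers a single unit learns the master key for all of them.

```python
# Toy sketch (not any real product's code): an unlock routine with a
# hard-coded master key. The backdoor ships with every device, so it
# only has to be found once.
MASTER_KEY = "fbi-override-2016"   # made-up value, baked into the code

def unlock(entered_pin: str, stored_pin: str) -> bool:
    if entered_pin == stored_pin:   # the normal path
        return True
    if entered_pin == MASTER_KEY:   # the backdoor path
        return True
    return False

print(unlock("1234", "9999"))               # False: wrong PIN
print(unlock("fbi-override-2016", "9999"))  # True: backdoor opens any device
```

Note that hiding the key better (obfuscation, splitting it across components) only raises the effort needed to extract it once; it does not change the fact that every device carries everything required to open it.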


An Example:

To make this a little easier to understand, let's use your home as an example.
You obviously have a lock on your front door. How secure is your front door? Well, it should be as secure as the keys to the lock. So if you have a key and you give a copy to your friend, it's as secure as your friend keeps that key (you're probably not going to break into your own home, right?).

Now assume your friend wants to go out of town, but you still want someone else to have a copy of your key. So your friend gives a copy to his friend. His friend is trustworthy and won't lose it, your friend leaves town, and when you needed it you were able to ask your friend's friend to let you into your home.

AWESOME! This has kept you safe. For a bit. Now your friend's friend gets robbed. Suddenly someone else has a copy of the key to your home. If they can find your home, they can get in and take whatever they want.

OH NO, IT GETS WORSE! Now someone comes to your home with a tool that simply makes the lock open, while looking legitimate to all forms of investigation. Now someone who doesn't even have the key, just a good understanding of the lock, can get access to your home too.


Now, in the above example, replace home with iPhone, key with security measures, and stolen with... well, stolen. If someone steals that information, they have access to it.

Of course, now let's make your key open your home, your work, your bank, and your car. Suddenly things get a lot worse, don't they?

Robert Mennell
  • 6,968
  • 1
  • 13
  • 38
  • Not to mention that it only takes one person in the entire world's population to find it, and there is a LOT of motivation to find it, not just among hackers but among pranksters too. Just imagine "The Fappening" at an exponential level. That cute girl from accounting has an iPhone? Hacked as easily as paying 100 bucks on some hacker forum that uses an automated tool to backdoor into her phone and export her data. – GingerBeard Feb 19 '16 at 00:28
  • 1
    It's actually worse than your example. Instead of getting the best possible lock for your front door, you choose one that can be opened with both your key and a master key. No one quite knows what the master key is, or who has access to it, but everyone knows it exists. And once they have this master key, they can open not just your door, but all the doors. Or in other schemes you get a lock with a flaw designed to open it without the key. No one really knows which flaw, but everyone knows there is *a* flaw. Lots of people will be looking at the lock to find that flaw. – Martin Tournoij Feb 19 '16 at 00:29
  • But what if the only means of entry is via a purpose built key generation computer securely locked in an FBI vault? – user1751825 Feb 19 '16 at 00:37
  • I understand your example about giving a house key to a friend, but if your spare house key is permanently locked within FBI headquarters, and you would literally have to pick up your house and transport it to FBI headquarters to have the door unlocked, would this not be secure enough? – user1751825 Feb 19 '16 at 00:40
  • @user1751825 Part of this presumes trust in a government that has already proven it intends to do mass surveillance. Apple may see good in the FBI, but why do they need the device when Apple has already provided, under court order, location and iCloud data, and cell providers can provide text messages and call logs? That's my personal opinion, so take it with a grain of salt, but these are questions that **I** am asking. – h4ckNinja Feb 19 '16 at 00:43
  • @user1751825 No, because someone can come to your house and impersonate the FBI with something meant to open locks without the actual key. Hence the key example. It shows that if there is a back door, someone must merely either steal the key, or use a tool designed to get through the lock. I'll go ahead and update the post. – Robert Mennell Feb 19 '16 at 18:25
1

Actually I just thought of a very significant issue with this idea. If the FBI is given access to encrypted devices, then it's entirely likely that law enforcement in all other countries where Apple and Google operate would also demand the same level of access. Even if the FBI can be trusted to only use this capability for legitimate purposes, intelligence services in many/most other countries absolutely could not be trusted to do the same.

user1751825
  • 905
  • 4
  • 10
  • Also, fusion centers. So, not just other countries, but also just about every state and local law enforcement agency in this country, too. I can't help but think that eventually they'll get bored of shooting unarmed people in the back... I don't want them to move on to poking around through my personal, private data for a change of pace. – HopelessN00b Feb 19 '16 at 07:10
0

There is at least one example of a strong cryptographic algorithm that is known to contain a government backdoor, which is only accessible to the government agency that created it (the NSA). It's called Dual_EC_DRBG, and you can read an answer I recently posted about it, if you wish.

So from a purely technical standpoint, it is possible to engineer an encryption algorithm which has a backdoor that can only be exploited by the government agency that designed it. We know this is possible, because we know it has happened at least once. So, "yes", it is technically possible to create a backdoor which only a particular government agency can access.
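For the curious, the shape of the Dual_EC_DRBG trapdoor can be sketched with a toy analogue. This is not the real construction: the actual design uses elliptic-curve points and truncates its output, while this sketch swaps in ordinary modular exponentiation and skips the truncation. But the structure of the backdoor is the same: two public constants with a secret relationship between them, known only to the designer. All constants below are made up.

```python
# Toy analogue of the Dual_EC_DRBG trapdoor, using modular
# exponentiation in place of elliptic-curve point multiplication.
p = 2**127 - 1          # a Mersenne prime; the group modulus
Q = 3                   # public constant
d = 0xDEADBEEF          # the designer's secret trapdoor exponent
P = pow(Q, d, p)        # public constant; the relation P = Q^d is secret

def dual_ec_step(s):
    """One round: update the state with P, emit output with Q."""
    s_next = pow(P, s, p)   # s_{i+1} = P^{s_i}
    r = pow(Q, s, p)        # r_i     = Q^{s_i}  (the "random" output)
    return s_next, r

# An ordinary user generates two outputs from a secret seed.
s0 = 0x1337C0FFEE
s1, r1 = dual_ec_step(s0)
s2, r2 = dual_ec_step(s1)

# The designer, seeing ONLY the public output r1 and knowing d,
# recovers the next internal state:
s1_recovered = pow(r1, d, p)   # r1^d = Q^(s0*d) = P^s0 = s1
assert s1_recovered == s1

# ...and can therefore predict every future output:
_, r2_predicted = dual_ec_step(s1_recovered)
assert r2_predicted == r2
```

Without knowledge of `d`, recovering the state from `r1` requires solving a discrete logarithm, which is believed to be infeasible at real-world sizes; that asymmetry is exactly what makes the backdoor usable only by whoever generated the constants.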

Having said that, no, there are no safe ways to allow the FBI, but nobody else, to access encrypted devices.

While we can create a backdoor that only the government can walk through, there is a fundamental danger in doing so (beyond concerns about abuse by the government agency that uses the backdoor) that cannot be eliminated: once information is walked out that door and stored in other locations, it can be accessed by anyone who can get into any of those locations. So, in the complete best-case scenario, we go from having information in one place, accessible only to you and the FBI, to having that information in many places, accessible to you, the FBI, and anyone who can access wherever the FBI has stored it. Not incidentally, the FBI can't even keep its own information safe, let alone information it holds on other people.

And then we go to also having the information stored wherever it's put by the other government agencies the FBI shares it with, and also being accessible to anyone who has access to those locations as well. Thanks to the Department of Homeland Security, that is approximately every national intelligence and law enforcement agency, as well as the majority of state and local law enforcement agencies, via "fusion centers." Oh, and, to make it worse, it's not like it's "just" every intelligence agency, law enforcement organization, and courthouse that would have access to your information, either. Perhaps you've heard of the OPM data breach, where 21.5 million people had their personal information (and information on their friends, family and co-workers) stolen. So, it's potentially oversight and administrative agencies too.

And to put a really fine point on this fundamental problem government agencies have with keeping data in their possession safe, let's go back to the Dual_EC_DRBG backdoor thing I opened with.

The whole reason we know, conclusively, that Dual_EC_DRBG contains an NSA-only backdoor is because a man named Edward Snowden accessed the information about this backdoor and shared it with the world (among other things, of course). Importantly, Snowden was not even an NSA employee, but a contractor. This information is/was Top Secret, Sensitive Compartmented Information, which is the most strongly protected categorization of information our government has. This was a piece of information that was so highly protected that the mere act of disclosing it was an act of treason. In theory, it was only accessible to some small number of people within the NSA.

This was a piece of information that was protected by multiple layers of technical protections, multiple layers of processes protections designed to keep it out of unauthorized hands (Snowden's unauthorized hands), and the legal threat of being charged with treason. It wasn't enough. All that couldn't keep it secret. As a brief tangent, our government couldn't keep Top Secret State Department cables and military operations details safe, either... so it's not like Snowden was the only example of a high-profile data breach and leak of large amounts of Top Secret information.

Despite all those protections, this piece of information is now public knowledge, because someone who had access to the NSA's systems, someone not in that small number of NSA employees authorized to know it, accessed it and shared it with everyone.

Since the government can't keep its most closely guarded secrets safe, what hope do you really think there is that it will be able to keep the contents of your iPhone safe?

HopelessN00b
  • 3,385
  • 19
  • 27
  • Comments are not for extended discussion; this conversation has been [moved to chat](http://chat.stackexchange.com/rooms/35958/discussion-on-answer-by-hopelessn00b-securely-enable-fbi-backdoor-for-phones). – Rory Alsop Feb 19 '16 at 08:11
-2

Question:

Most of the issues raised by Apple and others about giving the FBI access to the terrorist's iPhone seem to stem from a concern about reducing the overall security for regular users.

Does this necessarily have to be the case though?

Would there be safe ways to allow the FBI to access encrypted devices, but nobody else?

Answer:

You are misunderstanding the issue -- a little. The FBI isn't asking Apple to create a "Back Door" -- the "Back Door" already exists -- they are asking Apple for a more efficient interface to exploit the already existing Back Door.

The only reason the court order is even feasible is because Apple stored the actual AES decryption key on the phone in the first place -- protected only by a simple passcode, for the user's convenience.

This is an intentionally implemented security vulnerability -- more common with hardware-supported encryption that relies on TPM-like solutions.
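For contrast, here is a sketch of the usual way such designs mitigate this: derive the key-wrapping key from both the user's passcode and a device-unique hardware secret, so that guessing passcodes only works on the device itself. This is a simplification for illustration, not Apple's actual scheme; the iteration count and names are invented.

```python
# Sketch of passcode/hardware key entanglement (simplified; not any
# vendor's real implementation). The wrapping key depends on a secret
# fused into the hardware, so off-device brute force is impossible
# without first extracting that secret.
import hashlib
import os

DEVICE_UID = os.urandom(32)   # stands in for the fused-in hardware UID

def passcode_key(passcode: str) -> bytes:
    # Real designs tune the work factor so each guess costs tens of
    # milliseconds on the device itself; 100_000 iterations is arbitrary.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), DEVICE_UID, 100_000
    )

key_right = passcode_key("1234")
key_wrong = passcode_key("4321")
print(key_right != key_wrong)                  # a wrong PIN yields a wrong key
print(key_right == passcode_key("1234"))       # the right PIN is reproducible
```

Under a scheme like this, handing over the phone's flash contents alone gives an attacker (or a government) nothing to brute-force; the guesses have to run through the hardware that holds the UID, which is where rate limits and wipe-after-ten-attempts policies can be enforced.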

elika kohen
  • 292
  • 1
  • 9