168

Apple released an open letter to the public outlining their reasons for not complying with the FBI's demands to modify the iPhone's security mechanism.

Here's a summary:

  • The FBI has an iPhone in their possession which they would like to access data from. The phone is locked and fully encrypted.
  • After failing to get into the phone, the FBI asked Apple to unlock the phone.
  • Apple said since the phone is encrypted, they can't get into it either.
  • The FBI asked Apple to modify the iPhone OS to enable brute force password attempts electronically. (Currently, passcodes can only be entered via the manual interface, and attempts are limited to 10.)
  • Apple refused. They believe it would be too dangerous to make that change, because in the wrong hands it would undermine the security of all iPhone users, even if the software were used only in this one instance.

I understand Apple's position of not wanting to make the change, particularly for new phones, but it's unclear whether the change could actually be made and installed on an existing locked and encrypted phone. Could they actually accomplish this for an existing encrypted phone? If so, doesn't simply knowing it is possible also undermine the security? It seems to me it would be just one step removed from the backdoor they are trying to keep closed.

Update: since this is a security forum, I feel it is important to point out that Apple is using the word backdoor differently than we typically do on this site. What the FBI has asked Apple to do would not result in a backdoor by the typical definition that we use, which is something akin to a master key. Instead, in this case, if Apple were to comply, the FBI would then be able to attempt to brute force the passcode on the phone. The strength of the passcode would determine whether they are successful in gaining access. Based on Dan Guido's article (linked in Matthew's answer), if each passcode try takes 80ms, then brute forcing the passcode would take, on average (by my calculations; a quick sketch of the arithmetic follows the list):

  • 4 digit numerical passcode: about 7 minutes
  • 6 digit numerical passcode: about 11 hours
  • 6 character case-sensitive alphanumerical passcode: 72 years
  • 10 character case-sensitive alphanumerical passcode: 1 billion years
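
As a sanity check on those figures, here is a minimal Python sketch of the average-case arithmetic; the only inputs are the 80ms figure from Dan Guido's article and the assumption that, on average, half the keyspace must be tried:

```python
# Average-case brute force time: try half the keyspace at 80 ms per attempt.
ATTEMPT_SECONDS = 0.080

def avg_seconds(alphabet_size: int, length: int) -> float:
    """Expected seconds to find a passcode, trying half of all combinations."""
    return (alphabet_size ** length / 2) * ATTEMPT_SECONDS

for label, size, length in [
    ("4-digit PIN", 10, 4),
    ("6-digit PIN", 10, 6),
    ("6-char alphanumeric", 62, 6),    # a-z, A-Z, 0-9
    ("10-char alphanumeric", 62, 10),
]:
    s = avg_seconds(size, length)
    print(f"{label}: {s / 3600:,.1f} hours ({s / 31_557_600:,.1f} years)")
```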

Obviously if a 4 or 6 digit numerical passcode was used, then the brute force method is basically guaranteed to succeed, which would be similar to a backdoor. But if a hard passcode is used, then the method should probably be called something other than a backdoor since gaining access is not guaranteed, or even likely.

Update 2: Some experts have suggested that it is theoretically possible for the FBI to use special tools to extract the device ID from the phone. With that, plus some determination, it should be possible to brute force the PIN of the phone offline, without Apple's assistance. Whether this is practically possible without destroying the phone remains to be seen, but it is interesting to note that if it can be done, the numbers I mentioned in the above update become meaningless, since offline tools could test passcodes much faster than 80ms per try. I do believe that simply knowing this is possible, or even knowing that Apple can install new firmware to brute force the passcode more quickly, does imply a slightly lessened sense of security for all users. I believe this to be true whether Apple chooses to comply with the order or not.
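
For a sense of how much faster offline testing could be, here is the same average-case arithmetic at a purely hypothetical offline rate; the 1,000,000 guesses per second is my own illustrative assumption, not a measured figure:

```python
# Same average-case math as the sketch above, at a hypothetical offline rate.
OFFLINE_GUESSES_PER_SECOND = 1_000_000  # illustrative assumption only

print(10**4 / 2 / OFFLINE_GUESSES_PER_SECOND)         # 4-digit PIN: 0.005 s
print(10**6 / 2 / OFFLINE_GUESSES_PER_SECOND)         # 6-digit PIN: 0.5 s
print(62**6 / 2 / OFFLINE_GUESSES_PER_SECOND / 3600)  # 6-char alnum: ~7.9 hours
```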

There are multiple excellent answers here, and it's very difficult to choose which one is best, but alas, there can be only one.

Update 3: It appears that the passcode to unlock the phone was in fact simply a 4 digit code. I find this interesting because this means the FBI asked Apple to do more than was necessary. They could have simply asked Apple to disable the wipe feature and timing delay after an incorrect attempt. With only those 2 changes one could manually attempt all 10,000 possible 4 digit codes in under 14 hours (at 5 seconds per attempt). The fact that the FBI also demanded that Apple allow them to brute force electronically seems odd to me, when they knew they didn't need it.
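
(A quick check of that figure, assuming the stated 5 seconds per manual attempt:)

```python
# Worst case: all 10,000 possible 4-digit codes at 5 seconds per attempt.
print(10_000 * 5 / 3600)  # ≈ 13.9 hours, i.e. "under 14 hours"
```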

Update 4: It turns out the FBI was able to unlock the phone without Apple's help, and because of this they dropped their case against Apple. IMO, overall this is bad news for Apple because it means that their security (at least on that type of phone) was not as strong as previously thought. Now the FBI has offered to help local law enforcement unlock other iPhones too.

TTT
  • 9,122
  • 4
  • 19
  • 31
  • 14
    Only Apple knows for sure. The informed speculation I've seen at this point suggests it is likely that they can for the phone in question (an A6 based iPhone 5C) but probably not for newer A7 based phones. – Xander Feb 17 '16 at 14:32
  • Comments are not for extended discussion; this conversation has been [moved to chat](http://chat.stackexchange.com/rooms/35911/discussion-on-question-by-ttt-apples-open-letter-they-cant-or-wont-backdoor). – Rory Alsop Feb 18 '16 at 12:41
  • 2
    Cf. http://www.thedailybeast.com/articles/2016/02/17/apple-unlocked-iphones-for-the-feds-70-times-before.html. They can, and they have. It's a PR stunt. (For some reason I cannot answer or I'd make it one.) – Peter - Reinstate Monica Feb 19 '16 at 01:08
  • 4
    @PeterA.Schneider, I think the following quote from that article is relevant: *"It wasn’t until after the revelations of former NSA contractor Edward Snowden that Apple began to position itself so forcefully as a guardian of privacy protection in the face of a vast government surveillance apparatus. Perhaps Apple was taken aback by the scale of NSA spying that Snowden revealed. Or perhaps it was embarrassed by its own role in it."* – Wildcard Feb 19 '16 at 06:31
  • 8
    Note: The FBI requiring them to open it doesn't mean that the FBI actually needs their assistance. It may well be that the FBI already has all it needs, and that the FBI is trying to create a precedent based on a centuries-old law and a terrorism case to force Apple into cooperating on the backdoor, just for future cases. – FooBar Feb 19 '16 at 15:40
  • 2
    @PeterA.Schneider except prior to iOS 8 they didn't need to actually create modified versions of the system to access the data. http://techcrunch.com/2016/02/18/no-apple-has-not-unlocked-70-iphones-for-law-enforcement/ – Luke Feb 20 '16 at 20:11
  • 1
    I'm disappointed if there's no incremental lockout after multiple failed pin codes. It was probably a stalling tactic for PR. – Phil Lello Mar 21 '16 at 22:25

6 Answers

67

Various commentators suggest that this would be possible, on the specific hardware involved in this case. For example, Dan Guido from Trail of Bits mentions that with the correct firmware signatures, it would be possible to overwrite the firmware, even without the passcode. From there, it would be possible to attempt brute force attacks against the passcode to decrypt the data.

It does not appear to be possible if the firmware replacement is incorrectly signed, and Apple has so far kept the signing keys secure.

He also mentions that this wouldn't be possible on some later devices, where the passcode check is implemented in a separate hardware module, which enforces time delays between attempts.
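
To illustrate why the signing keys matter, here is a minimal sketch of the kind of gate a boot ROM conceptually enforces before accepting new firmware. Apple's real chain of trust is more involved; the function name and key handling here are purely illustrative:

```python
# Sketch: a device only accepts firmware whose signature verifies against the
# vendor's public key. Without the private key, no acceptable image can be forged.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.asymmetric.rsa import RSAPublicKey

def firmware_acceptable(vendor_key: RSAPublicKey, image: bytes, sig: bytes) -> bool:
    try:
        vendor_key.verify(sig, image, padding.PKCS1v15(), hashes.SHA256())
        return True   # correctly signed: the device will flash it
    except InvalidSignature:
        return False  # incorrectly signed: the device refuses it
```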

Edit Feb 2017: Cellebrite (a data forensics company) have announced the capability to unlock and extract data from most iPhones from the 4S to the 6+, strongly suggesting that a flaw exists somewhere, which they are able to exploit. They haven't released full details of this.

Matthew
  • 27,233
  • 7
  • 87
  • 101
  • 3
    It's also been suggested that firmware updates to the separate hardware module would cause the existing keys to be erased. – Dietrich Epp Feb 17 '16 at 15:26
  • 4
    Wouldn't apple/FBI/some random person with required hardware be able to dump the hard drive by manually connecting to it? I saw that being done for other devices with JTAG. – ave Feb 17 '16 at 15:29
  • 6
    @ardaozkal Depends how the internal system works. It's not a distinct unit in the way that laptop hard drives are, but an integrated part of the main board. That means that it's entirely possible that the read and write methods pass through other parts of the device, which might include the encryption parts. In that case, you'd get a copy of the encrypted data, but still couldn't decrypt it - the passcode isn't the whole encryption key, merely one part of it. Brute forcing the whole drive would be virtually impossible. – Matthew Feb 17 '16 at 15:33
  • 21
    Don't underestimate NSA's servers :P Jokes aside, I'd guess that FBI tried this already, so much that they got desperate enough to publicly ask Apple to open a backdoor. – ave Feb 17 '16 at 15:40
  • @Matthew Thanks for the link. That's a fantastic article. – TTT Feb 17 '16 at 17:21
  • @Matthew - the article suggest that even if Apple complied with the request, at 80ms per password attempt, brute force might still not succeed any time soon, depending on the strength of the password. – TTT Feb 17 '16 at 17:57
  • @ttt Very true, but it's a lot better than anything up to an hour between tries! – Matthew Feb 17 '16 at 17:58
  • 1
    @ardaozkal: Dumping the contents of the drive is no problem. But the encryption key is made from several parts, and one is built into the CPU. Without the CPU in the phone, your only choice is to brute force one 256 bit key per file on the device. – gnasher729 Feb 17 '16 at 19:43
  • @gnasher729 I wouldn't bother with getting a part of CPU to work elsewhere, but I'm sure that FBI has the manpower and knowledge required to do that. – ave Feb 17 '16 at 19:51
  • 5
    @DietrichEpp Dan Guido has corrected himself--The "Secure Enclave"'s time delay behavior can and has been re-programmed without erasing the key, and this can be used to brute-force newer iPhones. The takeaway, however, is that using a strong, brute-force-resistant alphanumeric password is believed to be a good defense. – Aleksandr Dubinsky Feb 17 '16 at 22:28
  • 1
    To clarify, the cracking rate with the brute-force-friendly ROM is 12.5 passwords/sec, which is actually rather great (for the user). However, the default 13- and 20-bit entropy passwords (4 and 6 digit PINs) become too weak (7 minutes and 11 hours to crack). 33 bits is a minimum (10 years). This corresponds to 10 random digits, 7 lowercase letters, 5 ASCII characters, or 3 words. – Aleksandr Dubinsky Feb 18 '16 at 07:50
46

After doing some research, I now believe this is possible, but that it isn't very easy. Without getting too technical, if you look closely, Apple repeatedly implies that they can do it:

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers.

But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

Building a version of iOS that bypasses security in this way would undeniably create a backdoor.

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

If they couldn't do it, responding in this way would be moot. I would say that it's possible for them to implement such a feature, if they want to.

Mark Buffalo
  • 22,498
  • 8
  • 74
  • 91
  • 31
    Playing devil's advocate, they may be saying that they don't know if it's possible, but these are the reasons they aren't going to try. – Jason Feb 17 '16 at 15:15
  • 3
    @Jason I agree. – Mark Buffalo Feb 17 '16 at 15:18
  • 1
    I interpreted the article exactly the way you just described. As I imply in the last sentence of my question, my worry is that if it is possible, then whether the feature exists *yet* may not be relevant. The FBI might only be $10 million and an ex-iOS developer away from getting what they want. – TTT Feb 17 '16 at 16:30
  • 7
    @TTT It can't just be any old ex-iOS developer, it needs to be somebody who has access to the private key Apple use to sign the firmware. The backdoor can *only* be created by someone with that private key, because otherwise they can't persuade the phone to install the backdoor. – IMSoP Feb 17 '16 at 18:41
  • 2
    @IMSoP - Ah, yes. Maybe $50 million and a current iOS developer? Hehe- jk. I think it's safe to assume that the few people that have access to the private key can't be bought. – TTT Feb 17 '16 at 18:57
  • @TTT Precisely. I doubt these kind of folks can be bought. Money isn't everything... integrity is far more important than that, and I'm sure those devs have that... at least I'm hoping they do. :-( – Mark Buffalo Feb 17 '16 at 19:01
  • 13
    @MarkBuffalo Integrity? Don't need go that far. It is money, billions of dollars. How much market value would iOS (and Apple) lose if its passcode security was blown to rubble? Right now Apple has proof that even the FBI cannot break it. I can't think of a better ad. And I don't think someone can lawfully use Apple's PK without getting said billion-dollars used on lawyers all over one's head. – Mindwin Feb 17 '16 at 20:13
  • 1
    @Mindwin - But the billions you speak of are for the company, not the employees. It's probably not the case that all the people with access to the keys are extremely wealthy, and if not they could potentially be bought if they didn't have integrity. – TTT Feb 17 '16 at 20:37
  • 4
    Don't underestimate the stock options :) but access to these sorts of keys and signing operations typically requires the co-operation of two or more people, in a nuclear-launch-turn-the-keys style of process. A "bad guy" would need to convince more than one employee to help them. Even then they may not have access to the key, merely a process that takes a binary and signs it – Paulw11 Feb 17 '16 at 20:49
  • 1
    And after convincing more than one employee to help the FBI, they'd need to steal the HSM(s) with the key on it. Call Tom Cruise, because that mission sounds impossible. – Phil Frost Feb 18 '16 at 01:36
  • 1
    @TTT are they ever for the little guy? but I digress. The private key is probably split in several pieces and nobody (not even Tim Cook) should have access to the whole alone. – Mindwin Feb 18 '16 at 11:48
  • 2
    @Mindwin Tim Cook should *certainly* not have access to the whole private key alone, because I can't see how he has any valid business reason whatsoever for having access to the private part of the firmware signing key. With this kind of data, access should be *exclusively* on a strict "need to know" basis, and only enough people to ensure continuity in case a reasonable subset are unavailable. Anything more simply invites disaster. – user Feb 18 '16 at 15:22
  • @Phil Frost: "Impossible" is a big word. Really big. Consider this: a large scale, boots-on-the-ground armed raid plus a planeful to a nonexistent extralegal prison. That is well within the *means* of a determined agency. Would that be legal? Probably not. Would that be an open declaration of a police state? Indeed. (Would that get popular support? That would depend on opinion makers; e.g. bg has already indicated so.) So, not impossible - just unlikely. – Piskvor left the building Feb 24 '16 at 14:10
  • @Piskvor It's a reference to a movie: [Mission Impossible](https://en.wikipedia.org/wiki/Mission:_Impossible_(film_series)). – Phil Frost Feb 24 '16 at 14:19
  • @PhilFrost: Yes, I'm aware of this. To adapt my comment to that pop-cultural reference: why would you bother with a delicate black-op, when you could roll through the front gate with tanks and take whatever you want, 800-pound-gorilla style? As for direct effect, both get you what you wanted; as for PR, both is a declaration that the state no longer feels bound by its own laws. One is a bit easier to pull off, IMNSHO. (Again, I do not find this *likely* to happen, as it's more of a South-American junta scenario; just note that things like that do happen even in countries presumed-civilized) – Piskvor left the building Feb 24 '16 at 14:22
26

Could they actually accomplish this for an existing encrypted phone?

Yes. They could provide a compiled iOS image with the anti-brute-force features disabled. The fact that they're publishing an open letter IMHO means they've already exhausted all excuses not to do so, implying they are fully capable of doing it.

They would have to be able to auto update a phone that they don't have access to.

No. They would provide binaries to the FBI. The FBI has physical access to the phone and can flash it. They can't prepare such an image themselves, because the iPhone verifies that firmware is signed with Apple's private key. That key would actually enable the FBI to do everything themselves (at quite a cost in reverse engineering), but they're not insolent enough to ask for it.

If they can actually do that, then isn't simply knowing this is possible also undermining the security? It seems to me it would be just one step removed from the backdoor they are trying to keep closed.

It is. The holder of such binaries could then take any iPhone 5C, flash it with this version, and brute force it easily (or, to be exact, any model that can run 5C firmware correctly). This is not a future backdoor; it's a master key to every iPhone 5C you can physically get your hands on.

Agent_L
  • 1,921
  • 14
  • 13
  • 5
    They would not be able to flash it with this image, as the image must be encrypted using the device's unique key (they have to flash physically, not logically!). – Henno Brandsma Feb 17 '16 at 18:59
  • Just FYI, I removed my sentence which you explained was incorrect. It wasn't adding value to the question, but I think your response to that sentence does add value to your answer- thx. – TTT Feb 17 '16 at 19:31
  • I've been reading mentions of this only working on 5Cs. Is this truly the case? Do you know what the difference is between the most current versions (6S?) That is, is the FBI not interested in a "master key" to these devices, or..? – HC_ Feb 17 '16 at 22:54
  • 3
    From what I've read of the actual FBI order, the FBI has asked that the custom firmware which disables the anti-brute-force functionality would be locked to only the iPhone in question (ostensibly by checking the device ID and refusing to run otherwise). The order also allows for the phone to remain in Apple's possession. That would seem to address the concern that once out there, this software could be used on any phone. – Phil Frost Feb 18 '16 at 01:27
  • "They would provide binaries to the FBI. The FBI has physical access to the phone and can flash it." I think this is the issue of contention with Apple. If the FBI has access to the custom firmware, then they can flash any device they want. They would undoubtedly weasel their way to physical access to the phone once Apple flashed it and essentially have a backdoor that would work on all 5C. – n00b Feb 18 '16 at 20:49
18

Only Apple knows, but I'm going to guess they won't do it. I suspect the FBI has a pretty good idea what is and what isn't possible, especially since Apple has otherwise been cooperating with them. Also the people who work for the FBI aren't idiots, and I bet this isn't the first crime they've investigated with an iPhone.

Furthermore, Apple's argument against breaking this particular phone seems to be that they think such an action would compromise all phones. Despite the popularity of this belief, which is grounded in suspicion more than fact, the actual FBI order asks for firmware which:

  • is limited to just the device in question
  • does not need to leave Apple's facility

Specifically, the FBI does not ask for:

  • an exploit that could be applied to any phone
  • access to the exploitable firmware
  • unsupervised access to the exploited phone
  • access to Apple's code signing key

IANAL, but I bet such things are unlawful. Even if you want to believe the FBI is a malicious organization, they won't ask for such things in a court order.

Here's the relevant section of the order, with interesting parts highlighted:

Apple's reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade ("DFU") mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

So apparently, the FBI believes Apple could write this compromised firmware such that it would work only on the specific phone the FBI needs to access. Since firmware must be signed by Apple to work, it should not be possible for the FBI or anyone else to modify this compromised firmware to work on another phone. If that were the case, the FBI could just modify the current firmware, without Apple's help. So this would seem to address the concern that any phone could be compromised.
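
As a rough illustration of the per-device lock the order describes, here's a sketch; every name in it is hypothetical, and Apple's actual mechanism (had they built one) could differ:

```python
# Hypothetical SIF gate: the subject device's ID is baked into the signed
# image, so retargeting the image to another phone would break the signature.
EXPECTED_DEVICE_ID = "SUBJECT-DEVICE-ID"  # compiled in by Apple at build time

def read_hardware_device_id() -> str:
    """Stand-in for reading the phone's unique hardware identifier."""
    return "SOME-OTHER-PHONE"

if read_hardware_device_id() != EXPECTED_DEVICE_ID:
    raise SystemExit("refusing to run: not the subject device")
# ...only past this check would the image relax the passcode-attempt limits...
```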

The FBI is also willing to give Apple the phone so the firmware never even needs to be in the FBI's possession. That would seem to address the concern that the firmware would "fall into the wrong hands". Even if it did, it wouldn't be exploitable given the previous provision.

Given these provisions in the FBI's order that seem to specifically address the concerns in Apple's letter, we can only guess what Apple's reason for refusing the order may be.

It's possible there's some technical detail that Apple isn't telling us, and of which the FBI isn't aware. Or perhaps the FBI decided to ask for something they already knew to be impossible.

It's also possible that Apple thinks this is good PR. Apple certainly has a financial interest in making the iPhone appear "unhackable even by the FBI". Apple may also be trying to leverage anti-government sentiment.

One could argue there is an ethical reason for not circumventing security measures on a phone, and that privacy is more important than all other concerns, even if that phone has been lawfully seized by a federal bureau charged with invading people's privacy to investigate crime, and that phone belonged to now-dead terrorists, and the phone could contain intelligence that could prevent further terrorist attacks. I'll be waiting for Schlage to release a similar public letter the next time a law enforcement agency wants to enter someone's home.

One could also argue that relying on what is probably a 4-digit numeric key to encrypt data is hardly secure, and whether Apple helps or not, the FBI will get in that phone. They may just need to perform some more difficult physical attacks. Breaking a pathologically weak key by brute force is hardly a backdoor. It's the front door.

Maybe the courts will sort it out, maybe not. Probably we'll never know.

Phil Frost
  • 725
  • 4
  • 10
  • The FBI also knows it can later remove or change the custom device ID the firmware is locked to, and then use it on every other phone they get. – Andy Feb 18 '16 at 01:56
  • 7
    @Andy They could try, but then the signature on the firmware would be invalid and no phone would load it. Same as if they just tried to write their own firmware, or modify an existing firmware. – Phil Frost Feb 18 '16 at 01:57
  • 2
    Good point, I forgot about that aspect. – Andy Feb 18 '16 at 02:09
  • 3
    Once the brute-force-enabling iOS version exists in source code form, there is no way to guarantee it will not be copied, or the knowledge from the people who wrote it extracted. That is the danger, not the binary itself. – juandesant Feb 18 '16 at 06:25
  • 4
    @juandesant Yes, there is a way to guarantee it will not be copied. Firmware must be signed by Apple, and the people who control that key won't sign a firmware that contains a backdoor that would work on any device. If the security of the firmware was based on the obscurity of the source code, we are already screwed. There are many Apple engineers who already have the source code, and anyone could decompile it. Disabling the anti-brute-force functionality is probably a trivial matter of commenting a few lines of code. – Phil Frost Feb 18 '16 at 14:15
  • 1
    @juandesant - your argument is what Apple wants us to believe, but I'm not convinced that's true. Let's say Apple could write the software in 8 hours, keep the phone on premise and give the FBI the access they requested. After they are done (which could be a long time if the password is hard), they could delete the software, all of the code, and then they are 8 hours away from creating it again. (Or perhaps 4 hours away since they can probably produce it faster the second time.) My point is, the level of security risk to other users will not change whether they comply or not. – TTT Feb 18 '16 at 15:07
  • 1
    Regarding my last comment, I'm not saying I think Apple should comply, I'm just saying that I don't agree with the reason they are giving for not complying. – TTT Feb 18 '16 at 15:10
  • 1
    @TTT it is not a question of believing in Apple or not. It is a question of whether security can be diminished on demand _just for a single case_. If you accept that, then you have to accept it _in every case_. So you have to draw the line now, or they will have won the perception that security trumps privacy always… and if that is the case, you've allowed police to always be able to watch you, and no phone will ever be able to be secure. – juandesant Feb 18 '16 at 19:17
  • 1
    @PhilFrost I'm afraid that is wishful thinking. You are talking about a very valuable asset, and digital assets are very difficult to erase. – juandesant Feb 18 '16 at 19:22
  • 3
    @juandesant - I suppose my point wasn't clear: Apple currently is in full control. They already have the power to install a different OS on any phone they want to if they have it in their possession. Whether or not they ever do it doesn't change the fact that they can. Therefore, overall security will not be increased nor decreased if they choose to do it. Besides, I'd bet they already install different versions of OSes all the time for testing purposes, with and without security features enabled. They would need to in order to test if the security features negatively affect performance. – TTT Feb 18 '16 at 20:03
  • 4
    Since I love analogies, here's one: it's sort of like putting a combination padlock on your storage facility. If you forget the combination, you can hire a service to cut your lock off. When they cut your lock off, that doesn't decrease the security of all other padlocks. The security of all other padlocks has already been slightly decreased merely by the existence of the service. Whether or not the service chooses to actually cut off a lock doesn't lessen it any more. – TTT Feb 18 '16 at 20:14
  • @juandesant It doesn't matter if you erase it or not, *because it has no value on any device but the target*. Why? Because the code must be signed, and the code only works on a device with the target device ID. – Phil Frost Feb 18 '16 at 20:30
  • @TTT I am not saying that they can't. What I'm saying is that asking them to do that is an unbearable burden, and the All Writs Acts does not apply. The courts have the ultimate say in this, but resisting is a right Apple has. – juandesant Feb 20 '16 at 15:50
  • @TTT but the existence of the service makes dedicated safes more attractive, if you want to increase security. – juandesant Feb 20 '16 at 15:57
  • @juandesant - I'm not sure if it's an unbearable burden but I agree they have the right to resist. And I agree with you that dedicated safes become more attractive as a result. After learning the details about how the iPhone security works, if I were a paranoid iPhone owner, I'd likely change my passcode to be a long alphanumeric code. – TTT Feb 21 '16 at 14:56
10

Question Restatement:

  1. Could [Apple] actually accomplish this for an existing encrypted phone?

  2. If yes, then isn't simply knowing this is possible also undermining the security? It seems to me it would be just one step removed from the backdoor they are trying to keep closed.

Quick Answer:

Yes, Apple can easily modify a minimal version of their iOS (without the GUI) to have a brute force interface; so could anyone else, if Apple signs the firmware with their key, as the court order demands.

And absolutely: what the FBI is requesting isn't a "backdoor" but an interface to exploit a backdoor/vulnerability that is already there.

Any refusal on Apple's part doesn't remove the vulnerability. Here's why:

Definitions:

Passcode and AES 256 Key: To be very clear, the question (and this answer) is whether a hacking interface can be made for the FBI to brute force the user's passcode, not the underlying AES 256 key, which is protected by that user's passcode. The AES key, in turn, is used to access the encrypted data.

The All Writs Act of 1789: requires that four conditions be met, the first of which evidently is not met:

The absence of alternative remedies — the All Writs Act is only applicable when other judicial tools are not available.

This law can ONLY be invoked if the burden of proof is met: IF AND ONLY IF there are absolutely NO other judicial tools available:

... This law does not allow the FBI to complain that the methods it already has available are inefficient compared to another that it doesn't currently have legal access to.

Three of Many Possible Methods (Attack Trees):

The question is whether the first attack is possible, despite other remedies being available to the FBI (like the second):

  1. Hacking Apple, Highly Probable to Succeed: This Court Order mandates that Apple allow the FBI to hack Apple's own firmware, rather than the FBI hacking the user's encrypted data directly: exploiting an existing iPhone vulnerability, and also exploiting Apple as a U.S. legal entity.

  2. Hacking the User's Data, Viable but Incredibly Time Consuming: Alternatively, the Court Order could have simply directed Apple (or SanDisk) to clone the user's data for the FBI to copy (YouTube Video of iPhone 6 SanDisk SSD) and attack it from a separate system, avoiding the iOS data wipe feature (AES Hacking Links).

  3. TPM / ROM Exploits: Methods that bypass the OS and even AES altogether are possible, such as this incredible TPM Man in the Middle Attack:

    The Trusted Computing Group, which sets standards on TPM chips, called the attack "exceedingly difficult to replicate in a real-world environment." It added that the group has "never claimed that a physical attack - given enough time, specialised equipment, know-how and money - was impossible. No form of security can ever be held to that standard."

... And the FBI certainly has those resources (along with the NSA).

Answer 1 - Apple's Compliance Wouldn't Decrease Security - but Exposes an Already Present Vulnerability:

Apple's response acknowledges that there are already security vulnerabilities in their device: (1) themselves as an exploitable legal entity, and (2) their hardware design.

Apple's response is misleading; it is simply a marketing opportunity to offset public awareness of the security vulnerability already present in their design (and also in Microsoft's BitLocker, if configured to use the TPM).

Apple's compliance wouldn't reduce the security of their system; it would only exploit a vulnerability that is already there (one that isn't present in solutions that don't use TPM-like hardware, such as BitLocker without the TPM, LUKS, the old TrueCrypt, VeraCrypt, etc.).

The strength of the device's encryption isn't in its hardware -- and shouldn't be -- but in the AES encryption algorithm itself.

Security vulnerabilities increase proportionately as more complexity is added upon the underlying AES cryptographic system, (more points of attack).

Since Apple stores a secondary passkey in their ROM (salt, etc.), they have themselves created a security vulnerability far greater than relying on AES alone.

This vulnerability was intentionally added to iOS, and to similar TPM solutions, for convenience: to allow end users to enter simple pass-phrases.
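
A minimal sketch of the kind of entanglement being described, using PBKDF2 as a stand-in (Apple's actual derivation differs, and the UID value here is made up):

```python
# A short passcode entangled with a device-bound secret yields a 256-bit key,
# but its effective entropy is only that of the passcode once an attacker can
# drive the device's own key derivation.
import hashlib

DEVICE_UID = bytes.fromhex("00" * 32)  # hypothetical per-device hardware secret

def derive_data_key(passcode: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

key = derive_data_key("1234")  # a 4-digit PIN: only 10,000 possible keys
```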

Answer 2 - But the FBI Has Alternative Methods:

The FBI's response clearly indicates that they are already aware of alternate remedies, but hope to shift their burden of due process onto private citizens and corporations, whether or not that is ethical or constitutional.

The FBI's complaint isn't about the use of AES 256, or even about having access to the underlying data; they complain of the lack of an interface that would allow them to exploit an already existing vulnerability in the iPhone: a brute force attack against the user's passcode.

The FBI is in essence trying to hack Apple, either as a legal business entity (legal/social engineering) or by having Apple create a brute force interface to exploit the iOS security vulnerability.

Asking for an interface to hack the user's passcode reduces the level of complexity by many, many orders of magnitude.

The obvious alternate remedy is an AES brute force attack against the encrypted storage, using tools which already exist (AES Hacking Links; other potential attacks also exist, Stack Exchange Link).

This shows that the FBI's request is simply vexatious and political: they are already aware of alternate remedies to hack/access the data, and they know it.

The FBI is simply looking for a means that shifts their burden of due process, and the complexity, to private citizens and corporations.

If the request were in good faith, the order would have been limited to Section 6 of the Court Order (maintaining the integrity of the data), which would allow the FBI to pursue alternate remedies without fear of losing the data.

Summary:

The use of TPM-type hardware devices introduces cryptographic vulnerabilities on top of AES that are not normally present when users provide strong AES keys directly.

BitLocker (without a TPM), LUKS, and similar systems rely on users to manually enter a strong key. Those designs don't limit the user to simple pass-phrases, which could otherwise be used in turn to access the underlying encryption key.

The question of law is whether a judge will rule that hacking AES, or any of the other potential remedies, is a viable "alternative remedy" under the Act.

If the Court rules that the AES attack (or any of the others) is not a viable alternate remedy, then the order will probably stand.

And then Apple will simply re-engineer their devices to ensure this can't happen again, which they should do anyway.

elika kohen
  • 292
  • 1
  • 9
  • Comments are not for extended discussion; this conversation has been [moved to chat](http://chat.stackexchange.com/rooms/35954/discussion-on-answer-by-elika-kohen-apples-open-letter-they-cant-or-wont-ba). – Rory Alsop Feb 19 '16 at 07:24
-6

Apple wouldn't have to install the "cracked" version of iOS on all iPhones, so it wouldn't harm the security of all devices, although that is what the media reports seem to be implying.

The other option would be for Apple to extract the encrypted contents of the flash memory and then brute-force the decryption key in an emulated environment.

Micheal Johnson
  • 1,746
  • 1
  • 10
  • 14
  • 3
    However, once the precedent has been established that Apple **can** be legally compelled to create an encryption-crippled iOS (or code to the same effect) in *this* one case, that precedent can be used to compel the same thing in ***EVERY*** future case. This is at least as much a fight over what Apple can be legally compelled to do as what it is technically capable of doing. – HopelessN00b Feb 19 '16 at 04:26
  • 2
    @HopelessN00b - your statement may be correct, but that doesn't make Michael's answer incorrect. He is talking about what they *can* do whereas you are talking about why they shouldn't do it. (You both can be correct.) I'm just pointing this out because I don't want Michael to think his answer was DVed because of this. I suspect most of the DVs were instead because both of these points have already been made in other answers. – TTT Feb 19 '16 at 05:23