67

I’ve heard of a rule in information security that once a hacker has access to your physical machine, it’s all over. However, there seems to be a big exception to this rule: iPhones.

It was all over the news a while back that the CIA (or the FBI or something) could not access information on a terrorist’s phone for their counter-terrorism ops. They had to ask Apple to create an unlocking program that could unlock the phone for them.

My question is, why are iPhones so hard to hack?

André Borie
Melkor
    As an aside, somebody did manage to break into the iPhone. –  Apr 19 '17 at 08:36
    Compared to a conventional computer, all the parts that handle crypto are on the same chip, so it's a lot harder (pretty much impossible without destroying the chip) to tap into the data lines to read memory or capture encryption keys. – André Borie Apr 19 '17 at 10:50
    As I recall, the main thing making the iPhone hard to hack was a hardware mechanism that wiped the memory after 10 unsuccessful password attempts. That made it difficult to brute force the password and unlock the encryption, which would be trivial otherwise (short passwords). The FBI wanted Apple's help in bypassing it, but even that could only be done with physical access to the device. And they did eventually succeed without Apple's help. – Seth R Apr 19 '17 at 14:28
    @SethR that's the problem with having everything on a single chip. On a standard computer you could get access to the data lines between the RAM and CPU to edit memory at will and reset the retry counter, but good luck doing that when all of it is in the same silicon. – André Borie Apr 19 '17 at 14:41
    Just because the FBI requests Apple's assistance in unlocking an iPhone doesn't mean an iPhone is harder to crack than anything else (they could be incompetent), nor does it even mean the FBI can't crack it (there could be many reasons to ask anyway, including a desire to hide their competence). – Todd Wilcox Apr 19 '17 at 21:41
  • Because Apple deliberately designed it to be that way – user253751 Apr 19 '17 at 22:31
  • iPhones probably use a cryptographic technique where the data cannot be decrypted without the passphrase. – Nightwolf Apr 20 '17 at 09:21
  • @SethR: The hardware mechanism didn't include the anti-brute force logic, just signature checking on the boot software, which prevented removal of the anti-brute force function (although it was in software) unless Apple counter-signed the modified software. – Ben Voigt Apr 22 '17 at 22:09

5 Answers

82

I don't think you're interpreting the rule you've heard the right way. If an attacker has physical access to an encrypted but switched-off device, he cannot simply break the encryption, provided the encryption was done properly. This is true for an iPhone as much as it is for a fully encrypted notebook or an encrypted Android phone.

The situation is different if the device is not switched off, i.e. the system is running and the operating system has access to the encrypted data because the encryption key was entered at startup. In this case the attacker might try to use an exploit to make the system hand him the decrypted data. Such exploits are actually more common on Android, mainly because that platform has many vendors and a broad range of cheap and expensive devices, versus only a few models and a tightly controlled environment for iPhones. But such exploits exist for iPhones too.

With physical access it would also be possible to manipulate the device in a stealthy way, in the hope that the owner does not realize the device was manipulated and enters the passphrase that protects it. Such manipulations might be software- or hardware-based keyloggers, or perhaps a transparent overlay over the touchscreen which captures the input, or similar modifications. This can be done to both switched-off and switched-on devices, but a successful attack requires that the owner is unaware of the changes and thus enters the secret data into the device. Such an attack is often called an evil maid attack, since it could for example be carried out by the maid if one leaves the device in a hotel room.
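To make the switched-off case concrete, here is a minimal sketch (my own illustration, not Apple's actual scheme; the salt, passphrases and iteration count are arbitrary) of why an attacker holding the raw storage of a properly encrypted device gets nothing: the data key is derived from the user's secret with a deliberately slow KDF, so without the passphrase there is no key to recover.

```python
import hashlib

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    # PBKDF2 makes every single guess computationally expensive.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = b"per-device-salt"  # hypothetical, for the demo only
key_right = derive_key("correct horse", salt)
key_wrong = derive_key("correct h0rse", salt)

# Without the exact passphrase, the derived key (and thus the data) is useless.
assert key_right != key_wrong
assert len(key_right) == 32  # 256 bits of key material
```

The same idea underlies full-disk encryption on laptops and Android devices alike; the iPhone's distinguishing feature is the hardware guarding that derivation.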

Steffen Ullrich
    In the case the OP is referring to, the FBI was unable to access the iPhone without Apple because the firmware was designed such that a brute-force attack on a 4-digit PIN is not possible - the process is slow and the phone would be wiped after several failed attempts. Is there such a protection in Android phones and, if there is, how easy is it to bypass compared to an iPhone? Said iPhone was eventually accessed by a security firm exploiting a flaw that was patched in more recent versions, but that is not relevant to the question. – Marko Vodopija Apr 19 '17 at 08:53
    @MarkoVodopija: see [What if the FBI tried to crack an Android phone? We attacked one to find out](http://theconversation.com/what-if-the-fbi-tried-to-crack-an-android-phone-we-attacked-one-to-find-out-56556) for a more detailed analysis of this specific case, comparing iOS to Android 5.1.1 on a Nexus 4. But, different devices and vendors might give different results and there might also be changes in more recent versions of Android. But I would consider the discussion of more details about this specific case off-topic on this question because the OP essentially asked a more general question. – Steffen Ullrich Apr 19 '17 at 09:06
  • Thanks for the link. I think you should include it in your answer, it is a very good read and follows the spirit of the question quite nicely. – Marko Vodopija Apr 19 '17 at 09:29
    I believe it would still be possible to de-lid the chip and directly read the encryption key, or otherwise modify the hardware, but this is a high-risk option. They wanted to exhaust all other options before resorting to this. – OrangeDog Apr 19 '17 at 10:59
    @OrangeDog I'm pretty sure the chip is designed to wipe its keys if you delid it (I can't seem to find a link confirming it, but I'm pretty sure that's why the justice system was leaning on Apple so heavily). Apple purposefully built a system that was very hard to bypass physically, to the point where anything they tried to do could have resulted in a total wipe of the phone or its keys. – phyrfox Apr 19 '17 at 13:07
    @phyrfox It's not specifically designed that way, it's just the case that any attempt to de-lid any chip will risk destroying it. They were leaning on Apple heavily because that's the cheapest and most guaranteed to work option (get an official firmware update that removes the brute-force protection). – OrangeDog Apr 19 '17 at 13:10
    @phyrfox: While Apple is trying hard there is no perfect security, even at the hardware level. While the chips might be tamper resistant there might be other attacks possible, like described in [iPhone passcode bypassed with NAND mirroring attack](https://arstechnica.co.uk/security/2016/09/iphone-5c-nand-mirroring-passcode-attack/). – Steffen Ullrich Apr 19 '17 at 13:12
    @SteffenUllrich Yeah, I didn't mean to imply that it WOULD result in key loss, just that it COULD wipe itself if certain typical actions were taken to compromise it (tamper-response). But, apparently I was wrong anyways. That's an interesting link, thanks for sharing! – phyrfox Apr 19 '17 at 13:15
    The entropy for a convenient PIN or fingerprint on a mobile phone is too low to protect against offline attacks without the use of carefully designed hardware such as the secure enclave. Laptops generally have no such equivalent (although they can use a less convenient passcode) and I don't believe Android requires a security enclave equivalent yet... even so, quality of the implementation varies: http://bgr.com/2016/07/04/android-full-disk-encryption-hacked/ Apple's implementation has proven much more durable. – mgjk Apr 19 '17 at 14:09
    And it should be noted that the authorities claim to have succeeded in the end anyway. – Lightness Races in Orbit Apr 19 '17 at 16:30
    The FBI also cannot break into a desktop/laptop that is powered off using TrueCrypt. https://yro.slashdot.org/story/10/06/26/1825204/fbi-failed-to-break-encryption-of-hard-drives – Chloe Apr 20 '17 at 15:34
  • An unpowered chip has no means of actively reacting to delidding unless you take advantage of photoelectric effects (either solar power the wiping system - huge waste of expensive-process silicon! ... or short out storage cells with a photocell ... but does such a design even exist?) or keep a substantial capacitive or chemical energy store inseparable from the chip (process cost...) or rely on the chip never being unpowered lest it wipe the key (done that way in professional crypto, probably a reliability concern in a consumer device!).... – rackandboneman Apr 20 '17 at 22:45
  • @rackandboneman: There is no need to actively react to detect tampering. The chip can rely on specific physical properties which get destroyed when tampering with the system. Have a look at [physically unclonable functions (PUF)](https://en.wikipedia.org/wiki/Physical_unclonable_function) to get the idea. – Steffen Ullrich Apr 21 '17 at 04:39
25

The rule you are referring to goes back to Scott Culp and is from this essay he wrote in 2000:

https://technet.microsoft.com/library/cc722487.aspx

In 2000, there was no such thing as an iPhone. Moreover, the "10 Immutable Laws of Security" are meant as guidelines, aphoristic memory jogs, and (despite the name) not really as laws. They are also outlined in more detail in the essay, so go and read the part about the hardware law:

https://technet.microsoft.com/library/cc722487.aspx#EIAA

excerpt:

If you travel with a laptop, it's absolutely critical that you protect it. The same features that make laptops great to travel with – small size, light weight, and so forth—also make them easy to steal. There are a variety of locks and alarms available for laptops, and some models let you remove the hard drive and carry it with you. You also can use features like the Encrypting File System in Microsoft Windows® 2000 to mitigate the damage if someone succeeded in stealing the computer. But the only way you can know with 100% certainty that your data is safe and the hardware hasn't been tampered with is to keep the laptop on your person at all times while traveling.

Laptops being the closest thing to iPhones in 2000, he is talking about basically the same things that iPhones actually do (encrypt data, have anti-tampering protocols, etc.), but his conclusion is still correct today: if the president's iPhone with the nuclear launch codes on it went missing, I would very strongly urge changing the launch codes immediately. The attacker may or may not be able to break into the iPhone, but once he has physical access, you cannot be 100% sure.

So, why?

Your main question was why iPhones are hard to break and there are two answers. The technical answer is that they use strong encryption with hardware components (the "secure enclave") to encrypt the device and prevent unauthorized access. The non-technical answer is that they have both the resources and the interest to throw enough money at the problem and do it right. Unlike Google or Facebook, they are not in the business of selling user data, so they do not have to leave backdoors or access ways for their own purposes. A good security reputation is worth considerable revenue.

The simple fact that Apple controls both the hardware and the software also makes it considerably easier to implement proper security.

psmears
Tom
    `so they do not have to leave backdoors or access ways for their own purposes` - the fact that you can call apple to disable your iphone when it gets stolen proves the contrary. – grochmal Apr 19 '17 at 13:25
    If you are refering to the "Find my iPhone" functionality - that is an advertised feature of the device, so while it might open up security risks, it is not a backdoor and more importantly, is not for the benefit of Apple. – Tom Apr 19 '17 at 14:18
    But that still counts as an "access way for their own purposes", right? Nothing against the feature, it is just a sentence that I believe is not quite correct (and masks a technicality with some marketing). – grochmal Apr 20 '17 at 02:05
    Depends how you define it... Considering you need to go online and log in to your icloud account with your password, I'd say thats more of a secured front door. I'm not aware of a mechanism that would allow Apple to simply lock your phone via phonecall. – Vitalydotn Apr 20 '17 at 20:07
    If an iOS device is managed by an MDM solution, then the MDM can issue a command to the device that clears the passcode remotely. Again this isn't a "secret back door" as such, since the device owner is aware that their device is being managed. What I do find interesting about the case in the question is that the phone was owned by the employer, a government department, not the individual. If the employer had followed best practice and used an MDM to manage the device they could have removed the passcode easily. – Paulw11 Apr 20 '17 at 20:56
  • A c. 2000 laptop would have had a removable hard drive. If someone got their hands on the laptop for long enough they could remove and clone the drive for an offline attack on the encryption. Of course a strong key and flawless encryption help, but they can't be assumed. Installing a keylogger to capture the encryption key would also be possible despite the lack of space, but would probably require getting the laptop back again (this was 2000; people wouldn't necessarily be online while travelling - 802.11b was new, so you'd be looking for a wired connection). – Chris H Apr 21 '17 at 10:43
  • @ChrisH: laptops c. 2000 wouldn't have Secure Boot. It would not be too difficult to install a key logger on the necessarily unencrypted and unsigned boot program. As soon as the user types in their decryption password the boot loader could save the password and inject a rootkit to the now-decrypted main system. The root kit could send out the decryption password later when there's any internet connection, and then wipe itself and restore the boot loader, leaving no trace or it can keep spying on the user. – Lie Ryan Apr 21 '17 at 14:00
  • @LieRyan yes, that's one of many options. I was thinking of keylogging in hardware as I know machines from not much later had some degree of protection for the BIOS. – Chris H Apr 21 '17 at 15:28
5

This is a really broad question. But the reason the information couldn't be accessed is that the iPhone's data was encrypted, and if encryption is done properly it is very hard to break. That does not necessarily mean the iPhone itself is hard to hack.
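A back-of-the-envelope comparison shows why "done properly" matters more than the lock screen itself (the guessing rate below is a hypothetical offline-attack figure, not a measured one): the PIN's own search space is tiny, so it must be protected by hardware rather than by its entropy.

```python
# Search space of a 4-digit PIN versus a random 256-bit key.
pin_space = 10 ** 4          # 0000..9999
key_space = 2 ** 256         # a properly generated encryption key

guesses_per_second = 1_000_000_000  # assumed offline attack rate
pin_seconds = pin_space / guesses_per_second

print(f"4-digit PIN space: {pin_space} guesses "
      f"(exhausted in {pin_seconds:.6f}s offline)")
print(f"256-bit key space: {key_space:.3e} guesses")
```

Offline, the PIN falls instantly while the key is untouchable for any realistic adversary; the whole design question is keeping the attack from ever being offline.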

Black Magic
    Is there not any difference in the hardware secure element? I thought that at least some Android phones didn't have a secure element at all, so you could copy all the storage and extract the keys, which is not possible with a secure element that doesn't let you copy its contents. – cloudfeet Apr 19 '17 at 10:28
    *Doesn't let you copy its contents without high risk of destroying it instead. – OrangeDog Apr 19 '17 at 11:00
  • what do you mean by the "iPhone itself is [not] hard to hack"? The encryption implementation is implemented in the phone. – Keith Loughnane Apr 19 '17 at 12:44
  • @KeithLoughnane I would guess that -- once past the PIN -- it's possible to jailbreak the phone fairly easily. – TripeHound Apr 19 '17 at 13:03
  • @Keith Loughnane it is mostly about making the key available to the device all the time without making it available to anything else! – rackandboneman Apr 20 '17 at 22:47
  • @rackandboneman I understand that. What I don't understand is why that's not considered part of the "iPhone itself". It seems weird that some parts of the security implementation are part of the "iPhone itself" and other parts are not. Surely it's all part of one consistent security plan. "Does not necessarily mean the iPhone itself is hard to hack" is a statement that implies a work around and I was asking for clarity. – Keith Loughnane Apr 21 '17 at 07:41
2

One of the most important reasons is that the iPhone's data is correctly encrypted. Correctly means that the right algorithm was used and that it was correctly implemented.

In particular, the key used to encrypt the data is long enough to be, in practical terms, unbreakable.

On the other hand, the user is authenticated with a 4-digit PIN - how can this not be brute-forced?

The reason is that this PIN is used to recover the encryption key above, which is then used to decrypt the data. This recovery process must therefore be extremely well protected. Apple does this with a dedicated security chip (the Secure Enclave, conceptually similar to a TPM) which receives a PIN request, checks the PIN and releases the key only if the PIN is correct. It also denies access when queries become too numerous, increasing the response time and ultimately (if so configured) wiping the device. This is detailed in the iOS Security Guide.

This allows for a ridiculously short PIN to still be an appropriate indirect encryption key.
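The throttle-and-wipe policy described above can be sketched as a toy model (my own sketch, not Apple's code; the class name, attempt limit and delay schedule are all illustrative): the "enclave" releases the key only for the correct PIN, slows down after each failure, and erases the key once the attempt limit is reached.

```python
class ToyEnclave:
    """Toy model of hardware-enforced PIN throttling and wipe-on-failure."""

    def __init__(self, pin: str, key: bytes, max_attempts: int = 10):
        self._pin = pin
        self._key = key            # the real encryption key, held in hardware
        self._failures = 0
        self._max = max_attempts

    def try_pin(self, guess: str) -> bytes:
        if self._key is None:
            raise RuntimeError("device wiped")
        if guess == self._pin:
            self._failures = 0
            return self._key
        self._failures += 1
        if self._failures >= self._max:
            self._key = None       # wipe: the key is gone for good
            raise RuntimeError("device wiped")
        # Escalating delay: 2^failures seconds (a real device would stall;
        # here we only report the penalty).
        raise PermissionError(f"wrong PIN, retry in {2 ** self._failures}s")
```

After the attempt limit even the correct PIN is useless, because the key it would have unwrapped no longer exists - which is exactly why a 4-digit PIN suffices in practice despite its tiny search space.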

Other concerns have been addressed in other questions, particularly the accepted answer, which goes through other attack vectors.

WoJ
-4

The premise of the question is wrong. It assumes that the FBI actually failed to crack an iPhone, but there is no evidence for that (aside from statements by US officials).

More importantly, if the FBI, the CIA or your uncle could get access to all the data on an iPhone with just Apple's consent, terrorists could have done the same.

Take the right Apple employees hostage, or hack into their computers, and you get access to any iPhone in the world.

That does not sound "hard to crack" at all.


EDIT:

Some people have expressed the opinion that this answer lacks sufficient technical detail (not sure why; the question as asked does not mention "technical details"). To illustrate the point above, let's add some technical details.

Suppose that you have encrypted a USB flash drive with some well-known encryption algorithm. By doing so you ensure (assuming the algorithm is secure) that nobody can access the information within without your personal consent (we are ignoring the non-technical details, remember?).

On the other hand, if you encrypt your iPhone with the same encryption algorithm, the people who can consent to decrypting it are you and an unspecified number of strangers (Apple employees and anyone they share the key with). This makes for a weaker encryption scheme, since the "key" is forcibly shared among an unspecified number of people and any of them can obtain your information without the consent of the others.

As such, based on the technical consideration above, an iPhone is (on average) less secure than an encrypted USB thumb stick.

user1643723
    This answer pretty much just distils down to, 'social engineering', which doesn't really answer the question at all. –  Apr 21 '17 at 09:48
  • Question was about the technical details. – schroeder Apr 21 '17 at 10:24
  • @schroeder "Social engineering" is when the owner of encrypted information is tricked or threatened into revealing the decryption key. The existence of a backdoor that allows a third party (Apple) to decrypt information certainly sounds like a significant "technical detail", somehow overlooked by the question's author. – user1643723 Apr 21 '17 at 12:30
    I think you are still missing the point. The question is about unauthorised access to the device (hack). Asking someone else to access the device for you is not a "hack". The question is about a threat actor's (FBI) ability to directly access the hardware *without* asking Apple to do it for them. This is specifically mentioned in the question. – schroeder Apr 21 '17 at 14:58
  • @schroeder There is no reason to believe that the FBI didn't decrypt the iPhone. It is perfectly sensible to simultaneously ask Apple for help and initiate decryption procedures. Also, I'd like to remind you that Apple's devices are officially sold as a "whole", an inseparable combination of hardware and Apple's "know-how". As such I find it most natural to treat Apple's backdoors as an inseparable part of any security scheme implemented in the iPhone. – user1643723 Apr 21 '17 at 15:09
    You are speculating and creating boogiemen to make your point. Unfortunately, you are still missing the question. – schroeder Apr 21 '17 at 15:15