27

When working with Internet of Things devices, is it recommended to obfuscate or encrypt firmware images pushed to clients? This is to make reverse engineering harder.

(They should be signed, of course.)

schroeder
VC_work
    what does 'reverse engineering' have to do with security? – schroeder Sep 01 '17 at 13:30
  • 1
    let's make this simple: encryption is a tool to mitigate risks - if the risks you have in IoT firmware can be mitigated by encryption, then yes, encryption makes sense – schroeder Sep 01 '17 at 13:34
  • Firmware is software that is embedded in a piece of hardware. If you transmit it over a public network, this transfer should be encrypted. – LeonanCarvalho Sep 01 '17 at 13:40
  • 33
    @LeonanCarvalho - there is no need to encrypt firmware for transmission, only to sign and verify it (which is not encryption.) It doesn't matter if someone knows what the firmware contains, it only matters if someone can tamper with the firmware that gets installed. – AJ Henderson Sep 01 '17 at 14:53
  • 4
    Does the firmware need to be decrypted to run on the device? If so, do customers get the keys in a way that malicious users (who can also be customers) can not? – Andrew Sep 01 '17 at 18:01
  • 22
    "We have made some improvements to the firmware of your internet-connected infrared bedroom camera. The new firmware is encrypted, but we promise that its only intent is to do something harmless" – Hagen von Eitzen Sep 01 '17 at 20:19
  • It's interesting finding this question as I was just discussing this in class recently. Here is a source I used to make my stance in the DQ (Discussion Question): https://www.symantec.com/content/dam/symantec/docs/white-papers/iot-security-reference-architecture-en.pdf – David Sechrest Sep 02 '17 at 01:06
  • 1
    @schroeder, a couple of things: (1) the design may itself need securing, e.g. for IP reasons; (2) it could fall under the "discovery" security concern, where knowledge of the design could make actual exploits possible. – Paul Draper Sep 03 '17 at 21:48

8 Answers

84

No. You should not rely upon the obscurity of your firmware to hide security vulnerabilities: those vulnerabilities exist regardless of whether or not you encrypt or obfuscate the image.

I have a radical suggestion: do the exact opposite. Make your firmware binaries publicly available and downloadable, freely accessible to anyone who wants them. Add a page on your site with details on how to contact you about security issues. Engage with the security community to improve the security of your product.
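
As a concrete sketch of such a contact page (my illustration, not something the answer specifies; the addresses below are placeholders): a common convention is to serve a `security.txt` file at `/.well-known/security.txt`, standardised in RFC 9116:

```text
Contact: mailto:security@example.com
Policy: https://example.com/security-policy
Preferred-Languages: en
Expires: 2026-12-31T23:59:59Z
```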

Polynomial
  • 28
    Even better - provide a (large) prize for security vulnerabilities, with more severe vulnerabilities (measured in _maximum_ damage they can cause) earning a larger prize. This will hopefully convince some wannabe crackers to disclose the details (certain prize) as opposed to trying to exploit them (uncertain, _possibly_ greater but probably smaller prize, plus risk of being caught). – wizzwizz4 Sep 01 '17 at 20:53
  • 10
    I would even encourage to make the firmware [free software](https://en.wikipedia.org/wiki/Free_software) or [open source](https://opensource.org/). – Basile Starynkevitch Sep 02 '17 at 07:06
  • 6
    @wizzwizz4 Many businesses do not have the capital to operate a fully-fledged bug bounty program, particularly SMEs and self-funded startups. In general I've seen these businesses offer alternative recompense, such as free or heavily discounted products, t-shirts, swag, etc. depending on the nature of the business and brand. Bug bounties do not necessarily need to be formal and costly. – Polynomial Sep 02 '17 at 10:53
  • 8
    @Polynomial Many businesses don't have the capital to _not_ operate a bug bounty program. If a large breach is found, surely it's worth that $x for it not to be found when your clients are suing you for damages caused by a breach. – wizzwizz4 Sep 02 '17 at 11:33
  • 7
    @wizzwizz4 I doubt that the lack of a bug bounty program would ever be the confounding factor in a breach. Bounties are great, but they're an optional layer in almost any business' defence budget, perhaps with the exception of highly public organisations. If the goal is to avoid a breach, the cost and time involved with running a BBP can be better invested in more direct defence measures such as staff training and improving blue team capabilities. BBPs are great if you've already got everything else covered and have spare budget, or if the perception of security markets well in your vertical. – Polynomial Sep 02 '17 at 12:20
  • 1
    @BasileStarynkevitch's suggestion isn't as far-out as it might seem. Presumably, the firmware is of little non-academic use to someone who doesn't have the device in question. The company is in the business of selling physical gadgets and maybe services, but not really software. Under such circumstances, releasing the software for everyone to look at might very well be at worst neutral, and possibly give a PR boost. Bonus points if there's actually a way for an ordinary (but technically competent) user to load modified firmware onto the device. – user Sep 02 '17 at 23:02
  • 1
    Presumably the firmware is helpful to their competitors. – user253751 Sep 03 '17 at 23:01
  • Also this is best for the world overall, but not necessarily for the company publishing the firmware. (Either way the vulnerabilities will be found and the company will suffer; do they want to suffer sooner because they were easier to find?) – user253751 Sep 03 '17 at 23:02
  • @immibis If a vulnerability is discovered separately and not responsibly disclosed to them, they have the PR high ground. The alternative is delaying the inevitable and making it a worse outcome. – Polynomial Sep 03 '17 at 23:04
  • 1
    @Polynomial Perhaps among security-aware people. The general public still sees "XYZ product hacked". – user253751 Sep 03 '17 at 23:23
  • Security researchers will not look at a local IoT vendor making some crappy firmware. Why do people audit firmware? Fame, fortune? What kind of visibility will you gain by finding a bug in poorly written firmware by a small local company? How much bounty should you give? – Silver Sep 07 '17 at 07:59
  • @Silver I am a security researcher and I do look at IoT vendor firmware all the time. I don't expect a bounty. I do it for fun. – Polynomial Sep 07 '17 at 13:08
11

It is doubtful that it would be beneficial. Going open source is by far a better option than staying closed source. It might seem silly and even controversial at first, but opening up a project to the public has plenty of benefits.

While there are people with malicious intent, there are also people who want to help and make the internet a better place. Open source allows more eyes to look over the project, not only to weigh in on potential features, bugs, and issues, but also to increase the security and stability of the "thing".

And, to agree with Polynomial's answer, engaging with a community and building a base of people who help you out with security will increase your client base by a significant margin.

Josh Ross
  • 8
    Open source may be a step too far for a commercial product, and the "many eyes" principle is generally flawed (see: the state of OpenSSL before Heartbleed). That said, if the project uses any GPL licensed code without special agreement from the author then the full source must be made available anyway. – Polynomial Sep 01 '17 at 14:30
  • 1
    @Polynomial If the project uses any GPL licensed code then *the parts that directly touch the GPL code* must be made available as source code. Not all of it. Mere aggregation still applies. – user Sep 02 '17 at 23:05
  • I can see how that might be an issue with fully Open Source. Also, it usually comes down to semantics when it comes to Open Sourcing on GPL code. I'll look into the OpenSSL matter as well, thanks for that! – Josh Ross Sep 04 '17 at 05:24
  • I really think a good answer should address the issue of "if my software has to be proprietary, is it worth encrypting/obfuscating". There may be plenty of reasons to keep software proprietary that have nothing to do with security (contracts with customers, red tape by management, valuable algorithms). I also believe that publishing your source code isn't a good way to fix up an insecure software development process and shouldn't be overemphasized by dedicating the entire answer to it. – Cody P Sep 06 '17 at 21:36
6

Well-designed firmware should rely on the strength of its key rather than on the attacker's ignorance of the system's design. This follows the foundational security engineering principle known as Kerckhoffs's axiom:

An information system should be secure even if everything about the system, except the system's key, is public knowledge.

The American mathematician Claude Shannon recommended starting from the assumption that "the enemy knows the system", i.e., "one ought to design systems under the assumption that the enemy will immediately gain full familiarity with them".

You may be interested to know that prior to the late nineteenth century, security engineers often advocated obscurity and secrecy as valid means of securing information. However, such secrecy-first approaches are antithetical to several software engineering design principles, especially modularity.

Mavaddat Javid
6

Some people argue that open-source code can be audited by many and therefore contains few bugs. On the other hand, attackers have the same easy access and look for those same vulnerabilities. There is definitely a tradeoff here which is not correctly described in the previous answers.

Others mention that code should be inherently secure and therefore requires no obfuscation/encryption/hiding. It is true that a system should be designed to be secure even if you know how it works. That doesn't mean the design is always sound AND the implementation flawless. In practice, code is never 100% secure. (Take a look at web app security: why would we need security headers to protect against XSS and CSRF attacks if there were no vulnerabilities in the web application?) Additional security measures can be taken by trying to hide the code through encryption and obfuscation. In the mobile world, reverse engineering is even considered a serious risk: OWASP Mobile Top 10 risks.

As no system is 100% secure, we can only try to increase the effort required to break it.

So now, the tradeoff: open-source/easily available code vs. encrypted and obfuscated code. Allowing public review of your source code can help reduce the number of bugs. However, if you are a small company whose code the public has little incentive to audit for free, there is little benefit in publishing it, as nobody will look at it with good intentions, yet it becomes much easier for attackers to discover vulnerabilities. (We are not talking about the newest iOS version, which every security researcher is trying to crack.)

In this case we aren't even talking about open-sourcing the code for public review; we are talking about encrypting the firmware in transit. Security researchers are not likely to buy your device just to obtain the code, discover vulnerabilities, and publish them. Therefore the chance that the good guys find the vulnerabilities rather than the bad guys decreases.

Silver
2

Are you sure you are not confusing two cryptographic methods?

You should certainly sign your firmware updates for security reasons. This allows the device to verify they are from you.

Encrypting them adds a little bit of obscurity, and that's it. Since the decrypting device is not under your control, sooner or later someone will hack it, extract the decryption key, and publish it on the Internet.
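
A minimal sketch of the signing half, in Python with the `cryptography` package (illustrative only; key generation, storage, and the device-side hook are simplified assumptions, not anyone's actual pipeline):

```python
# Build machine: sign the image with a private key that stays offline.
# Device: verify with the public key baked in at manufacture, then flash.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # generated once, kept offline
public_key = private_key.public_key()       # burned into every device

firmware = b"...firmware image bytes..."    # placeholder for the real image
signature = private_key.sign(firmware)      # shipped alongside the image

# Device side: refuse to flash anything that does not verify.
try:
    public_key.verify(signature, firmware)
    print("signature OK, safe to flash")
except InvalidSignature:
    print("rejecting tampered or fake firmware")
```

Note that nothing here hides the image's contents: verification proves origin and integrity, which is the whole point of the sign-vs-encrypt distinction.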

Tom
  • You could always make it illegal to distribute the decryption key! Making numbers illegal definitely works... – wizzwizz4 Sep 02 '17 at 11:51
0

You need to ask yourself this question: is someone clever enough and interested enough to download your firmware and start looking for vulnerabilities going to be deterred by an additional encryption layer whose key must ultimately be revealed to the device?

This is just another hoop to jump through, no different from figuring out what disk format your firmware image is in. It isn't even a particularly difficult hoop. Keep in mind that FAR more sophisticated methods of what amounts to DRM have all been broken.

Odds are that someone determined enough to hack your internet-connected coffee maker/dishwasher isn't going to be deterred by an additional encryption layer.
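
To make the "hoop" concrete, here is a hedged sketch (Python, `cryptography` package; the key handling is a deliberately simplified assumption) of what firmware encryption typically boils down to. The device can only decrypt because the key ships with it, so anyone who can dump the flash recovers the same key:

```python
# Symmetric firmware encryption: the very key that lets the device decrypt
# the update must be stored on the device (bootloader, OTP, flash), which is
# exactly what an attacker holding the hardware can dump.
from os import urandom
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

DEVICE_KEY = AESGCM.generate_key(bit_length=256)  # in reality: baked into the device

def encrypt_firmware(image: bytes) -> bytes:
    nonce = urandom(12)
    return nonce + AESGCM(DEVICE_KEY).encrypt(nonce, image, None)

def decrypt_firmware(blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(DEVICE_KEY).decrypt(nonce, ciphertext, None)

assert decrypt_firmware(encrypt_firmware(b"firmware image")) == b"firmware image"
```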

Steve Sether
0

In the sense of "will encrypting the firmware prevent the detection of vulnerabilities in my code?", other answers have addressed the core of it: although it may discourage some attackers, security through obscurity leads to a false feeling of invulnerability that is counterproductive.

However, I'd like to add an extra bit based on my experience. I have seen firmware packages that are encrypted, but the motivation for that is only to protect a company's intellectual property, rather than to serve as a control against attackers.

Of course, hackers often find ways around this "control", but that's a different story.

-4

Several people said you shouldn't rely on obfuscating the code by encrypting it, but should just make it secure. Quite recently the encryption of some rather critical software in Apple's iPhones was cracked (meaning that hackers can now see the actual code, nothing more). It stopped anyone from examining the code for three years, so the time from release to first crack was increased by three years. That seems like some very successful obfuscation.

And encryption goes very well together with code signing, so when your device is given new firmware, it can reject any fake firmware. That part isn't just recommended; it is absolutely essential.

gnasher729
  • 2
    Encryption and signing are orthogonal. Both may have value, but they solve different problems and do so separately. – user Sep 02 '17 at 23:10
  • Both are very hard if you don't use some crypto library properly, and easy if you do. If you do one, doing the other adds very little effort. And both solve the same problem: Helping to keep the hardware safe. – gnasher729 Sep 02 '17 at 23:43
  • What you are missing is that the encryption in the anecdote you mention *is only even relevant* because the code is encrypted everywhere but inside the trusted execution unit. For an ordinary processor, even if the update is encrypted in flight or in storage, it will have to be decrypted *by something stored on the device* to execute. It's fairly hard to make it so that someone in possession of a copy of the hardware can't get at the decrypted code, and only in the unusual situations where that is done effectively does encrypting the update in flight matter. – Chris Stratton Sep 03 '17 at 19:28
  • @gnasher729 Actually, the longer it takes until the first hack, the worse. Three years later, it's much harder to fix the bug -- the manufacturer may not even still be around, and who knows if they still have the tooling. And three years later, the device might be forgotten about and unmaintained. Three years later, there will be more devices in the field. The sooner flaws are found, the sooner (and the more likely it is) that they're fixed. (And how do you know that one really bad guy didn't know about the flaw and had three extra years to exploit it?) – David Schwartz Sep 04 '17 at 04:23