
I'm working on a product that includes third-party drivers for some of the product's hardware. Some of the drivers are not signed; others are signed only with SHA-1 certificates.

Given that getting new, SHA-2-signed (or dual-signed) drivers from all the third parties is going to be very difficult, if not impossible, should I sign the drivers I have with my company's code-signing certificate?

What are the implications, both technical and legal, of signing someone else's code? Does it imply more than just "we've tested this version of the driver and trust it"?
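(For reference, the mechanical side is not the problem: we already use our code-signing certificate on drivers we've written for our own hardware, and re-signing a third-party `.sys` file would be essentially the same `signtool` invocation, roughly as sketched below, wrapped in Python just for repeatability. The certificate path, password handling, timestamp URL, and driver path are all placeholders, not our real values.)

```python
import subprocess

# All of these values are placeholders -- substitute your own certificate,
# password handling, timestamp service, and driver path.
CERT_PFX = r"C:\certs\company-codesign.pfx"
PFX_PASSWORD = "..."                             # don't hard-code this in real use
TIMESTAMP_URL = "http://timestamp.digicert.com"  # example timestamp service; use your CA's
DRIVER = r"C:\drivers\vendor-driver.sys"

def signtool(*args: str) -> None:
    """Run signtool.exe (from the Windows SDK) and raise if it exits non-zero."""
    subprocess.run(["signtool", *args], check=True)

# 1. Primary SHA-1 signature, for older Windows versions that cannot verify SHA-2.
signtool("sign", "/f", CERT_PFX, "/p", PFX_PASSWORD,
         "/fd", "sha1", "/t", TIMESTAMP_URL, DRIVER)

# 2. Appended SHA-256 signature; /as adds it alongside the existing one.
signtool("sign", "/f", CERT_PFX, "/p", PFX_PASSWORD,
         "/as", "/fd", "sha256", "/tr", TIMESTAMP_URL, "/td", "sha256", DRIVER)
```

So the question is really whether we should run that step over code we didn't write, and what doing so commits us to.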

Grhm
  • Related: [Internal code-signing certificate on Windows](https://security.stackexchange.com/q/51659/71460) – SEJPM Apr 04 '16 at 20:10
  • @SEJPM How is that related? I'm talking about using an external, trusted-CA-provided cert to sign drivers that are part of a product we sell. It's not internal, and we use it successfully to sign drivers we've written for our custom hardware. – Grhm Apr 04 '16 at 20:16
  • The last three paragraphs of the answer given in that question seem relevant: (a) Windows may break, and (b) you may get your certificate revoked if any signed file leaves your network. – SEJPM Apr 04 '16 at 20:22

1 Answer


When you sign code, it means that you vouch for that code. When people decide to trust your signature, it means that they trust you to sign code only after you have verified that it is trustworthy.

When you abuse that trust and someone finds malware with your signature on it, you can expect them never to trust your signature again. If your certificate has special trust status with any operating system or distribution platform, you can expect that trust to be revoked once it comes to the maintainers' attention that you signed malware with it.

So if you decide to sign third-party device drivers, you should do so only when you feel you can wholeheartedly take responsibility for whatever that driver does. If you cannot do that with a clear conscience, it's better to leave it unsigned and let the user decide whether they trust it. Or, better, find a more trustworthy alternative.
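As a practical aside, if you do decide you can vouch for a third-party driver, at least record exactly what you are putting your name on first: the file's hash and whatever signature it already carries. Here is a minimal sketch of that check, assuming a Windows machine with `signtool` on the PATH; the driver path is a placeholder.

```python
import hashlib
import subprocess
from pathlib import Path

DRIVER = Path(r"C:\drivers\vendor-driver.sys")   # placeholder path

# Record the exact bytes you would be vouching for.
digest = hashlib.sha256(DRIVER.read_bytes()).hexdigest()
print(f"{DRIVER.name}: SHA-256 {digest}")

# Show any existing Authenticode signature and whether it chains to a trusted root.
# /pa selects the default Authenticode verification policy; /v prints the signer chain.
check = subprocess.run(["signtool", "verify", "/pa", "/v", str(DRIVER)],
                       capture_output=True, text=True)
print(check.stdout)
if check.returncode != 0:
    print("No valid signature -- if you sign this, you are the only party vouching for it.")
```

If the vendor's existing signature still verifies, you at least know the file has not been altered since the vendor signed it, before you add your own signature on top.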

Philipp
  • 2
    In summary: "never being trusted again" vs "time and effort required to track down 3rd parties". – Pharap Apr 05 '16 at 00:42
  • 1
    @Pharap : ... vs. time and effort and expense of simply creating another signature? – TOOGAM Apr 05 '16 at 02:17
  • 3
    @TOOGAM: ...and building up customer trust again from scratch, with a black mark against you already. – Wildcard Apr 05 '16 at 02:34
  • @Wildcard: Actually, I didn't mean to make a new sig for the core products that must be trusted. I meant using another sig for less-authenticated products. Then that sig could be revoked relatively painlessly, if the need arose. I generally take a sig as a statement that code has not been tampered with since an organization published it, but not necessarily that the organization personally debugged thoroughly enough to verify the purpose and functionality of every single byte. I do suspect this may not align with a consensus (of sig's implications) commonly held by some circles. – TOOGAM Apr 05 '16 at 06:25
  • 6
    @TOOGAM Remember that it is not the trust in your certificate which got compromised, but the trust in the QA processes of your organization. Just replacing the certificate won't do, as people will now distrust your organization. You will not convince people you won't do this ever again simply by changing your certificate. – Philipp Apr 05 '16 at 07:46
  • 1
    @Philipp: That's helpful thanks. None of this suggests any legal ramifications or precedents, just potential damage to reputation. – Grhm Apr 05 '16 at 11:12