I have a security protocol whose implementation will be done by many third-party developers (let's call them 'manufacturers'), and which in turn will be programmed into embedded hardware designed by them.
There are some mandatory guidelines (e.g. a particular cryptographic operation must be carried out using a cryptoprocessor, not a software library) which they must follow.

Now, is there any way to check whether they have followed all the guidelines of the protocol, and to certify their implementation, if you have access to the source and binary files of the implementation?

How can I ensure that the manufacturer has programmed only the certified implementation into all the hardware samples?

Edit 1:
As far as I know, one solution for the certification is this: the source code can be manually inspected to cross-check compliance with each recommendation. Then the source code, as well as the binary built from it, can be certified by having the certifying authority sign the checksum of both files.
Correct me if there is any loophole in this certification process that a manufacturer could exploit.
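The signing step above can be sketched as follows. This is a minimal illustration, not a real certification scheme: `ca_sign` uses an HMAC as a stand-in for the authority's actual asymmetric signature (a real CA would use e.g. ECDSA or Ed25519 over the same digests), and the key and file names are hypothetical.

```python
import hashlib
import hmac

def file_checksum(path):
    """SHA-256 checksum of a file, read in chunks so large binaries are fine."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def ca_sign(ca_key, source_path, binary_path):
    """Certify a (source, binary) pair by 'signing' both checksums together.

    HMAC is only a placeholder here: it demonstrates binding a secret key to
    the two digests, whereas a real authority would produce a publicly
    verifiable asymmetric signature.
    """
    msg = (file_checksum(source_path) + ":" + file_checksum(binary_path)).encode()
    return hmac.new(ca_key, msg, hashlib.sha256).hexdigest()
```

Anyone holding the certified checksums can later recompute them from the files they received and check them against the authority's signature.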

Now, if the binary of the implemented protocol is programmed separately, we can check the file system, locate this binary, calculate its checksum, and verify it against the certified binary. But if the protocol is implemented inside another application, then we can't use this certification; instead, the final application, which includes the protocol implementation, has to be certified. This may complicate the certification process.
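The device-side check described above (locate the binary, hash it, compare against the certified value) is straightforward; a minimal sketch, assuming SHA-256 checksums and a hypothetical path to the deployed binary:

```python
import hashlib
import hmac

def verify_binary(path, certified_sha256):
    """Return True if the file at `path` matches the certified SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    # Constant-time comparison, out of habit; timing is not critical here.
    return hmac.compare_digest(h.hexdigest(), certified_sha256)
```

Note this only shows the file matches the certified image; it says nothing about what else on the device might patch or bypass it at runtime.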

mk09
  • Walk through all the code and ask them to burn it to the hardware in your presence. –  May 22 '17 at 06:58
  • But when the hardware samples are already programmed and given to customers, and I also have one of these samples, is there any way then? – mk09 May 22 '17 at 07:06
  • Maybe you could take a look at [formal verification](https://en.wikipedia.org/wiki/Formal_verification), but it's generally something quite hard to pull off. – Lery May 22 '17 at 10:58
  • No, then it may not be possible to verify. Even if you test it against some input/output test vectors, a third party with malicious intent may hard-code those test vectors in hardware. –  May 22 '17 at 11:01
  • If you can test your protocol against some input/output test vectors, and you can generate these test vectors using your own software implementation, then you may randomly generate the test vectors and test the embedded devices against them. –  May 22 '17 at 11:03
  • Generic input/output testing is not very useful in cryptographic protocols. E.g. a statistical test can never prove that a PRNG is actually a cryptographically-secure PRNG. Statistics are good at catching unintended mistakes - and they are completely useless at identifying malicious intent. – tylo May 22 '17 at 13:56
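The random test-vector idea from the comments can be sketched as below. This is a toy illustration only: `reference_encrypt` is a hypothetical reference implementation (a XOR "cipher" stands in for the real primitive), and `device_encrypt` represents querying the device under test.

```python
import os

def reference_encrypt(key, block):
    # Hypothetical reference implementation; a toy XOR stands in for the
    # real cryptographic operation the guideline mandates.
    return bytes(k ^ b for k, b in zip(key, block))

def check_device(device_encrypt, trials=100):
    """Compare device output to the reference on freshly drawn random vectors.

    Because the vectors are generated at test time, a device that merely
    hard-coded a fixed set of published test vectors will fail with
    overwhelming probability.
    """
    for _ in range(trials):
        key, block = os.urandom(16), os.urandom(16)
        if device_encrypt(key, block) != reference_encrypt(key, block):
            return False
    return True
```

As tylo's comment notes, passing such a test only shows input/output agreement; it cannot show the operation was performed in the cryptoprocessor rather than in software, nor rule out deliberate backdoors.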

2 Answers


There is no cryptographic answer to your question, because by Rice's theorem we know that any non-trivial semantic property of a program is undecidable. You can't write a program that decides whether two programs are functionally equivalent, or whether an implementation follows any of your guidelines.

What you actually need is some kind of certification process, e.g. Common Criteria. In CC there are different Evaluation Assurance Levels, and you have to find the one that fits your kind of software best. The entire certification process is a huge tradeoff between assurance, design effort, and testing/verification effort.

tylo

In your case, you could covertly obtain a random selection of the finished devices, then remove and replace the secure cryptoprocessor with some clever signal generator that outputs a known pattern instead of a random one (for example, instead of generating a random number it could output all 1s, or invert instead of encrypting whatever it was meant to encrypt).

If your concern is that they are being lazy and not using the cryptoprocessor, then it may be enough to disable the chip in one device and see it still working, which would show you that they took a shortcut. If you are worried about some spy-world backdoor business and you don't trust this third party, then it might be better to do the manufacturing in house.

daniel
  • Since the application software for our hardware needs an OS, it will be programmed into a flash memory located outside the SoC. The SoC contains all the cryptographic hardware and the OTP ROM. So if we replace the SoC in a randomly selected sample from a manufacturer, we can test conditions such as whether the manufacturer is hard-coding values that are meant to be read from the OTP ROM. But I'm not sure whether we can disable the cryptographic hardware somehow. – mk09 May 23 '17 at 11:35