6

Simple question: both MD5 and SHA1 are susceptible to collision attacks. It's reasonable to assume that SHA256, the next algorithm we're switching to, has a similar weakness, just one that remains hidden because of the effort required to find such a collision.

The thing is, why don't we use multiple algorithms to verify file integrity? That is, calculate checksums for the same file with several different algorithms and only accept the file if all of them match. Finding a collision for MD5 is now doable on a smartphone, and finding one for SHA1 has been proven feasible with the SHAttered attack. However, if you had to find a collision for both MD5 AND SHA1 at the same time, wouldn't that increase the time needed as well?
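
Something like this minimal Python sketch is what I have in mind; the function name, parameter names and the expected digest values are placeholders, not an existing tool:

    import hashlib

    def verify_file(path, expected_md5, expected_sha1, chunk_size=65536):
        """Accept the file only if BOTH digests match the published values."""
        md5 = hashlib.md5()
        sha1 = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                # Feed the same bytes to every algorithm in a single pass.
                md5.update(chunk)
                sha1.update(chunk)
        return (md5.hexdigest() == expected_md5.lower()
                and sha1.hexdigest() == expected_sha1.lower())

An attacker would then need a single malicious file whose MD5 digest and SHA1 digest both match the published values, not just one or the other.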


Clarification: While this particular suggestion may actually be in use in places, what I'm talking about is: why isn't this technique commonly proposed as an alternative to upgrading to SHA256?

Nzall
  • 7,313
  • 6
  • 29
  • 45
  • Do you have proof that it is not being used anywhere in the world? – Purefan Feb 24 '17 at 15:37
  • You can do it, anyone can do it. You can even implement an MD5SHA1 sum that concatenates both MD5 and SHA1 digests. – ThoriumBR Feb 24 '17 at 15:38
  • 1
    @Purefan Sorry, I should have made myself clearer; I've clarified what I meant. I wasn't talking about "why aren't we using it in the wild?" but about "why isn't this suggested as an alternative to a stronger hashing function?" – Nzall Feb 24 '17 at 15:41
  • 1
    The edit makes http://security.stackexchange.com/questions/83881/is-using-the-concatenation-of-multiple-hash-algorithms-more-secure relevant too – Matthew Feb 24 '17 at 15:41
  • And http://security.stackexchange.com/q/33531/16960 – Xiong Chiamiov Feb 24 '17 at 16:56
  • This is already used. For example, Debian provides multiple checksums for file integrity, see http://ftp.nl.debian.org/debian/dists/jessie/main/installer-amd64/current/images/ – Jesse K Feb 24 '17 at 18:15
  • @JesseKeilson Work which is undone by serving them over plain HTTP, so those could be checksums of hacked binaries coming from an imposter server. – TessellatingHeckler Feb 24 '17 at 23:32
  • @TessellatingHeckler Other mirrors and sites host over https - this was merely the first one I happened upon. – Jesse K Feb 27 '17 at 18:28
  • One could then use the weak algorithms to build a "mask" before trying to look for collisions on stronger algorithms, so weak algorithms may decrease the stronger ones' efficiency. – Xenos Apr 07 '17 at 15:08

2 Answers

3

Using multiple hash functions is simply the same as defining a new hash function, only much slower than a single well-designed secure hash and with little justification for the design.

So while it might indeed be harder to find a collision for an ad-hoc H(x) = MD5(x) || SHA1(x) than for either component alone, it is far less efficient and likely less secure than purpose-built functions like SHA2-512 or SHA3-512. Even SHA2-256 has shown no weakness after a decade of cryptanalysis.
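
As a rough illustration (assuming Python's hashlib; the function names are just placeholders, not a standardized construction), the ad-hoc concatenation looks like this next to a purpose-built function:

    import hashlib

    def md5_sha1(data: bytes) -> str:
        # Ad-hoc H(x) = MD5(x) || SHA1(x): 288 bits of output, but it costs
        # two hash computations over every message.
        return hashlib.md5(data).hexdigest() + hashlib.sha1(data).hexdigest()

    def sha2_256(data: bytes) -> str:
        # A single purpose-built, well-studied function: one pass, 256 bits.
        return hashlib.sha256(data).hexdigest()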

rmalayter
  • 211
  • 1
  • 4
-1

Applying chained hash functions to a plaintext does not necessarily make something 'more secure'.

If we could answer whether applying MD5 to a SHA1 digest (or the other way around) weakens the resulting hash (by reducing entropy, for example), then the recommendation would make sense.
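
For clarity, a minimal sketch (in Python, with a placeholder function name) of what a "chained" construction means here, as opposed to verifying two independent digests of the same file:

    import hashlib

    def chained_md5_of_sha1(data: bytes) -> str:
        # Feed one digest into the next hash: MD5(SHA1(x)). Any SHA1 collision
        # automatically collides here too, and the output is only 128 bits.
        inner = hashlib.sha1(data).digest()
        return hashlib.md5(inner).hexdigest()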

Filipe Rodrigues
  • 398
  • 3
  • 13
  • I'm not talking about "chained hashes", I'm talking about "hash the original file with SHA1, hash the original file with MD5, and only accept the file if both hashes match what's being passed." – Nzall Mar 31 '17 at 07:19