
I was wondering if there was a calculator or formula I could use to get a rough estimate of the time it takes to crack hashes on a given GPU. I am trying to assess how much performance I would lose/gain across different build cases. Specifically, 3-4 RTX 2080s vs. 3-4 RTX 2060s.

I've found a few threads and websites about it, but they seem more concerned with the algorithm and password length/complexity than with GPU performance (clock speed). Or they say to just run it in hashcat and find out, but obviously that's impractical if I'm trying to decide which build to go with for business justification purposes.

I'll be using Hashcat and really don't care about any variables other than the clock speed, so ideally we could make password length, complexity, space, hash type, attack type, etc. constants just so I can have a speed differential to compare GPU models/amounts.

Ryan
  • This depends highly on the algorithm. You can always benchmark your system as well –  Jun 17 '19 at 16:31
  • Yes, so for simplicity's sake let's say it's always MD5 – Ryan Jun 17 '19 at 19:14
  • Then it still depends on your environment. How busy is your cracking machine? Is it properly cooled? There are so many variables it's impossible to give a number like "These many cores at this speed will give you X hashes per second" - just like it's not possible to say that having a certain graphics card will guarantee you a certain FPS for a specific game. –  Jun 17 '19 at 19:19
  • Hmm. I was afraid of that. What do you recommend would be a better metric to justify one GPU over another to nontechnical personnel? – Ryan Jun 17 '19 at 19:36

1 Answer


As was mentioned, getting an exact number of hashes per second is impossible, but published benchmarks should give you an idea of scaling, provided cooling and the other variables are equal or adequate.

I would avoid quoting precise numbers. Keep in mind that while comparing an NVIDIA RTX 2060 and 2080 should give you a decent idea, comparisons between NVIDIA and AMD, RTX vs. Titan, or even across GPU generations can be wildly off if you use a different hash algorithm than the benchmark did.

https://tutorials.technology/blog/08-Hashcat-GPU-benchmarking-table-Nvidia-and-amd.html
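If all you need for the business case is a relative speed differential under fixed assumptions (MD5, brute force, fixed password space), the arithmetic is simple: worst-case crack time is keyspace divided by aggregate hash rate. A minimal sketch, where the per-card MD5 rates are hypothetical placeholders you would replace with real `hashcat -b` results:

```python
# Rough worst-case time-to-crack estimate from benchmark hash rates.
# The per-card MD5 rates below are HYPOTHETICAL placeholders;
# substitute real numbers from running `hashcat -b` on each card.

def crack_time_seconds(keyspace: int, hashes_per_second: float) -> float:
    """Worst-case seconds to exhaust a keyspace at a given hash rate."""
    return keyspace / hashes_per_second

# Held-constant scenario: 8-character lowercase password, MD5, brute force.
keyspace = 26 ** 8

# Hypothetical single-card MD5 rates in hashes/second (replace with benchmarks).
rtx_2060_rate = 20e9
rtx_2080_rate = 37e9

for cards in (3, 4):
    t_2060 = crack_time_seconds(keyspace, cards * rtx_2060_rate)
    t_2080 = crack_time_seconds(keyspace, cards * rtx_2080_rate)
    print(f"{cards}x 2060: {t_2060:8.1f}s | "
          f"{cards}x 2080: {t_2080:8.1f}s | "
          f"2080 speedup: {t_2060 / t_2080:.2f}x")
```

Note that because both crack times scale linearly with keyspace, the speedup ratio between builds is just the ratio of their aggregate hash rates, so the fixed password assumptions only affect the absolute times, not the comparison.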

Peter Harmann