
Just wanted to know if this was an accurate way of determining how long it would take someone with an 8GB GPU to derive 1 million keys using some assumed parameters below:

Time per derivation...........: 3.5 seconds
Memory required per derivation: 128MB
GPU Max Memory................: 8GB

So, 128MB = 0.125GB. Max derivations that could be happening at any given time = 8/0.125 = 64.

Each derivation takes 3.5 seconds, so derivations per second = 64/3.5 = ~18.3

So, 1 million key derivations would take (1000000/18.3)/3600 = ~15 hours.
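As a sanity check, here is that same arithmetic as a short Python sketch. The parameter values are the assumed ones above, and it carries over the (optimistic) assumption that every memory-limited slot runs fully in parallel:

    # Back-of-envelope estimate under the stated assumptions:
    # 3.5 s per derivation, 128 MB per derivation, 8 GB of GPU memory,
    # and all memory-limited slots running concurrently.

    time_per_derivation_s = 3.5        # seconds per key derivation
    memory_per_derivation_gb = 0.125   # 128 MB expressed in GB
    gpu_memory_gb = 8.0
    total_keys = 1_000_000

    # Maximum derivations resident in GPU memory at once
    parallel_slots = int(gpu_memory_gb // memory_per_derivation_gb)   # 64

    # Throughput if all slots genuinely run concurrently
    derivations_per_second = parallel_slots / time_per_derivation_s   # ~18.3

    total_seconds = total_keys / derivations_per_second
    print(f"parallel slots: {parallel_slots}")
    print(f"throughput: {derivations_per_second:.1f} derivations/s")
    print(f"estimated time: {total_seconds / 3600:.1f} hours")        # ~15.2 hours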

Does this seem accurate, or am I unknowingly making too many assumptions? Thanks.

1 Answer


You've assumed that the derivations can happen fully in parallel, which might not be possible, but your calculation will give you an answer of the correct order of magnitude.
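For illustration, here is a hypothetical sketch of how the estimate shifts if only a fraction of those 64 memory-limited slots actually execute concurrently. The `utilisation` parameter is an assumption introduced for this example, not something from the question:

    # Sensitivity check: the estimate scales linearly with how many of the
    # memory-limited slots truly run in parallel.

    def estimated_hours(total_keys: int,
                        time_per_derivation_s: float,
                        parallel_slots: int,
                        utilisation: float) -> float:
        """Rough runtime estimate; 'utilisation' (0..1) is the assumed
        fraction of slots that genuinely run concurrently."""
        effective_slots = max(1, int(parallel_slots * utilisation))
        throughput = effective_slots / time_per_derivation_s
        return total_keys / throughput / 3600

    for utilisation in (1.0, 0.5, 0.25):
        print(f"utilisation {utilisation:.0%}: "
              f"{estimated_hours(1_000_000, 3.5, 64, utilisation):.0f} hours")
    # 100% -> ~15 h, 50% -> ~30 h, 25% -> ~61 h: same order of magnitude.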

LTPCGO