Note that the THC-Hydra statistics were made against a machine which is described as running SuSE Linux 7.2. In the history of the S.u.S.E / SuSE / SUSE / openSUSE Linux distributions, this brings us back to mid-2001, so this benchmark was probably done on a machine of that era. Technology has improved since, and computing speeds have grown.
A typical "weak remote web service" will use HTTP basic authentication, possibly within an SSL tunnel (i.e. that's HTTPS, not HTTP). Each password guess then entails the following: sending an HTTP header which contains the presumed password (say 200 bytes for the header) and then obtaining an error response (a 401 status code, with a header which will probably be less streamlined for size than what the attacker chooses; say 1 kB of header). Since HTTP supports keep-alive, the TCP connection (or SSL/TLS tunnel) can be immediately reused for another attempt; in particular, this means that there will be no noticeable overhead in the presence of SSL. 1000 attempts per second then requires a bandwidth of about 200 kB/s to the server, 1 MB/s from the server; that's about 10 Mbit/s of bandwidth, and even cheap hosting services offer that much. On local networks, 100 Mbit/s or more is the norm. No bandwidth problem here.
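To make the keep-alive point concrete, here is a minimal sketch of guessing over a single reused TCP connection. The local demo server, the `admin` user name, and the password `s3cret` are all made up for illustration; the point is that `http.client` sends every request over the same connection as long as each response body is drained.

```python
import base64
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical password held by the demo server.
SERVER_PASSWORD = "s3cret"

class AuthHandler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # HTTP/1.1 means keep-alive by default

    def do_GET(self):
        expected = "Basic " + base64.b64encode(
            b"admin:" + SERVER_PASSWORD.encode()).decode()
        if self.headers.get("Authorization") == expected:
            body = b"welcome"
            self.send_response(200)
        else:
            body = b"nope"
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="demo"')
        # Content-Length lets the client reuse the connection.
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), AuthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# One TCP connection, reused for every guess thanks to keep-alive.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
found = None
for guess in ["letmein", "password", "s3cret"]:
    token = base64.b64encode(b"admin:" + guess.encode()).decode()
    conn.request("GET", "/", headers={"Authorization": "Basic " + token})
    resp = conn.getresponse()
    resp.read()  # drain the body so the connection can be reused
    if resp.status == 200:
        found = guess
        break
conn.close()
server.shutdown()
print(found)
```

With SSL the picture is the same, except that the (relatively expensive) handshake happens once and all subsequent guesses ride on the established tunnel.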
Of course, the attacker should try to parallelize in order to avoid latency. There is no need to wait for the server's answer before sending another request. Parallelization, here, means opening a hundred connections to the target server, and running them in parallel. Operating systems, nowadays, can handle many thousands of open connections; one hundred will not starve the machine.
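The parallelization pattern can be sketched with a thread pool; the `try_password` function here is a stub that simulates one round trip of network latency (50 ms is an assumed figure), since the point is only that a hundred concurrent workers hide latency almost entirely.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Stub standing in for one authentication attempt over its own
# connection: ~50 ms of simulated round-trip latency, negligible CPU.
def try_password(guess):
    time.sleep(0.05)
    return guess == "s3cret"   # hypothetical correct password

wordlist = ["guess%d" % i for i in range(99)] + ["s3cret"]

start = time.monotonic()
with ThreadPoolExecutor(max_workers=100) as pool:
    results = list(pool.map(try_password, wordlist))
elapsed = time.monotonic() - start

# 100 guesses at 50 ms each would take 5 s serially; with 100
# parallel workers the wall-clock time is roughly one round trip.
print(any(results), round(elapsed, 2))
```

A real attacker would give each worker its own keep-alive connection, so latency only bounds the per-connection rate, not the aggregate rate.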
Server-side, since all attempts target the same user, the (hashed) password will be in RAM (it is normally stored in a file or database, but RAM is used as a cache for frequently accessed data, and during a password-cracking session that data element will be very frequently accessed). Since we are talking about a weak remote web service, the server will store passwords "as is" or hashed with a simple hash function, not something like bcrypt. Rehashing each incoming password will use negligible CPU (an anemic PC can compute millions of hashes per second, so 1,000 hashes per second is nothing). Similarly, the HTTP header parsing will not use much CPU.
So 1000 password guesses per second seems a highly feasible figure; actually, it should not be too hard to exceed it by a large margin. Note the important points in the discussion above:
- The attacker can parallelize, since he has many passwords to try.
- Authentication occurs very early in the processing of an incoming request, so normal processing costs of requests (for whatever functionality the web service provides) do not apply.
- Use of a slow hashing function (like bcrypt) can be an effective countermeasure in such a situation.
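On that last point, the effect of a deliberately slow hash can be shown by timing a single derivation. bcrypt is not in the Python standard library, so this sketch uses `hashlib.scrypt` as a stand-in, with commonly cited parameters (N=2**14, r=8, p=1; the salt is a made-up demo value).

```python
import hashlib
import time

# Time one scrypt derivation; bcrypt behaves similarly in spirit:
# each verification is deliberately expensive for the server and,
# more importantly, for the attacker.
start = time.monotonic()
hashlib.scrypt(b"s3cret", salt=b"16-byte-demosalt", n=2**14, r=8, p=1)
per_hash = time.monotonic() - start

# At a few tens of milliseconds per hash, one CPU core verifies only
# dozens of guesses per second, not the millions a fast hash allows.
print(round(per_hash, 4))
```

The server pays this cost once per legitimate login, which is unnoticeable; the attacker pays it once per guess, which collapses the feasible guessing rate.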