QUICK SUMMARY:
It seems like modern websites, for security's sake, should all have the client hash its own password before sending it to the server, which then rehashes it. This would avoid leaking the original password (likely reused across different sites), and it would also force an attacker to crack each captured hash individually, rather than recovering every password at once by simply decrypting a single SSL key.
ORIGINAL QUESTION:
I just did a quick check to make sure, and I was amazed to see that major websites still appear to send login passwords in their original form, albeit over SSL/TLS. I understand that they hash and salt/pepper them before putting them into the database. However, I still see an issue with that.
It seems as if hashing every password with a site-unique salt on the client side before sending it, and then hashing that hash on the server, would be substantially more secure for clients, especially in light of recent news:
The NSA and its allies routinely intercept such connections -- by the millions. According to an NSA document, the agency intended to crack 10 million intercepted https connections a day by late 2012. The intelligence services are particularly interested in the moment when a user types his or her password. By the end of 2012, the system was supposed to be able to "detect the presence of at least 100 password based encryption applications" in each instance some 20,000 times a month. (Emphasis added)
While I understand that, from an efficiency perspective, having the client hash the password is unnecessary, I'm more interested in the fact that hashing the original password would mean that even if the data were intercepted (either on the server before it is hashed, or in transit after the SSL has been decrypted), it would only be useful for logging into the website it was intercepted from.
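To make that concrete, here is a minimal sketch of the client-side step I have in mind, using only the Python standard library. The salt value and iteration count are illustrative assumptions on my part, not a recommendation:

```python
import hashlib

# Assumed: the site publishes a fixed, public, site-unique salt so that the
# same password produces different hashes on different sites.
SITE_SALT = b"example.com|login|v1"

def client_hash(password: str) -> str:
    """Derive the value the client would send instead of the raw password."""
    derived = hashlib.pbkdf2_hmac(
        "sha256",                  # underlying hash
        password.encode("utf-8"),  # the real password never leaves the client
        SITE_SALT,                 # site-unique, so the output is site-specific
        200_000,                   # iteration count (illustrative)
    )
    return derived.hex()

# Over TLS the client submits client_hash(password); anyone who intercepts it
# learns a value that only works for this one site.
```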
I'd assume that should be a major concern considering lots of people reuse passwords, yet these massive sites are still sending the original password and hashing it on their end. Is there a technical reason behind this, or is it just old practice that should ideally be changed?
EDIT FOR CLARIFICATION: I am not suggesting that these sites begin doing all the hashing on the client side and just throw the result into their DB as-is; they should definitely hash/salt the client's hash. My suggestion is that the client hashes the password so that the server never sees the original value, which means the same password could be reused elsewhere, and a compromise of one website would not mean a compromise of your password across other sites. As a nice bonus, it would also limit what a malicious proxy gains to access to the sites actually logged into, rather than handing it your password.
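For completeness, a rough sketch of the server side under that scheme (again standard-library Python with illustrative parameters): the server treats the client-supplied hash exactly as it would treat a plaintext password today, adding its own random per-user salt before storing it.

```python
import hashlib
import hmac
import os

def server_store(client_hash_hex: str) -> tuple[bytes, bytes]:
    """Hash the client-supplied hash with a fresh per-user salt for storage."""
    user_salt = os.urandom(16)  # random, per-user, kept alongside the record
    stored = hashlib.pbkdf2_hmac(
        "sha256", client_hash_hex.encode("utf-8"), user_salt, 200_000
    )
    return user_salt, stored

def server_verify(client_hash_hex: str, user_salt: bytes, stored: bytes) -> bool:
    """Recompute the stored value and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac(
        "sha256", client_hash_hex.encode("utf-8"), user_salt, 200_000
    )
    return hmac.compare_digest(candidate, stored)
```

The second round matters because the client-side hash effectively becomes the password for this site, so the server must still never store it directly.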