I was reading an answer by Terry Chia to another question about the requirements for a salt, in which he/she specified (among other requirements) that salts need to be globally unique, but didn't go into any detail about why. This was defined as "not only unique in the context of your database, but unique in the context of every single database out there". For example, I have been using each user's email address as the salt for my experiment-y sandbox website/database. Those same email addresses could certainly be in use as salts on other websites.
It doesn't make sense to me because, as I understand it, the purpose of a salt is to ensure that "cracking multiple compromised password hashes together should be no easier than cracking each one of them separately" (Ilmari Karonen, further down in the same post). Unless I'm mistaken, that constraint is satisfied as long as my salts are unique within my own database.
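To make my understanding concrete, here's a quick sketch of what I believe salting buys you (using Python's `hashlib`; the 16-byte salt length and iteration count are just placeholder values I picked, not a recommendation). Both salts exist only in my one database, and the property still seems to hold:

```python
import hashlib
import os

password = b"correct horse battery staple"

# Two users in the SAME database happen to share a password,
# but each gets a different random salt:
salt_alice = os.urandom(16)
salt_bob = os.urandom(16)

hash_alice = hashlib.pbkdf2_hmac("sha256", password, salt_alice, 100_000)
hash_bob = hashlib.pbkdf2_hmac("sha256", password, salt_bob, 100_000)

# The stored hashes differ, so an attacker who dumps my table
# has to attack each row separately -- no precomputed table,
# no "crack once, match many" shortcut.
assert hash_alice != hash_bob
```

As far as I can tell, nothing in that property depends on my salts being absent from anyone else's database.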
I am further confused because generating globally unique salts sounds like something I can't possibly control. How am I supposed to guarantee that none of my salts has ever been used on someone else's site, given the millions of sites out there? To be fair, I haven't looked into how random salts are generated yet; maybe the idea is that they are drawn from a large enough pool that, even considering the number of sites out there, duplicate salts across separate databases are very unlikely?
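Trying to work out that "large enough pool" intuition for myself (assuming 16-byte salts from a CSPRNG, which I gather is a common choice, via Python's `secrets` module):

```python
import secrets

# A random 16-byte salt has 2**128 possible values.
salt = secrets.token_bytes(16)
assert len(salt) == 16

# Rough birthday-bound estimate: among n randomly generated salts,
# the probability of ANY two colliding is approximately n**2 / 2**129.
n = 10**12  # say, a trillion salts generated across every site in the world
collision_prob = n * n / 2**129

# Even at that scale, the chance of a single duplicate anywhere
# is astronomically small.
assert collision_prob < 1e-12
```

If that back-of-the-envelope math is right, "globally unique" would be a statistical consequence of random generation rather than something anyone actually coordinates.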
I am definitely biased towards using an email address as my salt, since it's one less small step of work, but given the number of people who seem to disagree with this approach, I recognize that I am probably wrong. And besides, I want to understand why I'm wrong, because it doesn't make sense to me yet.