You would need to find (or hire) a service that simulates many users hitting your website concurrently. There's no general answer without knowing how much data each user pulls, and that's very specific to your application: how much static content is served? Video? Graphics? Dynamically generated data?
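As a very rough illustration of what "simulating concurrent users" means, here's a minimal Python sketch that spins up a pool of threads, each repeatedly fetching a page and timing the responses. The URL and the user/request counts are hypothetical placeholders; real load-testing tools and services do this far more thoroughly (ramp-up profiles, realistic click paths, geographically distributed clients).

```python
# Minimal sketch of concurrent "users" hammering one page.
# The URL below is a hypothetical placeholder for your application.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://your-app.example.com/"   # hypothetical endpoint
CONCURRENT_USERS = 50
REQUESTS_PER_USER = 20

def simulate_user(user_id):
    """Fetch the page repeatedly and record how long each request takes."""
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.monotonic()
        with urllib.request.urlopen(URL) as resp:
            resp.read()                 # pull the full response body
        timings.append(time.monotonic() - start)
    return timings

with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    results = list(pool.map(simulate_user, range(CONCURRENT_USERS)))

all_timings = [t for user in results for t in user]
print(f"requests: {len(all_timings)}, "
      f"avg: {sum(all_timings)/len(all_timings):.3f}s, "
      f"max: {max(all_timings):.3f}s")
```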
Then you'd need to figure out how much data fits through your Internet connection, and whether any server-side work will max out your processor or memory. Are you using a database? What about your disk storage system? Have you normalized the database?
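The only way to answer those questions is to watch the server while it's under load. As one possible approach, here's a small sketch that samples CPU, memory, and disk I/O during a test run, assuming the third-party psutil package is installed (pip install psutil); the duration and sampling interval are arbitrary choices for illustration.

```python
# Rough sketch of watching server resources while a load test runs.
# Assumes the third-party psutil package is available.
import time
import psutil

def watch_resources(duration_seconds=60, interval=5):
    """Print CPU, memory, and cumulative disk I/O every few seconds."""
    start_io = psutil.disk_io_counters()
    end_time = time.monotonic() + duration_seconds
    while time.monotonic() < end_time:
        cpu = psutil.cpu_percent(interval=interval)   # averaged over the interval
        mem = psutil.virtual_memory().percent
        io = psutil.disk_io_counters()
        print(f"cpu: {cpu:5.1f}%  mem: {mem:5.1f}%  "
              f"disk read: {io.read_bytes - start_io.read_bytes} B  "
              f"disk write: {io.write_bytes - start_io.write_bytes} B")

watch_resources()
```

Run something like this on the server while the simulated users are hitting it, and whichever number saturates first tells you where your real bottleneck is.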
Most decent hardware today with a decent Internet connection will handle most server workloads; few startups hit Google, YouTube, or Facebook traffic overnight, so I'd focus more on designing the application and server to scale up as needed. You can get some idea of the hardware used to run Server Fault through the Stack Exchange blog, and they're not using particularly heavy-duty hardware to serve the sites. But again, it's highly dependent on the application design and the data being pushed to users.
In the end, the only "easy" way to tell is to test it. There is no magic formula or program that can work it out for you locally.