(ignoring the productivity side of things, b/c that is developer specific).
First off, you shouldn't do that. Experience shows that having successfully built a similar product on the platform before is the number one predictor of success. Pick your team first, on their merits. Unless your team is suggesting a crazy, outlandish platform, go with whatever platform they're familiar with and are recommending.
Now for argument's sake let's say a single server can handle 1K users.
I think that's your central mistake. How many users a server can handle will, of course, depend entirely on the application. But for most webapps >98% of the users are not logged in at any given time. A basic web application server can often handle 200-800 requests per second. Do the math, and you'll see that many applications can serve several tens of thousands of users per server.
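To make that arithmetic concrete, here is a rough back-of-envelope sketch. The request rate, active-user fraction, and per-user click rate below are illustrative assumptions, not measurements; plug in numbers from your own application.

```python
# Back-of-envelope estimate of users per server.
# All figures below are illustrative assumptions -- measure your own app.

requests_per_second = 400       # mid-range of the 200-800 req/s figure above
active_fraction = 0.02          # assume <2% of users are active at any moment
requests_per_active_user = 0.5  # assume an active user clicks about once every 2 s

# Each active user generates requests_per_active_user req/s, so:
concurrent_users = requests_per_second / requests_per_active_user   # 800
total_users = concurrent_users / active_fraction                    # 40,000

print(f"~{concurrent_users:.0f} concurrent users, ~{total_users:,.0f} total users per server")
```

With those assumed numbers a single server supports on the order of 40,000 users, which is where the "several tens of thousands" figure comes from.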
To give you an example you're familiar with, the Stack Overflow family of sites (of which this site is one) handles more than 6 million unique visitors each month, and serves far more pageviews than that, on a surprisingly small server farm.
Commercial open-source costs money too. Take a proper look at the pricing and support models for MySQL, Ubuntu, Red Hat Linux, JBoss and the rest -- you will typically want either an expert on staff at all times or a support subscription from the backing company.
Last but not least, people costs tend to completely dominate for installations of fewer than roughly 200 servers. If you build out your own infrastructure -- as opposed to using cloud computing or a full-service provider -- the salaries and hours spent on sysadmins will generally exceed your software costs. For Google, with their 250,000+ servers, it's a completely different matter, but for smaller installations this generally holds true.
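As a rough illustration of why people costs dominate at that scale, here is a hypothetical comparison. The salary, headcount, server count, and per-server support figures are assumptions chosen only to show the shape of the calculation, not real quotes.

```python
# Hypothetical cost comparison for a small self-managed installation.
# Every figure here is an assumption for illustration only.

sysadmin_salary = 90_000              # assumed fully loaded annual cost of one sysadmin
sysadmins = 2                         # even a small farm needs on-call/vacation coverage
servers = 20
support_per_server = 1_500            # assumed annual OS/database support subscription

people_cost = sysadmin_salary * sysadmins        # 180,000 per year
software_cost = servers * support_per_server     # 30,000 per year

print(f"people: ${people_cost:,}/year vs software: ${software_cost:,}/year")
```

Under those assumptions the people side outweighs the software side by a factor of six, and the gap only widens as you add tools and licenses that also need someone to run them.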
One major caveat: Some types of sites, such as social networks and video transcoding sites, have very different hardware needs than what I describe above. Do a proper analysis of your specific application.