We're going to implement what I consider to be a fairly large, processor-intensive application in a few months, and our Windows team is recommending that we deploy it on VMware. The application will have 5,000 users, most of whom will be running it all day as the primary application they need to do their jobs. They will be scanning images, which will be sent up to this application for processing.
The application has four parts: a web server, an extract server, an application server, and a database server. The application server will bear the burden of performing optical character recognition (OCR) on 2 to 3 million images per day.
The application software vendor has not certified the product on VMware and has no experience with it, although they seem open to supporting us on it.
I am not too concerned about the web servers or extract servers. But the vendor's requirements for the application server tier call for three Windows servers with 20 CPU cores and 32 GB of RAM each. The SQL Server database server also requires 20 CPU cores and 32 GB of RAM.
I am not concerned about the network. We have a fast network, and these images are not going to be very large. I am mostly concerned with CPU (for the OCR) and I/O, since all of these images need to be written to the database.
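To put rough numbers on that concern, here is a back-of-envelope calculation. The image count and core counts come from the vendor figures above; the 200 KB average image size is purely my assumption for illustration, not a vendor number:

```python
# Back-of-envelope sizing for the OCR tier and database write load.
# Assumptions: 2.5M images/day (midpoint of 2-3 million), 60 OCR cores
# (3 app servers x 20 cores), and a HYPOTHETICAL 200 KB average image.

IMAGES_PER_DAY = 2_500_000
OCR_CORES = 3 * 20
AVG_IMAGE_KB = 200          # assumed, not from the vendor

SECONDS_PER_DAY = 24 * 60 * 60

images_per_second = IMAGES_PER_DAY / SECONDS_PER_DAY
seconds_per_image_per_core = OCR_CORES / images_per_second
write_mb_per_second = IMAGES_PER_DAY * AVG_IMAGE_KB / 1024 / SECONDS_PER_DAY

print(f"{images_per_second:.1f} images/s overall")
print(f"{seconds_per_image_per_core:.1f} s OCR budget per image per core")
print(f"{write_mb_per_second:.1f} MB/s average DB write load")
```

One caveat on averages: if scanning is concentrated into business hours rather than spread over 24 hours, peak load could easily be two to three times these figures.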
We do have a large VMware platform (running the latest VMware software) hosting hundreds of small applications, but our new application will be the largest by far. The largest application we currently run on VMware has only about 200 users. So we are concerned that we're going to push our VMware environment much harder than it has ever been driven before, and I want to make sure that our end users are not impacted in any way by growing pains (e.g., slowness, outages) as we learn how to support this large application on VMware.
My questions are as follows:
Would we be better off running this on dedicated physical Windows servers rather than in a shared VMware environment, at least at first?
Alternatively, should we insist on having our own dedicated VMware platform at first, so that we're not sharing with other applications during the roll-out period? Then, once we see how it performs and how many resources it really needs, we could start considering sharing.
If we do decide to run on VMware, what are the likely downsides? For example, is it likely that the end users will experience slower performance? I've read that VMware can introduce latency.
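On the latency point, my understanding is that the main VMware-specific symptom to watch for is CPU ready time: time a VM spends runnable but waiting for a physical core. vCenter reports it as a millisecond summation per sampling interval; a sketch of the usual conversion to a percentage, assuming the 20-second real-time chart interval:

```python
def cpu_ready_percent(ready_ms: float, interval_s: float = 20.0) -> float:
    """Convert a CPU-ready summation (milliseconds accumulated over the
    sampling interval) into a percentage of that interval. 20 s is the
    vCenter real-time chart interval; other chart views use longer ones."""
    return ready_ms / (interval_s * 1000.0) * 100.0

# e.g. 1,000 ms of ready time in a 20 s sample is 5% ready, which is a
# commonly cited warning threshold per vCPU.
print(cpu_ready_percent(1000))
```

Wide VMs like these 20-vCPU guests are exactly the case where ready time builds up on a busy shared cluster, so this seems worth baselining before and after go-live.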
Should we bring in a consultant from VMware to analyze our environment and make sure it's properly sized and tuned?
Any recommendations on how best to proceed or other things to consider would be greatly appreciated.