I have a custom server application that runs on Windows Server 2008 R2. It is a home-grown Windows Service written in .NET that supports a number of custom terminals. I have a test machine with a similar specification to the live server, and a set of client simulators that produce a load that is a reasonable approximation of the real system. I need to be able to support 12,000 of these terminals, and at present the server is running out of memory (paging is going through the roof).
My plan was to start 100 of the simulators, measure memory usage, then start 100 more, measure again, and repeat until paging starts to climb (in reality I will be taking far more than three data points). This should give me a figure for the extra memory required per 100 simulators and let me project how much memory the full load needs. I only need a rough idea, within about 30 GB either way, so that I can avoid buying the full 2 TB ($150,000 worth) that the server will take. My question is whether this is a reasonable method to use and, if so, which Performance Counters you would monitor to give the amount of memory actually being used.
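To make the projection step concrete, this is roughly the calculation I have in mind; the baseline and per-batch figures below are invented placeholders, and the real values would come out of the test runs:

```csharp
using System;

class MemoryProjection
{
    static void Main()
    {
        // Hypothetical figures for illustration only; the real values
        // will come from the measured runs at each load step.
        double baselineGb = 6.0;    // memory in use before any simulators start
        double gbPer100Sims = 1.4;  // average extra memory per batch of 100 simulators

        int targetSimulators = 12000;
        double projectedGb = baselineGb + (targetSimulators / 100.0) * gbPer100Sims;

        // Leave some headroom so the OS and file cache are not squeezed out.
        double withHeadroomGb = projectedGb * 1.2;

        Console.WriteLine("Projected: {0:F0} GB, with 20% headroom: {1:F0} GB",
            projectedGb, withHeadroomGb);
    }
}
```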
I am specifically talking about memory here because the differences between Working Set, Private Bytes, Committed, Shared, Virtual and all the other memory terms confuse me. I think I can manage to monitor CPU, I/O and networking by myself. The other thing I have noticed is that the .NET cache adjusts its memory usage depending on what is available, which makes a trend hard to spot.
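For what it is worth, this is the sort of sampling loop I was planning to run alongside the ramp-up. The process name is a placeholder for my actual service, and the particular counters are only the ones I am considering, which is exactly the part I am unsure about:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class CounterSampler
{
    static void Main()
    {
        // Placeholder: the process name of my service, without the .exe extension.
        string process = "MyTerminalService";

        var privateBytes = new PerformanceCounter("Process", "Private Bytes", process);
        var workingSet   = new PerformanceCounter("Process", "Working Set", process);
        var gcHeap       = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", process);
        var pagesPerSec  = new PerformanceCounter("Memory", "Pages/sec");
        var availableMb  = new PerformanceCounter("Memory", "Available MBytes");

        while (true)
        {
            // Log the candidate counters side by side so I can compare their trends
            // as each batch of 100 simulators is started.
            Console.WriteLine("{0:u}\t{1}\t{2}\t{3}\t{4}\t{5}",
                DateTime.UtcNow,
                (long)privateBytes.NextValue(),
                (long)workingSet.NextValue(),
                (long)gcHeap.NextValue(),
                pagesPerSec.NextValue(),
                availableMb.NextValue());

            Thread.Sleep(15000); // sample every 15 seconds while the load ramps up
        }
    }
}
```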