
I've been asked to spec the server hardware requirements for the application I'm currently developing, and I'm not confident in my ability to do this correctly. Ideally I would like a step-by-step list of how to do this myself, but having read other similar questions, it isn't that straightforward, so I will probably get the best answers here by simply stating what my requirements are.

The app is a WPF thin client which communicates over WCF with the server-side application hosted in IIS. There is currently little in the way of business logic, but this may well change in later phases of the project and may incorporate Windows Workflow Foundation (WF) for some of those requirements. I'm using NHibernate for the persistence layer and will be using AppFabric for second-level caching. Finally, the database will be SQL Server 2008 R2.
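For context, the caching side is just switched on in the NHibernate configuration, roughly along these lines (a simplified sketch, not my actual configuration - the provider type name below is a placeholder for whichever AppFabric cache provider from the NHibernate.Caches contrib packages ends up being used):

```csharp
using NHibernate.Cfg;

var cfg = new Configuration();

// Turn on the second-level and query caches.
cfg.SetProperty(NHibernate.Cfg.Environment.UseSecondLevelCache, "true");
cfg.SetProperty(NHibernate.Cfg.Environment.UseQueryCache, "true");

// Placeholder provider type - the real AppFabric provider ships separately
// in the NHibernate.Caches contrib packages and its type name may differ.
cfg.SetProperty(NHibernate.Cfg.Environment.CacheProvider,
    "MyApp.Caching.AppFabricCacheProvider, MyApp.Caching");

cfg.Configure();                     // picks up the usual hibernate.cfg.xml / app.config
var sessionFactory = cfg.BuildSessionFactory();
```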

The sites that I have been asked to spec for will have around 20 users. To keep costs down, they would like a single-box solution. With this small number of users I believe performance will be fine, but I appreciate the risks of this approach both from a security point of view and in terms of downtime etc. If I am being naive here, please let me know.

Unfortunately, at this stage I have no idea how much data will need to be stored in the database - ultimately, can I just assume that the more data I need to store, the bigger the hard drive I will need?

If I have missed any valuable information then please let me know in the comments.

s1mm0t

1 Answer


The common approach is to examine the three major potential bottlenecks - CPU power, disk I/O performance and memory - estimate the approximate need for each of them in your specific case, and then over-engineer by a margin that makes you comfortable. This is a bit unscientific, but it amounts to an educated guess that takes into account predictions about future load, usage patterns and possible resource-hungry features.
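To make that concrete, here is a back-of-the-envelope sketch of such an estimate. Every input figure in it is an assumption for illustration only, to be replaced with your own measurements or expectations; it shows the shape of the calculation, not a real specification for your 20-user site.

```csharp
using System;

class SizingEstimate
{
    static void Main()
    {
        // All of these inputs are assumptions for illustration only.
        int users = 20;                         // concurrent users (from the question)
        double requestsPerUserPerSec = 0.5;     // assumed peak request rate per user
        double cpuMsPerRequest = 100;           // assumed CPU time spent per request
        double iopsPerRequest = 4;              // assumed disk operations per request
        double mbPerUserSession = 20;           // assumed server memory per active user

        double peakRequestsPerSec = users * requestsPerUserPerSec;
        double busyCpuCores = peakRequestsPerSec * cpuMsPerRequest / 1000.0;
        double peakIops = peakRequestsPerSec * iopsPerRequest;
        double sessionMemoryMb = users * mbPerUserSession;

        Console.WriteLine("Peak requests/s     : {0}", peakRequestsPerSec);
        Console.WriteLine("CPU cores kept busy : {0:F2}", busyCpuCores);
        Console.WriteLine("Peak disk IOPS      : {0}", peakIops);
        Console.WriteLine("Session memory (MB) : {0}", sessionMemoryMb);

        // Then over-engineer: multiply each figure by 2-3x for headroom,
        // growth and the resource-hungry features you cannot predict yet.
    }
}
```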

The "amount of data" is most likely not to be a concern with today's hard drives' sizes - most likely you will be able to get an "entirely sufficient" amount of storage at a low cost.

The memory and I/O performance bottlenecks are typically interconnected, as RAM is used to cache the considerably slower hard disk I/O, and hard disks are used as swap space in low-memory conditions.

In general, due to the complexity of the algorithms in today's systems and libraries, no estimate made at the drawing board will be as good as examining a live workload and projecting from it. With today's options in hosting and virtualization, I would personally suggest simply using the "try before you buy" method and renting a Windows machine from the hosting market for this purpose.
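As one possible way of doing that examination, the sketch below samples the three areas in question on such a trial machine while a representative workload runs, using the standard Windows performance counters via System.Diagnostics. The sampling interval and duration are arbitrary choices, and in practice you may prefer to just use Performance Monitor (perfmon), but it shows which numbers to watch.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class WorkloadSampler
{
    static void Main()
    {
        // The three classic bottleneck areas: CPU, memory and disk I/O.
        var cpu  = new PerformanceCounter("Processor", "% Processor Time", "_Total");
        var mem  = new PerformanceCounter("Memory", "Available MBytes");
        var disk = new PerformanceCounter("PhysicalDisk", "Avg. Disk Queue Length", "_Total");

        cpu.NextValue(); // the first reading is always 0, so prime the counter

        for (int i = 0; i < 60; i++)   // one sample per second for a minute
        {
            Thread.Sleep(1000);
            Console.WriteLine(
                "CPU {0,3:F0}%   Free RAM {1,6:F0} MB   Disk queue {2:F2}",
                cpu.NextValue(), mem.NextValue(), disk.NextValue());
        }
    }
}
```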

the-wabbit