3

I need to serve at least 400 concurrent users about 150MB of files (video/audio). What kind of hardware would you suggest is required? I'm planning to use Ubuntu for the OS and Apache for serving.

The usage is on an internal network, not over the internet.

I'm specifically looking for an idea of the HDD speed, amount of RAM, and processor you think would be required.

  • is 150MB your whole library, or more like average size for each file? – Javier Oct 11 '10 at 14:49
  • 150 MB is the average size of each file. All video is FLV and audio is MP3. 400 users will be using the application we have built at the same time, so maybe 150 concurrent (same-second) requests is what we should be expecting. –  Oct 11 '10 at 19:55

3 Answers

6

Ah - my favourite subject!

Presumably you'll just be playing back static, pre-encoded files, right? What you want to do is work out the average bit rate of your content first; this leads the way for all the other things you'll need to work out.
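
As a rough illustration of that arithmetic (the 150MB size is from the question; the 10-minute play length is just an assumed example, not something the OP stated):

    # Back-of-the-envelope bit-rate estimate (assumed numbers - adjust to your content)
    file_size_mb = 150        # average file size from the question, in megabytes
    duration_min = 10         # ASSUMED average play length, in minutes
    avg_mbps = (file_size_mb * 8) / (duration_min * 60)   # megabits per second per stream
    concurrent_streams = 400
    aggregate_mbps = avg_mbps * concurrent_streams
    print(f"~{avg_mbps:.1f} Mbps per stream, ~{aggregate_mbps:.0f} Mbps aggregate")

With those assumed numbers that comes out to roughly 2Mbps per stream and ~800Mbps aggregate, which is why the NIC sizing further down matters so much.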

Now, for only 150MB of content you'll be able to cache it all easily, so you won't have to worry about your disk speed (although this will change if you start to grow this content store). So what you need to know is how CPU-intensive this work is (hint: probably not very, if it's just serving fixed files - most of the time your CPUs will be waiting for the NICs). That said, you want at least two 2-3GHz CPUs, probably more, but not silly amounts unless you're expecting much growth or you're using the same machine to do transcoding (which is a bad idea anyway) - I'd stick with either a single-socket Xeon (36xx series) or a dual-socket Xeon (56xx series).

You'll want 4GB of memory (it's cheap, anything less is skimping, and unless the machine is doing more work there's no point going higher than 4GB right now).

Make sure you have a mirrored pair of small-ish/slow-ish boot/OS disks and then another mirrored pair of data disks - for now I'd save money here, knowing you can get more/faster disks when the content store grows.

For the OS, whatever you choose, there's no reason to go with anything but 64-bit these days. If there's no 64-bit driver available for a component then don't put it in your machine - these vendors have had half a decade to rewrite their drivers; if they can't do that, they're not working hard enough for your £$€whatever.

Now onto the most important bit: NICs. You'll want two in a teamed pair to handle failures - go for a big name, ideally a server-class card that supports things like interrupt coalescence and TOE/LSO - these will help a lot. Then you need to figure out what speed these NICs should be - there are really only two variants you should consider: 1Gbps and 10Gbps.

A 1Gbps NIC can send ~80-85MBps of traffic when fully driven - that works out at about 200KBps or ~2Mbps per user for 400 concurrent streams - which is quite a bit actually; it's roughly full-screen SD quality. If your content is encoded at or above this figure then I'd suggest you go to 10Gbps NICs on day one - they're generally not exactly ten times quicker, as they're harder to 'fill', but they'll stop you having teething problems on day one.
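
To make that figure concrete, here's the same arithmetic written out (the 85MBps NIC throughput is the estimate from the paragraph above, not a measured value):

    # Sanity check on the per-user figure for a single fully driven 1Gbps NIC
    nic_throughput_mbytes = 85          # ~realistic payload for a 1Gbps NIC, in MBps
    streams = 400
    per_stream_kbytes = nic_throughput_mbytes * 1024 / streams   # ~218 KBps
    per_stream_mbps = nic_throughput_mbytes * 8 / streams        # ~1.7 Mbps
    print(f"~{per_stream_kbytes:.0f} KBps (~{per_stream_mbps:.1f} Mbps) per stream")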

Of course, don't forget that your entire network will need to handle that amount of traffic too; switches, routers, firewalls, load balancers, etc. will all need to be able to clear that sort of load - plus your actual internet links too, of course.

Good luck.

BTW - I do this kind of thing for around 500k users, most at ~1.5Mbps (some at ~6Mbps).

Chopper3
1

Sending files is an I/O activity, so your CPU will not take a huge hit from pulling a file and pushing it to your users.

Offhand, I would suspect you would want any current CPU, a good amount of RAM (the more the better for a server), and the fastest HDDs you can get.

I do not have any specific numbers to give, though. These are just my initial thoughts; hopefully someone can expand on them with concrete numbers/data.

Chris
  • Can't give concrete numbers without knowing how the OP's application works, and ideally whether the OP has tested it out on a smaller scale to see how it performed. 400 users occasionally hitting a server to download a file is different from 400 users simultaneously hitting a server streaming HD video, 150-meg files or not. – Bart Silverstrim Oct 11 '10 at 12:30
  • I took it as 400 users downloading files at the same time because of "concurrent", but as you mention this is probably not the case. – Chris Oct 11 '10 at 13:09
1

You don't mention things like compression or codecs, whether you're just copying files over or streaming them, whether you'll farm them out across servers or use a single server, whether these are simultaneous users, etc.

The best advice I can give is to get the fastest drives you can, and maybe RAID 10 them with hardware RAID (I don't know if this is a business-critical system). Get the best quad-core processor you can, although the load probably won't be as heavy on the processor as on the drive subsystem and network card. Be more concerned about your network card being top quality and gigabit speed (get two and team them up if possible, with a correctly configured Cisco switch). Get as much memory as possible for caching purposes, 64 gig or more.
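
To put the memory suggestion in context, here's a rough sketch of how much of the library would fit in the page cache (the number of files is purely hypothetical, since the question only gives the average file size):

    # Rough page-cache sizing sketch (library size is hypothetical)
    avg_file_mb = 150                  # average file size from the question
    library_files = 500                # ASSUMED number of files in the library
    library_gb = avg_file_mb * library_files / 1024   # ~73 GB
    ram_gb = 64
    cached_fraction = min(1.0, ram_gb / library_gb)
    print(f"Library ~{library_gb:.0f} GB; ~{cached_fraction:.0%} fits in {ram_gb} GB of RAM cache")

If the working set (the files actually being watched at any one time) is much smaller than the whole library, less RAM will still give you a high cache-hit rate.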

The reality is that it depends on your actual usage. Have you tested this on a smaller scale? If so, on what hardware, and how did it perform? When do your users complain about performance? What bottlenecks did you encounter?

This question as posed is a little vague on details of your implementation.

Bart Silverstrim