
Suppose that in a peak hour there are 100,000 users who want to watch a video, and that on average each view consumes 10 megabytes. Might be more, might be less. For simplicity, let's say 1 terabyte needs to be served in that hour. That works out to a sustained rate of about 2222 megabits per second.
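A quick sanity check of that arithmetic (all numbers are the rough figures from the question itself):

```python
# Back-of-envelope check of the sustained rate.
users = 100_000                 # viewers in the peak hour
mb_per_view = 10                # megabytes per view (rough average)

total_bytes = users * mb_per_view * 1_000_000   # ~1 TB in the hour
bits = total_bytes * 8
seconds = 3600

mbps = bits / seconds / 1_000_000
print(round(mbps))              # ~2222 Mbit/s sustained
```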

The video does not need to be streamed with specific tools; we plan on using some nginx/lighttpd pseudostreaming. Because CDNs are expensive at that rate, we would rather not use one.

How many servers do we need (for the network traffic alone), speaking in terms of Xeon quad core servers with 1 gbit/sec connection? What's the maximum on a gigabit connection?

Edit:

To give you more details: In this peak hour, there are maybe a dozen videos that are accessed. The actual HTML that contains the player etc. is memcached. We already had that much traffic, so that part works. We 'just' need to do it with video now without exploding costs.

webjunkie
  • "What's the maximum on a gigabit connection?" That would be 1Gbps, hence the name. You'd need at least 3 separate gigabit connections to hand out 2.22Gbps, and you'd want at least one extra for redundancy. You'll want some room to grow, hence Chopper's answer. – Chris S Jan 25 '12 at 13:48
  • Just to check we're on the same page: several 1Gbit lines that you intend to keep fully loaded at all times are *not* expensive, then? (and does your provider have sufficient peerings so that your 1Gbit of traffic is not routed through East Elbonia and back, en route to your users?) – Piskvor left the building Jan 25 '12 at 13:56
  • @Piskvor: No, the 1 gbit lines are not that expensive, and because we do not need to geographically distribute the traffic, we could use included traffic from our hosting partner and additionally buy it for 1/10 of CDN costs. – webjunkie Jan 25 '12 at 14:05
  • I think you're looking at about $1950 USD a month per 1Gb line, roughly; so $5850 a month, plus hardware (both network and servers). – Sirex Jan 25 '12 at 14:09
  • @Sirex: Our hosting partner charges ~$50/month to upgrade from 100 Mbit/s to 1 Gbit/s. – webjunkie Jan 25 '12 at 14:13
  • @Chris S: I don't think 1 Gbit is the maximum. With routing information and other overhead it will likely be a fraction. That's why I want to hear from someone with experience. – webjunkie Jan 25 '12 at 14:18
  • @webjunkie: Aha, thanks for the clarification. – Piskvor left the building Jan 25 '12 at 14:21
  • @webjunkie Yes, you lose about 20% to overhead; that's pretty standard stuff. At my provider the lines are basically free, you pay for transfer (which would be a few thousand a month for you). But I know of other providers that could do this cheaper, as Sirex said. – Chris S Jan 25 '12 at 14:33
  • @webjunkie You're probably just paying to upgrade the maximum connection speed of your server, but your actual bandwidth may still be metered in GBs or TBs. If you're being promised unlimited 1Gbps for just $50 a month more, you may want to look at the small print. – gekkz Jan 25 '12 at 14:46
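Putting the comment thread's numbers together: with the ~20% overhead figure Chris S quotes (real TCP/IP-over-Ethernet overhead is often lower, but this is the conservative figure used above), each gigabit line yields roughly 800 Mbit/s of usable payload, so the 2222 Mbit/s target needs three lines plus a spare:

```python
import math

required_mbps = 2222            # sustained rate from the question
line_mbps = 1000                # one gigabit line
overhead = 0.20                 # rough overhead figure from the comments

effective = line_mbps * (1 - overhead)          # ~800 Mbit/s usable per line
servers = math.ceil(required_mbps / effective)  # lines needed at full load
print(servers)                  # 3 lines, plus at least one spare for redundancy
```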

1 Answer


One single socket quad-core Xeon with a PCIe x8 based 10Gbps ethernet NIC will be able to deliver that 2.2Gbps easily using either Windows or Linux without breaking a sweat. Of course that's if you have more than 1Gbps of bandwidth - you've limited yourself in this scenario by only having 1Gbps available so that's the wall you'll hit.

The complex bit comes if those 100k views come from a library of thousands of video clips as it's the storage part that needs to keep up with the CPU/bus/NIC chain.

So that answers this question, but what you now need to tell us are the storage metrics, and we can work on that for you:

  • total storage
  • max no. of videos
  • min/ave/max size of videos
  • ideally codec(s) used
  • memory in server
  • nature of those 100k plays - i.e. split over how many of the stored videos
Chopper3
  • Thanks for your help. I added additional information. But please don't focus on the storage stuff. The traffic will hit almost static pages and be spread over a dozen videos or so. I hope that helps. – webjunkie Jan 25 '12 at 14:16
  • If you can memory-map/cache your video then my statement still stands that even cheapo modern kit will flood that kind of traffic no problem. I've been doing VoD systems for ~6 years now and have ~1m paying users playing back from a library of 14,000 SD & HD assets across the country - I'm jealous of your much simpler requirements :) good luck – Chopper3 Jan 25 '12 at 14:21
  • But there won't be a 10Gbps server available... so just load balance it across 3-4? – webjunkie Jan 25 '12 at 14:39
  • Load balancing should work fine, you should be able to do this when embedding the player by outputting a random server address from your cluster as the video source, or using DNS via round-robin, or using a physical load balancer, though the latter will likely cost you. – gekkz Jan 25 '12 at 14:49
  • LB'ing works fine, we do it, but it does mean you have to manage some form of centralised storage for the content or internal content distribution, and manage cache coherency. – Chopper3 Jan 25 '12 at 15:14
  • I guess I could just rsync the videos across all servers for now...? – webjunkie Jan 31 '12 at 13:39
  • I think you'll find that your hosting provider will not like you using anything close to 1 Gbps per server sustained, check your terms of service. Also, I suspect unless they're a really large provider, they will have at most 2-3 10 Gbps pipes into the datacenter from different providers. You'll be competing with other customers for those limited resources. – rmalayter Apr 16 '12 at 22:00
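The random-server approach gekkz describes is a few lines when rendering the player embed. A minimal sketch, assuming the video files are kept in sync across a pool of servers (e.g. via rsync, as discussed); the hostnames and URL layout are hypothetical:

```python
import random

# Hypothetical pool of mirror servers that all hold the same video files.
VIDEO_SERVERS = [
    "video1.example.com",
    "video2.example.com",
    "video3.example.com",
]

def video_url(clip):
    """Pick a random server for the player embed, spreading load
    roughly evenly across the pool."""
    return "http://%s/videos/%s" % (random.choice(VIDEO_SERVERS), clip)

# Rendered into the (memcached) HTML page for each view:
print(video_url("intro.mp4"))
```

DNS round-robin achieves much the same effect without touching the page template, but picking the server at render time makes it trivial to drop a host from the pool when it fails.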