
Anyone have any suggestions for methods of load-testing wireless networks for enterprise network deployments?

We've got a wide array of different wireless worst-case scenarios to support.

  • rooms with either nobody or 300 people, each of whom has 1-3 wireless devices and wants to use all of them, potentially at once (large classrooms)
  • buildings with lots of people who want to BitTorrent the latest episodes of "My So-Called Life" or "Shania Twain" albums on their brand-new 802.11n laptops, all while playing Counter-Strike (student dorms)
  • large spaces where people expect to roam from building to building, all in a large metro area with lots of non-enterprise 802.11 networks and other noise.

We're planning to buy a new generation of 802.11 network, and since we're not happy with our existing vendor, it's going to be a forklift upgrade.

Is there any way of simulating the use cases above without buying 800 laptops and hiring student droids to "pretend" to surf the internet or do homework? When we have bake-offs, it would be great if we could do things in a controlled and repeatable environment.
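To give an idea of what I mean by "controlled and repeatable": the rough Python sketch below is about the level of scripting I could put together myself -- simulated clients that fetch a page on a schedule and record latency, so the same workload can be replayed for every bake-off. The target URL, client count, and request count are just placeholders, and this only exercises the traffic pattern, not the RF behaviour I actually care about.

    # Minimal sketch of a scripted "fake student" client. Each thread acts as one
    # user, repeatedly fetching a page and recording latency. The URL and counts
    # below are placeholders, not recommendations.
    import concurrent.futures
    import statistics
    import time
    import urllib.request

    TARGET_URL = "http://intranet.example.edu/testpage"  # hypothetical target
    CLIENTS = 20                                          # threads per test box
    REQUESTS_PER_CLIENT = 50

    def one_client(client_id):
        latencies = []
        for _ in range(REQUESTS_PER_CLIENT):
            start = time.monotonic()
            with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
                resp.read()
            latencies.append(time.monotonic() - start)
            time.sleep(1)  # crude "think time" between page loads
        return latencies

    with concurrent.futures.ThreadPoolExecutor(max_workers=CLIENTS) as pool:
        per_client = list(pool.map(one_client, range(CLIENTS)))

    all_latencies = [t for client in per_client for t in client]
    print(f"median {statistics.median(all_latencies):.3f}s, "
          f"max {max(all_latencies):.3f}s over {len(all_latencies)} requests")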

I'm aware of the Ixia testing devices, which seem to do roughly what we want, but without expertise in how to properly do the testing, I'm not sure we'd get it right. So, in addition to alternatives to those devices (or five of them placed around the building), are there any firms that come out and do site-wide testing?

Also, let's say I've got a Linux / FreeBSD box with an Atheros wireless radio -- are there any good tools for inspecting existing wireless traffic to give breakdowns of spectrum utilization, packet retransmissions, and so on?
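For example, here's the kind of breakdown I'm after, as a rough Python/scapy sketch (assuming the Atheros card has already been put into monitor mode as a hypothetical "wlan0mon" interface, and that the script runs as root) -- I'd love a tool that does this properly, per channel, over time:

    # Rough sketch: count how many 802.11 data frames seen on a monitor-mode
    # interface carry the Retry bit, i.e. are retransmissions.
    from collections import Counter

    from scapy.all import sniff
    from scapy.layers.dot11 import Dot11

    RETRY_BIT = 0x08  # Retry flag in the 802.11 frame-control field
    stats = Counter()

    def tally(pkt):
        """Count data frames and how many of them are marked as retries."""
        if pkt.haslayer(Dot11) and pkt[Dot11].type == 2:  # type 2 = data frames
            stats["data"] += 1
            if pkt[Dot11].FCfield & RETRY_BIT:
                stats["retries"] += 1

    # Capture for 60 seconds on the monitor interface, then report the ratio.
    sniff(iface="wlan0mon", prn=tally, store=False, timeout=60)
    if stats["data"]:
        pct = 100 * stats["retries"] / stats["data"]
        print(f"{stats['retries']}/{stats['data']} data frames ({pct:.1f}%) were retransmissions")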

chris
  • In a small area like that you will be doing well to get 30 at a time working nicely, forget about 300. – JamesRyan Feb 09 '10 at 01:47
  • It really really depends on how many APs you're using, the power they're transmitting at, the bands the clients are using, and how the frequency is managed. A nice test bench would be good for identifying which wireless solution works best in a worst case scenario in a controlled and repeatable way. Or at least, that's the ideal... – chris Feb 09 '10 at 03:06

3 Answers


Interesting question!

I worked with some very smart people at TechEd Australia trying to resolve poor wireless performance issues in the Gold Coast Convention Center, and some detailed analysis on spectrum usage was required. It's all documented in this blog post: http://www.techedbackstage.net/2009/07/15/diagnosing-and-resolving-extremely-high-rf-utilisation/

Although this focuses on improving existing wireless infrastructure, it may give you a good starting point.

Best of luck

commandbreak

I think Ixia has an even better solution than their hardware devices for your testing. Take a look at IxChariot, http://www.IxChariot.com . This is the solution set used for Wi-Fi Alliance certification testing, so it should more than meet your needs in software. You can use one network (hardwired) for test setup and results collection and another (wireless) for your actual testing.

IxChariot is an all-software solution, but if you need MAC address generation then you are back to looking at hardware load generators.

James Pulley

I just spoke to some people on the NJ Sysadmins List (http://www.njsysadmins.org) who work in schools (typically K-12) and benchmark the time it takes their machines to connect to the network and load remote desktops from a centralized server.

What they did was load a cart with 30 laptops into the room, and power them all on simultaneously. The metrics they were looking for were "time to bind" to the AD server, and "time to full desktop". They charted the minimum time and the maximum time for varying loads (30 laptops and 60 laptops).

I don't think that kind of testing scales up, but in their rooms, they rarely get more than 30 laptops at once.
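If you wanted to script that kind of measurement rather than watch stopwatches, a very rough Python sketch might look like the following. The server host/port and client count are placeholders, and a real "time to full desktop" number would need to drive the actual remote-desktop client; this just times the initial TCP connection from many simulated clients at once.

    # Sketch of the "time to connect" metric: each simulated client times how long
    # a TCP connection to a central server takes, and the min/max across all
    # clients is reported. Host, port, and client count are placeholder values.
    import concurrent.futures
    import socket
    import time

    SERVER = ("rds.example.edu", 3389)  # hypothetical remote-desktop server
    CLIENTS = 30                        # mirrors "30 laptops on a cart"

    def time_to_connect(_):
        start = time.monotonic()
        with socket.create_connection(SERVER, timeout=30):
            pass  # connection established; closing it is enough for this metric
        return time.monotonic() - start

    with concurrent.futures.ThreadPoolExecutor(max_workers=CLIENTS) as pool:
        times = list(pool.map(time_to_connect, range(CLIENTS)))

    print(f"min {min(times):.2f}s, max {max(times):.2f}s across {CLIENTS} clients")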

Are you gathering metrics right now? It might be possible to extrapolate from those, and carefully change variables to determine what causes performance increases and what doesn't.

Matt Simmons