If you are performing load testing at high packet rates, the safest way is to isolate it completely from the rest of the network. For example, you can connect two servers with a direct 10 Gbps link without a switch, and use a separate LAN connection on the benchmarking server to ssh in and run the test.
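As a sketch, assuming the direct link is on `eth1` on both machines (interface names and addresses here are placeholders), a point-to-point /30 subnet keeps the test traffic off your LAN:

```
# on the server under test
ip addr add 10.0.0.1/30 dev eth1
ip link set eth1 up

# on the benchmarking server
ip addr add 10.0.0.2/30 dev eth1
ip link set eth1 up
```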
Another way is to provision servers in a public cloud like AWS for a short period of time and run the test there. You don't risk impacting your own infrastructure, and cloud infrastructures are resilient, so if you flood your server with packets it's just a small blip in a much larger stream of traffic. Of course, it would be good to provision a big, fast machine, or even a dedicated one, so that you can really see how much load your software takes.
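A hedged sketch with the AWS CLI (the AMI ID, key name, and security group below are placeholders; a network-optimized instance type gives more headroom for packet-rate testing):

```
# launch two instances: one target, one load generator
aws ec2 run-instances \
    --image-id ami-0123456789abcdef0 \
    --instance-type c5n.xlarge \
    --count 2 \
    --key-name my-bench-key \
    --security-group-ids sg-0123456789abcdef0
```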
Regarding software, you can start with the generic `ab` ("Apache Bench"), which ships with the Apache httpd packages. You can test the resilience of your web server (static files from httpd) and your application server (dynamic pages from PHP, Ruby, Java).
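For example, a minimal run might look like this (the target URL is a placeholder for your test server):

```
# 10,000 requests total, 100 concurrent, with HTTP keep-alive
ab -n 10000 -c 100 -k http://10.0.0.2/index.html
```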
You might also look at professional load testing software for specific protocols, including API and video streaming testing. There are many options depending on which protocol you are using (REST, etc.).
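As one example among many (an assumption on my part, not a tool named above), the open-source `wrk` tool generates much higher HTTP load than `ab` from a few threads; the URL is again a placeholder:

```
# 4 threads, 200 open connections, for 60 seconds
wrk -t4 -c200 -d60s http://10.0.0.2/
```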
Then proceed with the usual network stack tuning as well as application tuning to gain more performance (and tune the client side too, e.g. raise the `ulimit` on open file descriptors). Keep a record of your results. Also, analyze the logs properly with dedicated software, either free like AWStats or commercial like Sawmill; avoid using grep ;-) Proper analytics is what will show you the real results.
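As a sketch, typical knobs on a Linux box look like this (the values are illustrative, not recommendations; adjust for your hardware and workload):

```
# allow more open file descriptors in the current shell
ulimit -n 65535

# widen the ephemeral port range and connection backlogs
sysctl -w net.ipv4.ip_local_port_range="1024 65000"
sysctl -w net.core.somaxconn=4096
sysctl -w net.core.netdev_max_backlog=10000
```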
Also note that `ab` works well against httpd but sometimes has issues with other web servers. In AWS you can also put a Load Balancer in front, which shows real-time stats. Record network stats with Nagios, Zabbix, etc. to see how the network stack performed (dropped connections, packet rate, CPU usage, and so on).
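If you just want a quick look during a run without a full monitoring stack, standard Linux tools capture the same signals (`sar` is part of the sysstat package):

```
# per-interface packet/byte rates, one sample per second
sar -n DEV 1

# socket summary: open, orphaned, and time-wait counts
ss -s

# kernel network counters, including drops and retransmits
nstat
```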
Never hire a DDoS attack as a "load test": these are run by criminal gangs and you risk a lot, not only legally for yourself but also for your ISP, your business, and your customers.