
I have an HTTP server that serves images generated from a database. It has some caching built in, etc. It generates map tile images for JavaScript-based web maps (à la Google Maps etc.). We're thinking of switching the underlying technology, but the same results will be coming out. However, we'd like to load test the new stack first, to ensure that it can perform about as well as the old one.

We have lots of Apache access logs from our production servers, in the standard/default Apache log format. I'd like to use these URLs (and timestamps) as a stress test for the new server, by replaying the same URLs, in the same order, with the same delay between requests. This should simulate real-world user behaviour, rather than just firing all the URLs at the same time.

Is there any open source software, running on Linux, that will take an Apache log file, parse out the URLs (I can already do that with my apache-log-parser Python library), and then "replay" those HTTP requests against the server, with the appropriate delay between requests?

The output I'd like is approximate average response times from the server, to see if it can handle the real-world load.
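For reference, here's roughly what I have in mind, as a minimal single-threaded Python sketch. The target host, log filename, and log format string are placeholders, and I'm assuming apache-log-parser's make_parser() returns dicts with request_url and time_received_datetimeobj keys:

    #!/usr/bin/env python
    # Minimal replay sketch (untested). Assumes the Apache "combined" log
    # format; adjust the format string to match your configuration.
    import time
    import requests
    import apache_log_parser

    parser = apache_log_parser.make_parser(
        '%h %l %u %t "%r" %>s %b "%{Referer}i" "%{User-Agent}i"')

    BASE_URL = "http://new-tile-server:8080"  # placeholder target host

    prev_timestamp = None
    response_times = []

    with open("access.log") as log:
        for line in log:
            entry = parser(line)
            ts = entry["time_received_datetimeobj"]
            # Sleep for the gap between this request and the previous
            # one, so the replay mimics the original traffic pattern.
            if prev_timestamp is not None:
                gap = (ts - prev_timestamp).total_seconds()
                if gap > 0:
                    time.sleep(gap)
            prev_timestamp = ts

            start = time.time()
            requests.get(BASE_URL + entry["request_url"])
            response_times.append(time.time() - start)

    print("requests: %d  mean response time: %.3fs"
          % (len(response_times), sum(response_times) / len(response_times)))

Something along these lines would work, but I'd rather use an existing tool than maintain my own replay script.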

I have Munin on the server so I can check for load.

