37

I am looking for a tool to test a website from a Linux command line.

From the output, I need to know the HTTP response (status codes), but I also need to benchmark the time it takes to download the different elements of the site.

Thank you in advance.

Embreau

10 Answers

44

You can try wget with the -p option:

wget -p http://site.com

It will tell you how long it takes to download each element and the return codes for each request.
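If you also want the full response headers for each request, the -S / --server-response flag can be added (a sketch; the log file name is just an example):

wget -p --server-response -o wget.log http://site.com
grep "HTTP/" wget.log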

Dan Andreatta
9

Please see ApacheBench (ab), the benchmarking tool that ships with the Apache HTTP server.

This should give you an overview of your page's performance.
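A minimal invocation looks something like this (the request count and concurrency level are just examples):

ab -n 100 -c 10 http://site.com/

The summary includes the time per request and the distribution of response times, and any non-2xx responses are counted separately.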

andre
  • Apache Benchmark cannot be used for this since it is an external source; in fact, it is a YouTube video playlist. We are monitoring the access to this resource. TY – Embreau Mar 22 '10 at 14:42
  • I don't see why you can't use ab; like wget in the next answer, it will work as long as the URL of your site is accessible from the machine where you're running the benchmarking tool. – gareth_bowles Mar 22 '10 at 14:46
  • Apache benchmark is not restricted to local resources, it's meant to be a full performance measuring tool (including network latency, i/o, etc). – andre Mar 22 '10 at 14:52
  • Good to know, I will evaluate this option, thank you. – Embreau Mar 22 '10 at 16:31
8

You may want to look at the following options of curl:

  • --write-out - displays any of several variables, including timing variables (see the example after these lists)
  • --trace-time - Prepends a time stamp to each trace or verbose line
  • --verbose
  • --include - (HTTP) Include the HTTP-header in the output.
  • --trace-ascii <file> - Enables a full trace dump of all incoming and outgoing data, including descriptive information

And the following option of wget:

  • --timestamping - Turn on time-stamping
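
For example, a single timed request using --write-out could look like this (the variable names are standard curl --write-out variables; the URL is a placeholder):

curl -o /dev/null -s -w 'HTTP %{http_code}  DNS %{time_namelookup}s  connect %{time_connect}s  TTFB %{time_starttransfer}s  total %{time_total}s\n' http://site.com/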
Dennis Williamson
3

Selenium and Curl are good options depending on what your goal is. Also, a utility that I've come to like quite a bit is twill. More information is available at http://twill.idyll.org/.

It's nice as it has its own little specialized language for filling out forms, validating links, and checking response codes. Since it's just Python code, you can easily import the libraries and automate your tests yourself if you'd like to do something different.

McJeff
2

Use curl to get the header for the page, and time the process:

time curl -I http://yourpage.com | grep HTTP

Wrap that in a while loop and you're good to go. In the same way, you can check each element of the page if you know its URL.
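
A minimal sketch of such a loop, assuming a five-second polling interval (-s is added to quiet curl's progress meter):

while true; do
    time curl -sI http://yourpage.com | grep HTTP
    sleep 5
done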

Sideshowcoder
1

Try a command-line tool called 'siege', an HTTP load-testing and benchmarking utility.
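
A basic run might look like this (the concurrency and duration values are just examples):

siege -c 10 -t 30S http://site.com/

The summary it prints includes availability, average response time, and transaction rate.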

lefterav
1

What tool you choose depends on what you want to measure and the complexity of the site.

If the behaviour of the site is dependent on cookies (e.g. the user needs to log in) then ab / curl / wget (described in other answers) will not suffice. One solution is to use the Perl modules HTTP::Recorder / WWW::Mechanize.

All the data you are asking for is in your webserver logs - and a simple awk script will return it in a more readable form (a sketch follows).
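
As a sketch, assuming the common combined log format with the response time (%D, in microseconds) appended as the last field, something like this prints the status code, the requested path, and the time in milliseconds:

# assumes %D is the last field of each log line; the log path is an example
awk '{ printf "%s %s %.1f ms\n", $9, $7, $NF/1000 }' /var/log/apache2/access.log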

benchmark the time it takes to download the different elements of the site.

This is a very poor indicator of performance (although it is useful for monitoring the health of a production system). With the exception of large/slow resources such as bulky reports, ISO images, and multimedia files, the perception of performance has very little to do with the time taken to process a single request - and it's really difficult to measure this accurately (simply adding %D to your Apache log appears to solve the problem, but it ignores TCP handshakes, SSL negotiation, caching effects, and DNS lookup times).

A better solution is to use something like Boomerang - but that runs in a JavaScript-capable browser. While this gives a better indicator of perceived performance than tracking individual HTTP requests, it relies on browser events to derive a value for the performance - but perceived performance is all about the time taken for the viewport to render (again, there are tools for this - have a look at the filmstrip tools in WebPageTest).

There is also the argument about measuring the performance actually delivered to users of the site (real user monitoring, RUM) vs. synthetic testing.

symcbean
0

If you are going to need something bigger than curl and/or wget, there is also Selenium.

Unreason
0

I think for running performance tests you can try JMeter. You can record your test using the built-in proxy. It also runs in text mode, local or distributed. You can save your results in CSV or XML format. If using the XML format, you can also store the content of the page.
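
Once you have recorded a test plan, a non-GUI run could look like this (the file names are placeholders):

jmeter -n -t testplan.jmx -l results.csv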

ghm1014
0

For checking headers, I like httpie.

Installation

pip install httpie --user

Usage

$ http -h http://serverfault.com/q/124952/113899
HTTP/1.1 302 Found
Accept-Ranges: bytes
Age: 0
Cache-Control: private
Connection: keep-alive
Content-Length: 198
Content-Type: text/html; charset=utf-8
Date: Fri, 06 Jan 2017 10:01:06 GMT
Location: http://serverfault.com/questions/124952/testing-a-website-from-linux-command-line
Set-Cookie: prov=392298d9-103e-7dfc-5b55-8738be46cf3b; domain=.serverfault.com; expires=Fri, 01-Jan-2055 00:00:00 GMT; path=/; HttpOnly
Via: 1.1 varnish
X-Cache: MISS
X-Cache-Hits: 0
X-DNS-Prefetch-Control: off
X-Frame-Options: SAMEORIGIN
X-Request-Guid: 07a25de4-6e16-4aa8-acfc-12b1bbbc6180
X-Served-By: cache-hhn1543-HHN
X-Timer: S1483696865.976259,VS0,VE187
Martin Thoma