Stricto sensu, you cannot really have a generic test. In HTTP, the client announces whether it supports compression with an Accept-Encoding header line; the server may then use one of the announced compression schemes. @Adnan points to this blog post, which describes how one can manually send an HTTP request to a server and inspect what the server responds with. As an audit method for checking HTTP compression support, this has two problems:
There is no finite list of possible compression algorithms. At least gzip, deflate and compress are common. With TLS-level compression, at least, each "compression method" was identified by a single byte, so there were only 255 possible compression methods (not counting the "no compression" method), and it was possible to be exhaustive. This is not the case here.
The server is free to apply compression or not, as it sees fit, on any document. For instance, the Apache documentation shows how the decision to compress can be made to depend on the type of the target document, but also on its location in the site's directory tree. IIS makes a distinction between "static compression" and "dynamic compression": static compression is applied only on pages or documents that exist as physical files on disk. In the case of IIS, the decision to compress or not is thus taken depending on how the requested page is generated, so it can vary for each individual URL...
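As an illustration of the second problem, here is a hypothetical Apache fragment in the spirit of the mod_deflate documentation, where the compression decision depends on both the content type and the location in the directory tree (the paths are invented):

```apache
# Compress common text types site-wide...
AddOutputFilterByType DEFLATE text/html text/plain text/css

# ...but not under /var/www/archives, where files are pre-compressed.
<Directory "/var/www/archives">
    SetEnv no-gzip 1
</Directory>
```

A single probe against one URL therefore says nothing about the rest of the site.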
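The manual check mentioned above (send a request advertising some content codings, then look at what the server picked) can be sketched as follows. This is only a sketch in Python, assuming a target that speaks plain HTTP on port 80; the helper names are mine, not from the blog post, and a negative result proves little, since the server may compress only some URLs.

```python
import socket

def build_probe_request(host, path="/", encodings=("gzip", "deflate")):
    """Build a raw HTTP/1.1 request advertising the given content codings."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Accept-Encoding: {', '.join(encodings)}\r\n"
        "Connection: close\r\n\r\n"
    ).encode("ascii")

def parse_content_encoding(response: bytes):
    """Return the Content-Encoding the server announced, or None."""
    headers = response.split(b"\r\n\r\n", 1)[0]
    for line in headers.split(b"\r\n")[1:]:  # skip the status line
        name, _, value = line.partition(b":")
        if name.strip().lower() == b"content-encoding":
            return value.strip().decode("ascii")
    return None

def probe(host, port=80, encodings=("gzip", "deflate")):
    """Send one probe and report which coding, if any, the server chose."""
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall(build_probe_request(host, encodings=encodings))
        response = b""
        while chunk := s.recv(4096):
            response += chunk
    return parse_content_encoding(response)
```

Note that, per the first problem above, the `encodings` tuple can only ever be a sample: there is no way to enumerate every coding a given server might support.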
Not having many details on the attack yet, I still feel that it is, by nature, very specific to each target site, so we have a few days before us; there is no need to panic already and declare a ban on compression. Let's first get some details on the actual attack. In other words, the "emerging advice" seems a bit premature to me.