
For a few weeks we observed a collision in session ID generation: two operators independently connected to a test web application ended up sharing the same session. To investigate, we exercised the session ID generator to produce between 500 000 and 1 000 000 session IDs under various generation settings. Some settings are deliberately bad, to obtain reference values for weak configurations. The result of this experiment is 5 files of session IDs (hexadecimal strings), one per line, each file holding between 500 000 and 1 000 000 records.

Which numerical indicators can I derive from these series and use as a proxy for security quality? For me that means (1) knowing how unlikely collisions are and (2) knowing how hard session prediction is.

  • Have you got access to the application source code? A similar question has already been asked: [Identifying, analyzing, and predicting weak session cookies](https://security.stackexchange.com/questions/23274/identifying-analyzing-and-predicting-weak-session-cookies). Its conclusion is that, while there indeed exist some tools to analyse such data files, the preferred way remains to analyse how these session IDs are generated and which source of entropy is used. – WhiteWinterWolf Jul 22 '15 at 14:42
  • @WhiteWinterWolf Yes, I have access to the source code, which is the standard PHP session generator: it computes a hash involving the IP address, some date elements and the output of `/dev/urandom`. This code runs natively on Linux/BSD and on Docker containers in virtualised environments (VirtualBox, EC2). I want to measure the influence of `/dev/urandom` on the function – but maybe it is easier to start by measuring the quality of `/dev/urandom`? – Penelopa Koyfman Jul 22 '15 at 15:01

1 Answer


Following your comments, it seems strange that you got duplicates if you are actually using `/dev/urandom` as an entropy source and sufficiently long session IDs.

  • Are you sure that PHP is correctly configured? As of PHP 5.4.0, PHP's defaults are secure, but it happens that some historical server settings remain over time and decrease the security compared to current defaults (PHP may not actually be using `/dev/urandom`, or it may gather too little entropy from it),
  • Are you sure that the issue hasn't been caused by a third-party issue, like involuntary session fixation? Such a bug could occur, for instance, if the session ID can be passed as a URL parameter and a URL containing the session ID is shared in some way, either explicitly (URL sent from one person to another) or in some unexpected way (content caching at the server or at a proxy, for instance).

`/dev/urandom` can generally be considered a trustworthy entropy source. If by any chance you still want to investigate it, you can use the ENT tool for this purpose. You can find a usage example on this page.

A quick and dirty check on your files would be to ensure that no collision was encountered during your session ID generation test:

LC_ALL=C sort /path/to/sample_file | uniq -d

Replace /path/to/sample_file with the actual file containing your generated session IDs; the command outputs all duplicated entries, so under normal circumstances it should produce no output at all.
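To put a number on how unlikely a collision should have been, you can apply the birthday-bound approximation to your sample size and ID length. A minimal sketch (assuming a 32-hex-character PHP session ID, i.e. at most 128 bits):

```python
import math

def collision_probability(n_ids: int, id_bits: int) -> float:
    """Birthday-bound approximation: P(collision) ~= 1 - exp(-n(n-1) / 2^(bits+1))."""
    return -math.expm1(-n_ids * (n_ids - 1) / 2 / 2 ** id_bits)

# One million IDs drawn uniformly from a 128-bit space:
p = collision_probability(1_000_000, 128)
print(p)  # astronomically small
```

For your worst case of 1 000 000 IDs, this probability is on the order of 10^-27: observing even a single duplicate is overwhelming evidence that the effective entropy of the generator is far below 128 bits.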

Testing the actual security quality of these files is trickier. A session ID must not only be unique, it must also be unpredictable, which means it should be evenly and randomly spread over a sufficiently large space of possibilities. This may be difficult to demonstrate properly. As said in my comment, this topic has already been discussed, and while some web application security tools offer session ID analysis, the conclusion was that the most reliable way remains to analyse the application code and configuration itself.
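One cheap numerical indicator you can still extract from the files is the per-character-position entropy: constant prefixes or timestamp-derived positions show up immediately as positions with far less than the ideal 4 bits per hex character. A minimal sketch (the input file name is hypothetical; use one of your sample files):

```python
import math
from collections import Counter

def per_position_entropy(ids: list[str]) -> list[float]:
    """Shannon entropy (bits) of each character position; ~4.0 is ideal for hex."""
    n = len(ids)
    length = len(ids[0])
    result = []
    for pos in range(length):
        counts = Counter(s[pos] for s in ids)
        result.append(-sum(c / n * math.log2(c / n) for c in counts.values()))
    return result

# In practice: ids = [line.strip() for line in open("sample_file") if line.strip()]
# Toy demo: first position is constant (0 bits), second is uniform over 4 symbols (2 bits).
print(per_position_entropy(["a0", "a1", "a2", "a3"]))
```

Low-entropy positions don't prove predictability on their own, but they tell you where to focus when reading the generator code.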

WhiteWinterWolf