
I'm researching XSS vulnerabilities in a web application that uses continuations. That means that for a given form, the URI the form data is posted to is different every time.

A first GET request displays the form, whose action points to a unique URI such as:

http://test.local/webapp/4b69615449222508116a1e562e1e0a458e4d6351.continue

Submitting the form then sends the POST request to that URI.

Is there any free (or better, open source) security scanner that understands continuations and is able to do the GET request before each POST request the fuzzer is trying to send?
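
To make the requirement concrete, here is a minimal Python sketch of the GET-before-POST sequence a scanner would have to repeat for every fuzzed submission. The form page URL, the "comment" field name and the regular expression are assumptions for illustration, not details of the actual application:

    # Minimal sketch: fetch the form, pull out the one-time .continue URI,
    # then POST the fuzzed value to it. The GET must be repeated for every
    # attempt, because each continuation URI is unique.
    import re
    import requests

    FORM_PAGE = "http://test.local/webapp/start"   # hypothetical page serving the form

    def submit_payload(payload):
        session = requests.Session()

        # 1. GET the form; the response embeds a fresh .continue action URI.
        html = session.get(FORM_PAGE).text

        # 2. Extract the unique continuation URI from the form's action attribute.
        match = re.search(r'action="(/webapp/[0-9a-f]+\.continue)"', html)
        if match is None:
            raise RuntimeError("no .continue action found in the form")
        action = "http://test.local" + match.group(1)

        # 3. POST the fuzzed value to the freshly obtained URI.
        return session.post(action, data={"comment": payload})

    response = submit_payload('<script>alert(1)</script>')
    print(response.status_code)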

Thomasleveil

2 Answers


Yes, most scanners can do this.

For example, Burp Suite supports macros, which let you configure a sequence of requests. The support is quite neat, as the various Burp tools (Scanner, Intruder, Repeater, etc.) make use of the macros transparently.

paj28

HP WebInspect may be able to do it with some serious tweaking of its configuration. Without the web application on hand to experiment with (or, for that matter, access to WebInspect at the moment, and having last worked with it a while back), the following is a rough guess at the configuration changes that might make it work:

  • Change the crawler mode from depth-first to breadth-first. This forces the crawler and auditor to retrace the path from the root of the scan (the start URL) to the page being requested. Be warned that this substantially increases the number of requests made and can really slow down the scan (so you may need to limit the scan to just that path). You may also need to couple it with the HTTP parsing settings by marking that URL segment as a 'state parameter'. This instructs WebInspect to recognize the URL and substitute the last-seen value for the one that was originally recorded (a concept sketch of that substitution appears at the end of this answer).
  • Use a workflow macro (again coupled with 'state parameter' parsing). This limits the scan to exactly what is recorded in the macro and so may avoid the need for the breadth-first crawler (and thus be faster).
  • Since I last worked with it, the 10.x releases have added workflows based on recorded user interactions rather than traffic replay. This may also help with continuations (since the scanner no longer navigates URLs but replays recorded user interactions with the web application); in fact, I'd try it first.

But with enough tenacity, I think it can be accomplished. Use the built-in traffic monitor or a proxy to watch what WebInspect is doing, see how your settings changes affect it, and confirm that it is using the latest value in the form URL.
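
As a rough illustration of the substitution a 'state parameter' setting performs (a sketch of the concept only, not WebInspect's implementation; the regular expression and URLs are assumptions), the idea is to remember the newest continuation token seen in responses and swap it into any request that still carries a stale recorded value:

    # Concept sketch of 'state parameter' substitution: remember the newest
    # continuation token observed in responses and substitute it into any
    # request that still carries an older, recorded value.
    import re

    CONTINUE_RE = re.compile(r"/webapp/([0-9a-f]{40})\.continue")

    class StateParameter:
        def __init__(self):
            self.latest = None

        def observe_response(self, body):
            # Track the newest continuation token the application handed out.
            match = CONTINUE_RE.search(body)
            if match:
                self.latest = match.group(1)

        def rewrite_request(self, url):
            # Replace a stale recorded token with the latest observed one.
            if self.latest is None:
                return url
            return CONTINUE_RE.sub("/webapp/%s.continue" % self.latest, url)

    state = StateParameter()
    state.observe_response('<form action="/webapp/4b69615449222508116a1e562e1e0a458e4d6351.continue">')
    print(state.rewrite_request("http://test.local/webapp/" + "a" * 40 + ".continue"))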

LB2