Do you know of any crawler/spider software that is able to work through an ASP.NET application? Not ASP.NET MVC, but the kind full of __doPostBack JavaScript functions on every link/button/change event.

If there is no such application, how do you start inspecting ASP.NET web sites? My approach is usually to search for all *.aspx files in the application folder and then check them one by one, seeing whether the page is reachable without parameters or trying to reach it through the site's GUI, which is a real time killer.
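
For reference, a minimal sketch of automating that one-by-one check, assuming the *.aspx paths have already been collected from the application folder (the base URL and page names here are hypothetical):

    import requests

    BASE_URL = "http://target.example/app"  # hypothetical target root

    # Paths gathered by listing *.aspx files in the application folder
    pages = ["Default.aspx", "Admin/Users.aspx", "Reports/Monthly.aspx"]

    for page in pages:
        # Try each page without parameters and record the raw result
        resp = requests.get(f"{BASE_URL}/{page}", allow_redirects=False)
        print(resp.status_code, page)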

  • When I want to perform penetration testing on an ASP.NET application, this is the first step. I think it is relevant to web application security to know how to test your application effectively. – bretik Nov 12 '10 at 05:36
  • Test what? Whether the page is reachable, or whether it is secure? If you mean secure, there are web scanners; check this thread (http://security.stackexchange.com/questions/130/what-are-some-good-website-security-scanning-solutions). Otherwise it belongs on Stack Overflow. – Mohamed Nov 12 '10 at 05:49
  • OK, I'm not searching for a security scanner. I want to enumerate all pages present in the web application so I can start testing them. But if there is a security scanner that can work through an ASP.NET web site, I'd love to know about it. – bretik Nov 12 '10 at 06:06
  • Check out @Rory's answer in this thread: http://security.stackexchange.com/questions/32/what-tools-are-available-to-assess-the-security-of-a-web-application/38#38 – Mohamed Nov 12 '10 at 06:16
  • I will check these tools. I've already tried about half of them; only a few are useful for creating the "site map". But I don't see any tool there that explicitly states it can handle ASP.NET. – bretik Nov 12 '10 at 06:32
  • I have found that the Netsparker Community or Pro editions work especially well on ASP.NET apps. – atdre Apr 16 '15 at 19:08

2 Answers


JavaScript links are a real problem for automated spidering. Personally, I tend to use Burp Suite, manually build up a page list by navigating the site, and then scan page by page.

In theory, the way to do it would be to use a spider that relies on a real browser engine to do the crawling (so some of the tools that "drive" a browser might be useful).
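
As a rough sketch of that idea, assuming Selenium WebDriver and a hypothetical start page, a real browser can execute the __doPostBack calls and record where each one leads:

    import time
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    START = "http://target.example/app/Default.aspx"  # hypothetical start page

    driver = webdriver.Firefox()
    driver.get(START)

    # Collect the postback targets first; clicking as we go would
    # invalidate the element references
    hrefs = [a.get_attribute("href")
             for a in driver.find_elements(By.TAG_NAME, "a")]
    postbacks = [h for h in hrefs if h and h.startswith("javascript:")]

    visited = set()
    for href in postbacks:
        driver.get(START)  # reset page state before each postback
        # Let the real browser engine evaluate the javascript: pseudo-URL
        driver.execute_script(href[len("javascript:"):])
        time.sleep(1)  # crude wait for the postback navigation to finish
        visited.add(driver.current_url)

    print(visited)
    driver.quit()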

Another option would be to get a JavaScript engine available in your programming language. One thing I saw that may be useful on that front is the rubyracer project, which gives access to a JavaScript engine from within Ruby. I've not tried it, but I was thinking it may be possible to use something like that to evaluate the JavaScript on form submissions and extract the resulting URL for the spider to follow.
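
Alternatively, for the common case you may not need a full JavaScript engine at all, since __doPostBack normally just submits the page's form with __EVENTTARGET filled in. A hedged sketch of replaying that by hand (the URL and control ID are hypothetical):

    import requests
    from bs4 import BeautifulSoup

    URL = "http://target.example/app/Default.aspx"  # hypothetical page

    session = requests.Session()
    soup = BeautifulSoup(session.get(URL).text, "html.parser")

    # Replay the hidden state fields (__VIEWSTATE etc.) that ASP.NET
    # expects to see on every postback
    data = {i["name"]: i.get("value", "")
            for i in soup.find_all("input", type="hidden")
            if i.has_attr("name")}

    # __doPostBack('target', 'arg') boils down to posting these two fields
    data["__EVENTTARGET"] = "ctl00$MainContent$lnkNext"  # hypothetical control
    data["__EVENTARGUMENT"] = ""

    resp = session.post(URL, data=data)
    print(resp.status_code, resp.url)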

  • @Rory, can you explain why JavaScript links are a real problem for spidering? – one Sep 06 '16 at 05:58
  • The main problem is that traditional spiders aren't JavaScript-aware, so they don't parse JavaScript code to find links. Say, for example, you have a page where submitting a form executes a function that decides where to send the user based on the form's contents. Unless the spider can fill in that form with valid data, execute the JavaScript code, and evaluate where the user would be sent, it can't complete the spidering process. – Rory McCune Sep 06 '16 at 14:11

Software quality engineers/testers don't rely on crawlers and spiders to find their bugs, and I don't believe that application security engineers/testers should either.

Instead, SQEs rely heavily on dev-testing frameworks (or test harnesses) such as Selenium RC / Bromine, Watir/WatiN/Watij, Sahi, HtmlUnit, or WebDriver. Some go for higher-end commercial QA tools such as HP QTP, IBM Rational Functional Tester, TestComplete, and Visual Studio Tester Edition 2010.
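
To make the contrast with crawling concrete, here is what a tiny harness-style check might look like through the WebDriver API mentioned above (the page and element IDs are illustrative, not from any real app):

    import unittest
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    class LoginFlowTest(unittest.TestCase):
        """Exercises one known flow explicitly rather than crawling for it."""

        def setUp(self):
            self.driver = webdriver.Firefox()

        def test_login_rejects_bad_password(self):
            d = self.driver
            d.get("http://target.example/app/Login.aspx")  # hypothetical page
            d.find_element(By.ID, "txtUser").send_keys("tester")
            d.find_element(By.ID, "txtPass").send_keys("wrong-password")
            d.find_element(By.ID, "btnLogin").click()  # triggers the postback
            self.assertIn("Login.aspx", d.current_url)  # still on login page

        def tearDown(self):
            self.driver.quit()

    if __name__ == "__main__":
        unittest.main()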

SQEs do not typically automate exercising an app's full execution flow, because:

  • Testing requires exploratory discovery, like a tourist in a city they have never visited, in order to scope conclusions and determine outcomes, especially when working to a time schedule (as many testers are)
  • Concepts such as equivalence classification (equivalence partitioning) can save testers hours of work by not having to repeat equivalent test cases over and over again. Security testing is a little different in that we have to test everything, but we certainly don't do this today, and we could use this technique, especially when time-boxed; see the sketch after this list
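
A minimal illustration of that equivalence-class idea, with invented input classes: one representative value per class plus the boundaries replaces brute-forcing the whole input range:

    # One representative per equivalence class, plus the boundary values,
    # instead of exercising every possible input
    def is_valid_age(age):
        return 0 <= age <= 130  # invented validation rule

    age_cases = {
        "below_range":    -1,    # invalid class
        "lower_boundary":  0,
        "typical_valid":   35,   # representative of the valid class
        "upper_boundary":  130,
        "above_range":     131,  # invalid class
    }

    for name, value in age_cases.items():
        print(f"{name:>15}: is_valid_age({value}) -> {is_valid_age(value)}")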

The problem with test automation is that apps change quickly, and the test harness must usually be modified continually to keep up with those changes. In Agile methodologies such as ICONIX, robustness tests are code-generated from domain models and sequence diagrams (usually in UML). There are certainly plenty of ways to automate the rebuilding of test cases during code churn and new builds, but more often than not this requires metaprogramming, as sketched below.
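
As a rough sketch of that metaprogramming point, test methods can be regenerated from a model on every run, so the suite follows the model through code churn (the page model and assertion here are invented placeholders):

    import unittest

    # A toy "domain model": pages and whether they should require a login.
    # In practice this would be generated from the real model or UML.
    PAGE_MODEL = {
        "Default.aspx": False,
        "Admin/Users.aspx": True,
        "Reports/Monthly.aspx": True,
    }

    def make_test(page, needs_auth):
        def test(self):
            # Placeholder assertion; a real harness would drive the app here
            self.assertIsInstance(needs_auth, bool, page)
        return test

    class GeneratedRobustnessTests(unittest.TestCase):
        pass

    # Rebuild one test method per page on every run, so the suite
    # tracks the model without hand editing
    for page, needs_auth in PAGE_MODEL.items():
        name = "test_" + page.replace("/", "_").replace(".", "_")
        setattr(GeneratedRobustnessTests, name, make_test(page, needs_auth))

    if __name__ == "__main__":
        unittest.main()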
