
My client has a small "Contact Us" form on every page of their website. They are adamant about not including CAPTCHA verification on these forms, to keep them easy to use, but I feel it is my responsibility to implement some type of security against brute-force attacks and the like. What are my options here, especially ones that involve no changes to the UI of the forms?

FYI, the client has ColdFusion 8 as their server-side scripting language, which is currently used to insert the data into the database. The solution doesn't have to be specific to ColdFusion, though. I'm looking for ideas here - not necessarily code snippets.

Eric Belair

4 Answers


This is a good summary article on CAPTCHA:

http://www.smashingmagazine.com/2011/03/04/in-search-of-the-perfect-captcha/

I think you are right not to use CAPTCHA. Research has shown that CAPTCHA can lower your conversion rates by 3%, and potentially by up to 30%. Even employing someone to filter submissions manually, or using Mechanical Turk, yourmaninindia.com, etc., may be cheaper and better for your customer experience.

Good passive measures:

  • The Roboo script. Demoed at Black Hat this year, it also gives you some DoS protection.

  • Adding a hidden honeypot field. Either leave it empty and reject any submission where a bot has filled it in, or use JavaScript to fill it in automatically on legitimate form submissions and reject any submission where it is missing; validate server-side either way.

  • Velocity measures, such as how fast the form is submitted.
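As a sketch of the honeypot and velocity ideas above, a server-side check might look like the following. The field names (`website`, `form_rendered_at`) and the two-second threshold are illustrative assumptions, not part of the original answer.

```python
import time

MIN_SECONDS = 2  # assumption: humans rarely submit a form this fast

def looks_like_bot(form, now=None):
    """Return True if the submission trips the honeypot or velocity checks."""
    now = time.time() if now is None else now
    # Honeypot: a hidden field real users never see; bots often fill it in.
    if form.get("website", "").strip():
        return True
    # Velocity: reject submissions arriving faster than a human could type.
    try:
        rendered_at = float(form["form_rendered_at"])
    except (KeyError, ValueError):
        return True  # a missing or garbled timestamp is itself suspicious
    if now - rendered_at < MIN_SECONDS:
        return True
    return False
```

In practice the rendered-at timestamp should be tamper-proofed (signed or stored server-side), since a bot can replay any value it is handed.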

None of these are perfect; they can cause problems for disabled users and can give you false positives. If you are still getting an unacceptable amount of spam or robot usage, then use the most popular free option, reCAPTCHA. Ideally, combine it with the passive steps above so that the CAPTCHA is only presented when you suspect a bot submission (i.e. an adaptive approach, not shown to every user every time). Yes, reCAPTCHA has been broken in research labs with success rates as high as 25% using OCR techniques, but it is still enough to deter dragnet bots seeking unprotected targets (you only have to run faster than the other guy running from the bear). You are also helping Google digitize the world's books, and until duolingo.com comes up with its next innovation, that is not a bad thing.
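The adaptive approach described here can be sketched as a tiny decision function. The check names and the one-flag/two-flag thresholds are hypothetical choices for illustration:

```python
def decide(checks):
    """Map passive-check results (name -> flagged?) to an action.

    Escalate to a CAPTCHA only when something looks suspicious, so
    normal users never see a challenge.
    """
    score = sum(1 for flagged in checks.values() if flagged)
    if score == 0:
        return "accept"        # nothing suspicious: no challenge at all
    if score == 1:
        return "show_captcha"  # one weak signal: challenge, don't block
    return "reject"            # multiple signals: treat as a bot
```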

Rakkhi
  • +1 for Roboo, which is a great tool. Requires nginx though – atdre Jun 01 '11 at 18:57
  • we use many of these techniques in Stack Exchange forms! However, I am curious about "Re-Captcha has been broken in research labs with as much success as 25%" -- do you have a citation for that? I searched and could not find anything on the web that matched. – Jeff Atwood Jun 01 '11 at 21:38
  • @jeff-atwood here you go: http://bitland.net/captcha.pdf – Rakkhi Jun 02 '11 at 08:34
  • There is some good stuff here, but also some bad stuff. The "hidden field" is nearly pointless; the only questionable value is in preventing drive-by brute force, and then only against bots that are not smart enough to handle hidden fields. Really, anything client-side... – AviD Jun 06 '11 at 16:50
  • The Roboo script is very interesting; it's the right direction - I was going to recommend basic throttling, but Roboo takes that a couple of steps further. (And, it turns out it's by a friend of mine... wasn't aware of this, so thanks!) – AviD Jun 06 '11 at 16:52
  • @Rakkhi: You might be interested to also know that reCAPTCHA, aka Google, now tells you which of the two words in the captcha is being authenticated; it's the one with the drop-shadow background, while the other word may be random text of any length - which makes no sense, and I believe is a new feature this year. Also, my understanding is that reCAPTCHA was having the same sort of failure ratios with the audio version (the little speaker button); though I can't recall where I saw the research paper breaking it. – blunders Jun 06 '11 at 17:51
  • @blunders thanks will keep that in mind for future use. – Rakkhi Jun 08 '11 at 22:10
  • Went with honeypot and it works excellently. The client is very happy. – Eric Belair Feb 08 '12 at 17:51
  • @EricBelair great outcome! – Rakkhi Feb 14 '12 at 21:42
  • +1 for the honeypot idea, I had never thought of that. – lynks Mar 13 '13 at 17:19

NoBot is an interesting FREE solution from Microsoft

Description

NoBot is a control that attempts to provide CAPTCHA-like bot/spam prevention without requiring any user interaction. This approach is easier to bypass than an implementation that requires actual human intervention, but NoBot has the benefit of being completely invisible. NoBot is probably most relevant for low-traffic sites where blog/comment spam is a problem and 100% effectiveness is not required.

NoBot employs a few different anti-bot techniques:

  • Forcing the client's browser to perform a configurable JavaScript calculation and verifying the result as part of the postback. (Ex: the calculation may be a simple numeric one, or may also involve the DOM for added assurance that a browser is involved.)

  • Enforcing a configurable delay between when a form is requested and when it can be posted back. (Ex: a human is unlikely to complete a form in less than two seconds.)

  • Enforcing a configurable limit to the number of acceptable requests per IP address per unit of time. (Ex: a human is unlikely to submit the same form more than five times in one minute.)

NoBot can be tested by violating any of the above techniques: posting back quickly, posting back many times, or disabling JavaScript in the browser.
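The "configurable delay" technique can be approximated server-side by signing the render timestamp so a bot cannot simply forge an old one. This is a sketch of the idea only, not NoBot's actual implementation; the secret key, minimum delay, and token format are all assumptions.

```python
import hashlib
import hmac
import time

SECRET = b"server-side secret"  # hypothetical key, never sent to the client
MIN_DELAY = 2.0                 # seconds; "a human is unlikely to be faster"

def issue_token(now=None):
    """Embed this in a hidden field when rendering the form."""
    now = time.time() if now is None else now
    ts = str(now)
    sig = hmac.new(SECRET, ts.encode(), hashlib.sha256).hexdigest()
    return f"{ts}:{sig}"

def postback_allowed(token, now=None):
    """Accept the postback only if the signed timestamp is genuine and old enough."""
    now = time.time() if now is None else now
    try:
        ts, sig = token.rsplit(":", 1)
    except ValueError:
        return False
    expected = hmac.new(SECRET, ts.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged timestamp
    return now - float(ts) >= MIN_DELAY
```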

makerofthings7

I believe that OWASP has a few resources for anti-automation.

For your particular scenario, I might recommend using some sort of token in every HTML link (including the contact form's submission URL) and writing code in your dynamic pages so that you can later check for a Referer header that matches the page carrying the previous token.

However, by doing so you could create a lot of issues. For example, HTTP header injection could occur. You of course have the problems that every dynamic page has, such as injection flaws, cross-site scripting, path leakage -- but you also add concurrency issues, exception handling problems, etc.

atdre
  • Requiring Referer headers is privacy-unfriendly and violates the HTTP spec, which says that browsers are free not to send the Referer and sites should continue to work. Also, a small fraction of firewalls or middleboxes remove Referer headers, so this may break for some users. I don't recommend requiring Referer headers. – D.W. Jun 03 '11 at 01:22

Don't bother with CAPTCHA; it's close to worthless.
Instead, implement a rational rate-throttling mechanism - i.e. limit the number of "contact us" requests that can arrive from a single IP address within a given amount of time, for sensible values of "number" and "amount of time". E.g. no more than 10 contact requests a minute, IF that makes sense - or 5 every hour, if THAT makes sense.

Of course you can't limit it TOO tightly, given the fact that IP addresses don't necessarily map one-to-one to users (DHCP, dynamic IP e.g. via dialup, corporate proxy, etc), so you need to leave some wiggle room there - but on the other hand, 2000 requests every minute is far too much.
You can't implement the throttle per session, since that is easy to reset.

Consider also implementing a global throttle for any address, with a much higher rate. This might cause some form of DoS, but then again I assume the availability of the contact form is not critical.
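The per-IP and global throttles described above can be sketched with in-memory sliding windows. The limits reuse the example numbers from this answer; a real multi-server site would keep these counters in shared state rather than process memory.

```python
import time
from collections import defaultdict, deque

PER_IP_LIMIT = 10    # max contact requests per IP per window
GLOBAL_LIMIT = 2000  # much higher cap across all addresses
WINDOW = 60.0        # seconds

_per_ip = defaultdict(deque)
_global = deque()

def allow(ip, now=None):
    """Return True if this request fits under both the per-IP and global caps."""
    now = time.time() if now is None else now
    for q in (_per_ip[ip], _global):
        while q and now - q[0] > WINDOW:
            q.popleft()  # drop requests that have aged out of the window
    if len(_per_ip[ip]) >= PER_IP_LIMIT or len(_global) >= GLOBAL_LIMIT:
        return False
    _per_ip[ip].append(now)
    _global.append(now)
    return True
```

Denied requests are not recorded, so a blocked client does not extend its own lockout; whether that is the right choice depends on how aggressive the bots are.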

AviD
  • Availability of the contact form is not critical, but availability of the websites in which it is contained, and even more importantly, the database that runs these and other websites IS critical. What do you mean by global throttle? Are you basically saying to not allow x number of total requests per minute/hour? – Eric Belair Jun 07 '11 at 04:13
  • @Eric, yes that is it exactly. The assumption being that you'd rather block the contact-us functionality than have other (email? service desk?) systems get over-flooded. – AviD Jun 07 '11 at 08:13