This morning we had a crawler going nuts on our server hitting our site almost 100 times per second.
We'd like to add protection against this.
I guess I'll have to use HttpLimitReqModule, but I don't want to block legitimate crawlers like Google/Bing/... How should I do it?
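From the limit_req docs, I'm guessing something like the sketch below might work: key the limit zone on the client IP, but map known crawler user agents to an empty key, since requests with an empty key are not counted against the zone. This is untested, and the bot list and zone name are just placeholders (also, User-Agent can be spoofed, so it's only a rough whitelist):

```nginx
http {
    # Known crawlers get an empty key and so bypass the limit;
    # everyone else is keyed by client IP.
    map $http_user_agent $limit_key {
        default                $binary_remote_addr;
        ~*(googlebot|bingbot)  "";
    }

    # 10 requests/second per key; requests with an empty key are not limited.
    limit_req_zone $limit_key zone=crawlers:10m rate=10r/s;

    server {
        location / {
            limit_req zone=crawlers burst=20 nodelay;
        }
    }
}
```

Does this look like the right approach, or is there a better way to whitelist the big search engines?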