[Oisf-users] Web aspirator detection

Justin Mitchell tcpandip at gmail.com
Thu Nov 3 13:11:04 UTC 2011


Yeah, I don't think an IDS is the tool of choice for addressing or combating
this kind of activity. Perhaps there is another piece of the puzzle we're
missing.

What are the User-Agents?
Are they not respecting your robots.txt? (A minimal robots.txt sketch is below.)
A firewall has already been mentioned (even iptables can handle this; see the
sketch below).
If you're using Apache, ModSecurity could address it.
Again, if you're using Apache, you might want to take a peek
at mod_bandwidth and mod_limitipconn.
You might also want to look into a reverse proxy such as Squid (or another
proxy with that capability).
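
On the robots.txt point, a minimal example that asks crawlers to slow down
(note that Crawl-delay is non-standard and only honored by some bots, so it
is a hint rather than enforcement):

    User-agent: *
    Crawl-delay: 10

On the iptables point, a rough, untested sketch using the hashlimit match to
drop sources that open new connections to port 80 faster than a chosen rate
(the port and the 60/minute threshold are only illustrative and would need
tuning for your traffic):

    iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
      -m hashlimit --hashlimit-name webcrawl --hashlimit-mode srcip \
      --hashlimit-above 60/minute -j DROP

And for mod_limitipconn, the usual pattern is a per-location cap on
simultaneous connections per client IP (the value of 10 is just an example):

    <IfModule mod_limitipconn.c>
      <Location />
        MaxConnPerIP 10
      </Location>
    </IfModule>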

And, yes, if you insist, an IDS signature could alert you on N connections
from a single source within a given timeframe (a rough sketch follows below).
However, this can be very taxing depending on your parameters.
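
Something along these lines could work in Suricata, counting inbound SYNs per
source IP with the threshold keyword (an untested sketch; the port, count,
seconds, and sid are placeholders to tune for your environment):

    alert tcp $EXTERNAL_NET any -> $HOME_NET 80 (msg:"Possible aggressive web crawler - high rate of new connections from one source"; flags:S; threshold: type both, track by_src, count 100, seconds 60; classtype:policy-violation; sid:1000001; rev:1;)

Using "type both" keeps alert volume down by firing at most once per interval
after the count is reached; "type threshold" would alert every time the count
is hit.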

On Thu, Nov 3, 2011 at 8:49 AM, Martin Holste <mcholste at gmail.com> wrote:

> > I'm looking for a way to detect web scraping (site ripping). I'm seeing a
> > lot of simultaneous connections from single IPs, which are crawling all our
> > web pages.
>
> That is very normal.  Web spiders from Google, Bing, Baidu, and
> thousands of others will continue to crawl pages, but it shouldn't
> cause a problem.  Why do you want to detect the web crawls?