[Oisf-users] Suricata logs perfectly... and now ?!

Martin Holste mcholste at gmail.com
Mon Feb 27 23:46:29 UTC 2012


> Nice to know, tomorrow I will check if they are enabled... Just one thing: our firewall blocks some subnets, so a TCP connection is never established from some of those IPs.
> Today I enabled the bot and bot-cc rule files and found a couple of hits I will investigate tomorrow. But they are based on IP traffic and, as far as I understand, don't need a TCP session, so they fire when a client tries to establish a connection without success, while content-based rules won't fire if the firewall blocks the traffic...

It is for this reason that I recommend NOT blocking IPs which are
known bot controllers.  It is a bit counter-intuitive, but blocking
those IPs only delays the discovery of infected hosts: if you do not
notice the blocks, the C2 domains will eventually resolve to
different, previously unknown IPs which are not blocked by your
firewall.  Ideally, a sinkhole solution is set up so that you can
prevent data from being posted to the C2 while still taking advantage
of the IDS rules.
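For example, a DNS sinkhole can be as simple as a BIND response-policy
zone (RPZ) entry.  This is only a sketch: the zone name, C2 domain, and
sinkhole hostname below are all placeholders to adapt.

```
; In named.conf: response-policy { zone "rpz.local"; };
;
; rpz.local zone file -- rewrite a known C2 domain (and its subdomains)
; to an internal sinkhole host, so infected clients still connect
; somewhere visible and reveal themselves to the IDS.
evil-c2.example.com.    CNAME  sinkhole.internal.example.com.
*.evil-c2.example.com.  CNAME  sinkhole.internal.example.com.
```

The point is that the lookup still succeeds, so the client proceeds to
"phone home" to a box you control instead of silently failing at the
firewall.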

> Yes, really! http.log is fantastic... and grep is not the right tool to handle 1.5 GB of daily http.log.

In order to pipe http.log into syslog (for ELSA), you will need to
configure rsyslog (the default on Ubuntu) or syslog-ng to send the
file.  This is simple, and I have an example (using Bro logs) on my
blog here:
http://ossectools.blogspot.com/2011/09/bro-quickstart-cluster-edition.html
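For reference, here is a rough rsyslog sketch using the imfile module
(legacy directive syntax).  The log path, tag, state-file name,
facility, and ELSA hostname are all assumptions; adjust them to your
setup.

```
# Load the text-file input module and tail Suricata's http.log
$ModLoad imfile
$InputFileName /var/log/suricata/http.log
$InputFileTag suricata_http:
$InputFileStateFile stat-suricata-http
$InputFileSeverity info
$InputFileFacility local6
$InputRunFileMonitor

# Forward everything on facility local6 to the ELSA receiver over TCP
local6.* @@elsa.example.com:514
```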


> So, now I have the logs, I will have a tool to analyze them and... what is missing? Knowledge about the attacks? Data correlation?
> For example: a user today triggered a rule about using a "bad" domain. I saw from http.log he/she was on Twitter, clicked a t.co link (Twitter's redirector), then passed through a couple of short-name domains, then a wear<somethingIdontremember> that redirected to www.google.com...
> Isn't it strange? How would you investigate such things? I tried to use wget and got the same results...

The next step I would recommend is setting up StreamDB.googlecode.com,
which is a pcap collector of sorts that I created for this purpose.
You will need to compile Vortex, but that should be straightforward.
See the documentation on getting started here:
http://code.google.com/p/streamdb/wiki/INSTALL .  StreamDB plugs into
ELSA (by configuring the pcap_url config variable), so that you can
get instant access to the full transcript of any Snort alert, HTTP
URL, or other log in two clicks, with no waiting.  That will let you
fully investigate these kinds of alerts that aren't clear at first.
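In the meantime, you can already reconstruct a redirect chain like the
one you describe from http.log itself, by grouping requests per client
IP and reading them in time order.  A minimal sketch; the line layout
matched here (and the sample lines) are assumptions based on a
simplified http.log format, so adjust the regex to your build's actual
output.

```python
import re
from collections import defaultdict

# Assumed (simplified) http.log line:
#   <timestamp> <host> [**] <uri> [**] <user-agent> [**] <client>:<port> -> <server>:<port>
LINE_RE = re.compile(
    r"^(?P<ts>\S+) (?P<host>\S+) \[\*\*\] (?P<uri>\S+) \[\*\*\] "
    r"(?P<ua>.*?) \[\*\*\] (?P<client>[\d.]+):\d+ -> (?P<server>[\d.]+):\d+$"
)

def requests_by_client(lines):
    """Return {client_ip: [(timestamp, host, uri), ...]} in log order."""
    chains = defaultdict(list)
    for line in lines:
        m = LINE_RE.match(line.strip())
        if m:
            chains[m.group("client")].append(
                (m.group("ts"), m.group("host"), m.group("uri"))
            )
    return chains

# Hypothetical sample lines standing in for a real http.log
sample = [
    "02/27/2012-10:00:01.000000 t.co [**] /Ab12Cd [**] Mozilla/5.0 [**] 10.0.0.42:51234 -> 199.59.148.12:80",
    "02/27/2012-10:00:02.000000 bit.ly [**] /xyz [**] Mozilla/5.0 [**] 10.0.0.42:51235 -> 69.58.188.39:80",
    "02/27/2012-10:00:03.000000 www.google.com [**] / [**] Mozilla/5.0 [**] 10.0.0.42:51236 -> 74.125.0.1:80",
]

for ts, host, uri in requests_by_client(sample)["10.0.0.42"]:
    print(ts, host + uri)
```

It won't tell you why the chain ends at www.google.com (that's what the
full payloads in StreamDB are for), but it makes the sequence of hops
obvious without grepping by hand.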

If you run into any problems, contact us on the ELSA list here:
https://groups.google.com/forum/?fromgroups#!forum/enterprise-log-search-and-archive
