Disabling Threading in Tcl8.5 in Debian


I’ve been spending the holidays upgrading some of my own servers. One of them is the Sguil server I use. Until now it ran Debian Squeeze, where you could still use tcl8.3, which has threading disabled. For Sguil, tcl threading needs to be disabled:

ERROR: This version of tcl was compile with threading enabled. Sguil is NOT compatible with threading.
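To check whether the tcl you currently have installed was built with thread support, you can ask tcl itself. A quick check, assuming tclsh8.5 is in your path; it prints 1 for a threaded build and 0 otherwise:

# echo 'puts [::tcl::pkgconfig get threaded]' | tclsh8.5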

This is a compile-time option in Tcl, and the Debian Wheezy packages have it enabled by default. Here are the steps to create your own tcl deb with threading disabled:

# apt-get install dpkg-dev
# apt-get install devscripts

Get the tcl8.5 source package and build deps:

# apt-get source tcl8.5
# apt-get build-dep tcl8.5
# cd tcl8.5-8.5.11/

Next, edit the debian/rules file to disable threading. Remove the line:

                      --enable-threads \
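If you prefer to script that edit, deleting the line with sed should work as well (a sketch; adjust if the rules file in your version looks different):

# sed -i '/--enable-threads/d' debian/rules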

Then, build the package:

# debuild -us -uc

And finally install the package:

# cd ..
# dpkg -i tcl8.5_8.5.11-2_amd64.deb
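Since a later apt-get upgrade may replace your custom build with the stock threaded package again, it can be worth putting the package on hold:

# echo "tcl8.5 hold" | dpkg --set-selections

After installing, the tclsh check from above should now print 0.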

I followed this guide at Debian Administration; it has some more detail on rebuilding debs.

Closing in on Suricata 1.4

I just made Suricata 1.4rc1 available with some pretty exciting features: unix socket mode and IP reputation.

Unix socket

First of all, Eric Leblond’s work on the Unix socket was merged. The unix socket work consists of two parts: the unix socket protocol implementation and a new runmode.

The protocol implementation is based on JSON messages over the unix socket. Eric will be fully documenting it soon. Currently the commands are limited to shutting down and getting some basic stats. This part isn’t very exciting yet, but the groundwork for many future extensions has been laid.

The part that is exciting right now is the unix socket runmode. What this does is start Suricata with all the rules and such, and then wait for commands on the unix socket. Each command is a pcap filename / log directory pair. The pcap is then inspected against the rules and the logs go into the supplied log directory. As this can easily be scripted (a python script is provided), it’s a very fast way to test your pcap collections, as the overhead of starting and stopping is skipped.
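For the curious, this is roughly what it looks like on the command line. Treat it as a sketch: the script is called suricatasc in the current tree, and the paths below are made up. First start Suricata in the unix socket runmode:

suricata --unix-socket -c /etc/suricata/suricata.yaml

Then, from another shell, connect with the provided python script and hand it pcap file / log directory pairs:

suricatasc
>>> pcap-file /data/pcaps/sample.pcap /data/logs/sample
>>> quit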

While this may initially appeal mostly to those of you doing sandnetting and malware analysis, where tens of thousands of pcaps are automatically processed every hour or day, I think this could grow into a feature for a wider audience as well. For example, I could see Sguil or Snorby, or pretty much any event manager with full packet capture support, adding an option to scan a pcap associated with an event again. Maybe against _all_ rules, instead of the tuned set running on the live sensors. Maybe you can re-inspect old sessions against the current rules this way to find hits on attacks that were 0-days at the time, etc.

I think there could be many possibilities.

IP Reputation

A slightly more polished version of the code I discussed here is now available in this release. It’s one of those things where it will be very interesting to see how people will put it to use.

Matt Jonkman just wrote up some of his ideas on the Emerging Threats mailing list. One of them is to amend weak rules with reputation data. If you have a signature that is prone to false positives, you probably disable it currently. But what if you combine it with reputation data? If the weak rule fires on a sketchy IP, it may be a more reliable alert.
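As a sketch of what that could look like in a rule: the iprep keyword matches on the reputation score of an IP in a given category. The category name, content match and threshold below are made up; they would come from whatever reputation feed and signature you combine:

alert tcp $EXTERNAL_NET any -> $HOME_NET any (msg:"weak rule, but source has a bad reputation"; flow:to_server; content:"something prone to false positives"; iprep:src,BadRep,>,50; classtype:bad-unknown; sid:9000001; rev:1;)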

We’ll see how this plays out.

1.4 final

We’re hoping that, if nothing big happens, we can do a mid-December 1.4 final release. So please consider running this new release. It’s running very stably in quite a number of places: ISP networks, lab networks, home networks, sandnetting networks, etc. But we need much more testing to find issues and/or gain confidence that we have found the most important ones. Thanks for helping out!

Upgrading Sguil 0.7.0 to 0.8.0 from CVS

Sguil 0.8.0 was recently released, so it was time for an upgrade. Since I remembered the last major upgrade being quite a bit of work, I wasn’t looking forward to this one. However, to my surprise it was a breeze. Here is what I did.

On my Sguild server called “owl” — I’d like to think it’s very wise — I first went to my sguil directory, where the CVS checkout lives. There I did a “cvs up”. Next it was time to upgrade the database:

owl:/usr/local/sguil/server/sql_scripts# ./update_0.8.tcl
This script is used for upgrading from Sguil Version 0.7.x
to Sguil Version 0.8.x only.

Use these scripts at your own risk. Be sure to back
up your data before proceeding!!
Do you want to continue? (y/N) y
Database password: ***
Enter path to sguild.users file: /etc/sguild/sguild.users
Connecting to database…Success.
Trying to use database sguildb…Success.
Sguild DB Versions: 0.13
Migrating your password file (/etc/sguild/sguild.users) to the database:
Updating user name sanc
Updating user name victor
Success.

** Finished. The DB has been upgraded. **

owl:/usr/local/sguil/server/sql_scripts#

After this I started the sguild process and it came up just fine. Next I did a “cvs up” on all sensors and restarted the agents. A final “cvs up” on my desktop and sguil.tk was updated as well.

The total upgrade took me only a few minutes and I’ve not encountered any regressions. Impressive work by Bamm and his team!

Extracting bad URLs from ModSecurity events in Sguil

Running a PHP-based blog, I see a lot of attempts to include code hosted elsewhere in requests. A long time ago I added a simple rule to block one type of these attempts. A typical attempt looks like this:

GET /blog/category/index.php?page=http://www.djrady.ru/includes/conf.txt?? HTTP/1.1

Notice the trailing question marks? It turns out these are always present, so they are very easy to block on. I’ve been doing that for a long time now and have never seen a single false positive. The rule looks like this:

SecRule ARGS:/.*/ "https?.*\?$" "msg:'LOCAL PHP ? link code inclusion attempt',severity:1,phase:1"

This rule looks at all request args and checks whether their value contains http or https and ends with a question mark. If so, the request is blocked.
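To quickly check that the rule fires, you can replay a request like the one above against your own host (hypothetical URL; assuming your default action is set to deny, you should get a 403 back):

curl -i "http://www.example.org/blog/category/index.php?page=http://www.example.net/conf.txt??"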

Today I was thinking that the URIs that are included probably contain some badness, and that it would be interesting to see what they all are. Since I’m adding all ModSecurity events to Sguil using modsec2sguil, this was going to be an interesting MySQL challenge!

The query I came up with is this:

SELECT COUNT(*) AS cnt,
       INET_NTOA(src_ip) AS "Source IP",
       TRIM(LEADING "=" FROM SUBSTRING_INDEX(SUBSTR(UNHEX(data_payload), LOCATE('=http', UNHEX(data_payload))), '?', 1)) AS url
FROM event
INNER JOIN data ON event.sid = data.sid AND event.cid = data.cid
WHERE timestamp >= '2009-01-13'
  AND signature LIKE "MSc 403 LOCAL PHP ?%"
GROUP BY src_ip, url
ORDER BY cnt DESC
LIMIT 10;
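As an aside, if you want to run this regularly, e.g. from cron, you can save the query to a file and let the mysql client handle it. This assumes the query is saved as bad_urls.sql and your credentials live in ~/.my.cnf so no password prompt is needed:

mysql sguildb < bad_urls.sql > bad_urls_$(date +%F).txt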

The result is in the screenshot below:

Bad URIs from Sguil (screenshot)

I get about 10 URLs like this a day, and usually they are tried more than once. So what is at these links? The first one gave a 404, so let’s look at the second one. It’s a jpg, and that’s a picture, right? Wrong!

I downloaded the file and opened it in vim. As you can see in this fragment, this is php code…

Bad URI code (screenshot)

Does anyone know of a place I can report these URLs to on a daily/weekly basis?

SidReporter beta2 released

A little over a week ago the second beta of SidReporter from Emerging Threats was released (see http://www.emergingthreats.net/content/view/95/1/). I’ve been working with Matt Jonkman to set up this new project at Emerging Threats, mostly by writing the reporter scripts. I think it’s an exciting new project that could provide the community with great information. As Matt wrote in the initial announcement:

“As mentioned a few weeks ago, we’ve been working to bring out our tool to anonymously report IDS/IPS hits. Similar to DShield’s firewall log reporting, we believe we can make some incredible data inferences with this information, as well as help improve the quality of our signatures while giving us all feedback to tune our rulesets.

But that’s just the start. As with DShield’s data, I think we’ll run into benefits to the community that we can’t even imagine until we start to look at the data.”

The next step for the reporter is adding support for getting the events from Sguil. Expect to see that soon!

Update to Modsec2sguil

Yesterday the much anticipated Sguil 0.7.0 final was released, as announced here. I’ve updated Modsec2sguil to support it. In addition, Ryan Cummings sent me a patch adding support for ModSecurity 2.5, so that is included as well. I haven’t given it much testing yet, but it works on my boxes.

Get the new release here: http://www.inliniac.net/modsec2sguil/

Thank you Ryan for your contribution!

Deactivating a group of sensors in Sguil 0.7.0-CVS

Recently a site I was using for my Vuurmuur project became unavailable to me. I had two sensors at that site: a Modsec2sguil sensor and a Snort sensor. Since the site is gone for me, the sensors are all offline and will stay that way. So I wanted to hide them in Sguil, including the net_name group they belonged to, called ‘utrecht’.

Doing this turned out to be quite simple. The sensors have their own table in the database and one of the fields for a sensor is called ‘active’. I figured deactivating the sensors would do it. Deactivating all sensors from the net_name group ‘utrecht’ is done like this:

mysql> UPDATE sensor SET active="N" WHERE net_name="utrecht";

After this, the net_name ‘utrecht’ disappeared from the Sguil client ‘Select Network(s) to Monitor’ screen. However, the ‘Agent Status’ tab in the Sguil client still showed the deactivated agents. This was solved by restarting the Sguil server. So now my ‘Agent Status’ list is clean again!
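Should the site ever come back, reactivating works the same way, and the current state of all sensors is easy to check. Column names below are as in my 0.7.0 schema, so verify them against your own database:

mysql> SELECT hostname, agent_type, net_name, active FROM sensor;
mysql> UPDATE sensor SET active="Y" WHERE net_name="utrecht";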

Sguil 0.7.0 CVS client on HeX 1.0.1

The last few days I’ve been playing with the HeX live-cd. It boots fine on my Lenovo T60 laptop. So after about a minute a nice graphical interface awaits me. I really love the artwork of this project.

There are many security tools installed, including the Sguil client. However, this is the 0.6.1 version. As I have written before, I’m running 0.7.0 CVS here, so I needed the 0.7.0 CVS client. Luckily, it’s easy to install.

^^analyzt@raWPacket ~ ->
[HeX]$ cvs -d:pserver:anonymous@sguil.cvs.sourceforge.net:/cvsroot/sguil login
Logging in to :pserver:anonymous@sguil.cvs.sourceforge.net:2401/cvsroot/sguil
CVS password:
cvs login: warning: failed to open /home/analyzt/.cvspass for reading: No such file or directory
^^analyzt@raWPacket ~ ->
[HeX]$ cvs -d:pserver:anonymous@sguil.cvs.sourceforge.net:/cvsroot/sguil co sguil
cvs checkout: Updating sguil
U sguil/README

U sguil/web/lib/geoip.inc
^^analyzt@raWPacket ~ ->
[HeX]$

Before starting the client, remove the sguil.conf in /home/analyzt/, or change the SGUILLIB setting in it. It took me quite some time and help to figure this out; many thanks to Bamm Visscher and David Bianco! One last thing before starting the client: remove /home/analyzt/.sguilrc. If I didn’t, I got an error when logging in: “unable to write preferences to /home/analyzt/.sguilrc”. The fonts also looked very ugly and trying to change the font resulted in an error as well.

When starting the client, entering sguil/client and simply issuing ./sguil.tk doesn’t work:

^^analyzt@raWPacket ~ ->
[HeX]$ ./sguil.tk
exec: wish: not found
^^analyzt@raWPacket ~ ->
[HeX]$

The solution to this is simple: just explicitly call sguil.tk with the installed wish:

^^analyzt@raWPacket ~ ->
[HeX]$ /usr/local/bin/wish8.4 sguil.tk

Then it works. Next I will try to make sure all of this survives a reboot, and figure out how to enable the wireless LAN.
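To avoid typing the full wish path every time, a small wrapper script helps. A sketch, with paths as used in the HeX session above:

#!/bin/sh
# launch the Sguil 0.7.0 CVS client with the wish that HeX ships
cd /home/analyzt/sguil/client && exec /usr/local/bin/wish8.4 sguil.tk "$@"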

Sguil 0.7-CVS client on Ubuntu Gutsy

Last week I installed Ubuntu Gutsy on my laptop. I did a clean install, which went fine. Of course, I needed the Sguil client on it as well. Gutsy has all the required libraries in its repositories. Install the following packages:

tcl8.4
tclx8.4
tcllib
tk8.4
iwidgets4
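Or, all in one go (assuming you use sudo):

sudo apt-get install tcl8.4 tclx8.4 tcllib tk8.4 iwidgets4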

Checking out the Sguil client is easy (make sure you have ‘cvs’ installed):

cvs -d:pserver:anonymous@sguil.cvs.sourceforge.net:/cvsroot/sguil login
cvs -d:pserver:anonymous@sguil.cvs.sourceforge.net:/cvsroot/sguil co sguil

After this the client runs fine on my system.