[NTLUG:Discuss] A little tidbit some might find useful...

Darin W. Smith darin_ext at darinsmith.net
Wed Feb 12 16:14:30 CST 2003


I run a small website on my machine, which sits behind a little firewall 
(port 80 is forwarded to my machine), and use dynamic DNS so I can access it 
all the time.  I also run some of the "default" Mandrake firewalling on the 
web-server box, just to be even safer.  One thing I've noticed is that due 
to the number of Windoze boxen on AT&T's broadband network, and the fact 
that they really don't care about reducing the amount of CodeRed and Nimda 
*still* running around out there, an awful lot of junk shows up in my 
Apache access_log as those worms keep probing for an NT/IIS setup to 
infect.  They really clutter up the logs.  Worse, they make your box 
generate 404 pages, which burns small amounts of CPU and bandwidth. 
 To combat that, I initially tried putting the following in my base 
.htaccess file:
redirect /scripts http://www.stoptheviruscold.invalid
redirect /MSADC http://www.stoptheviruscold.invalid
redirect /c http://www.stoptheviruscold.invalid
redirect /d http://www.stoptheviruscold.invalid
redirect /_mem_bin http://www.stoptheviruscold.invalid
redirect /msadc http://www.stoptheviruscold.invalid
RedirectMatch (.*)/cmd\.exe$ http://www.stoptheviruscold.invalid$1

which I had snarfed off of a newsgroup a while back.
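
(As an aside: a quick way to check that redirects like these are actually 
firing is to fake a worm-style request from another box.  The hostname 
below is just a placeholder for your own server:

curl -sI http://www.example.com/scripts/root.exe

If the redirect is working, you should get back a 302 with a Location: 
header pointing at the bogus site, rather than a 404 from your own 
server.)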

Unfortunately, you still get all the log pollution.  So I found a script a 
guy had written on Mac OS X that snarfed the IPs of the infested hosts out 
of access_log and blocked all traffic from them at the firewall.  That 
seemed a bit extreme, though, since it would never re-open access to hosts 
that were no longer infected.

So I took a slightly more refined approach, though it could still be 
improved.  With a bit more work it could be made more elegant than my 
"flush" method of re-opening everything and just rebuilding the list (see 
the sketch after the script below), but this really seems to work well for 
me, so I figured I'd pass it on...

Here is a little script I wrote to use iptables to block traffic from those 
sites:

#!/bin/sh
# block_worms.sh - a script to search apache logs for evidence of
# certain worms accessing webserver and block all http traffic
# from the source ip.  If run regularly, and the access_log is
# rotated regularly, it will dynamically add to the list of
# blocked hosts as it is run, but it must re-build the entire
# list every time the access_log file is rotated.  Thus, it is
# best to run this script much more often than you rotate access_log.
# It is also good to rotate access_log more often than once a month.
# I run this script hourly, and rotate access_log once a week.
# Copyright (c) Darin W. Smith, 2003
# You may redistribute and use this as you see fit...I don't
# guarantee it for anything, but it seems to work for me...
#
# Not the "smartest" solution, but one that works.  access_log has
# shrunk by about 70% in my case.
#
# I install this script in /usr/local/sbin
#
# I run this script hourly, with the following in root's crontab:
# 15 * * * * /usr/local/sbin/block_worms.sh
#
#
# Supports finding CodeRed and Nimda
# 2/4/03 - dws - added support for detecting scans for Sun Cobalt servers

if [ -f /var/log/httpd/worm_infested_hosts ] ; then
	/bin/rm -f /var/log/httpd/worm_infested_hosts
fi

/bin/egrep -i "(cmd\.exe|root\.exe|default\.ida|_vti_bin|cobalt-images)" \
	/var/log/httpd/access_log | /bin/awk '{print $1}' | /bin/sort -n | \
	/usr/bin/uniq > /var/log/httpd/worm_infested_hosts

/sbin/iptables -F wormsites

while read host
do
	/sbin/iptables -A wormsites -s "$host" -j DROP
	echo "Adding $host to list of blocked sites..."
done < /var/log/httpd/worm_infested_hosts

exit 0
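
As an illustration of the more elegant approach I mentioned above, here is 
a rough, untested sketch of an incremental variant.  Instead of flushing 
the chain every run, it only adds hosts that aren't already in the saved 
list (the .new scratch file name is just something I made up).  You would 
still want to clear the saved list and flush the chain whenever access_log 
is rotated, to keep the re-opening behavior:

#!/bin/sh
# block_worms_incremental.sh - hypothetical variant of block_worms.sh
# that only adds newly-seen hosts instead of flushing the wormsites
# chain on every run.

NEW=/var/log/httpd/worm_infested_hosts.new
OLD=/var/log/httpd/worm_infested_hosts

# Same search as before, but with a plain lexical sort, because comm
# below needs both of its inputs sorted the same way.
/bin/egrep -i "(cmd\.exe|root\.exe|default\.ida|_vti_bin|cobalt-images)" \
	/var/log/httpd/access_log | /bin/awk '{print $1}' | \
	/bin/sort -u > $NEW

# Make sure the old list exists on the first run.
[ -f $OLD ] || /bin/touch $OLD

# comm -13 prints only the lines unique to the new list.
/usr/bin/comm -13 $OLD $NEW |
while read host
do
	/sbin/iptables -A wormsites -s "$host" -j DROP
	echo "Adding $host to list of blocked sites..."
done

/bin/mv $NEW $OLD
exit 0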


To use the script, I put the following lines at the end of my 
/etc/rc.d/rc.local file:

#dws - custom firewall rules to block Nimda, CodeRed, and Cobalt-scans
iptables -N wormsites
iptables -I INPUT -p tcp --dport 80 -j wormsites
iptables -I INPUT -p udp --dport 80 -j wormsites
/usr/local/sbin/block_worms.sh
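
Once that's in place, you can sanity-check the setup with standard 
iptables commands, for example:

# See the per-host DROP rules the script has added so far
/sbin/iptables -L wormsites -n

# Confirm that INPUT is actually handing port-80 traffic to the chain
/sbin/iptables -L INPUT -n | grep wormsites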


-- 
D!
Darin W. Smith
AIM: JediGrover




