In an article on his web site, Paul Graham discusses filters that fight back: spam filters that simply fetch every URL contained in a spam message, artificially increasing the traffic on the spammers' sites and therefore their cost of operation. Graham says:
"...following all the urls in a spam would have an amusing side-effect. If popular email clients did this in order to filter spam, the spammer's servers would take a serious pounding. The more I think about this, the better an idea it seems. This isn't just amusing; it would be hard to imagine a more perfectly targeted counterattack on spammers."
I decided to implement a rudimentary version of that idea with a Perl script triggered by a cron job.
On my main Linux box running Gentoo, I am using Eric Raymond's bogofilter to filter out spam. This program puts all the spam I receive in a maildir named "spam". The script parses the bodies of these e-mails, extracts all the URLs they contain, and fetches them, creating "dummy traffic" for those sites. If thousands of people do the same, spammers will become the "victims" of an uncoordinated distributed denial-of-service attack (uDDoS).
The Perl script is:
#! /usr/bin/perl -w
use strict;

use Mail::Box::Manager 2.00;
use LWP::Simple;

# Open the spam folder
my $mgr    = Mail::Box::Manager->new;
my $folder = $mgr->open
  ( "/root/Mail/spam"
  , extract => 'ALWAYS'   # always load the message bodies
  );
die "Cannot open '/root/Mail/spam': $!\n"
    unless defined $folder;

# Process all messages in this folder.
foreach my $message ($folder->messages)
{
    # Extract every http URL appearing in an href attribute.
    my @match = ($message->decoded =~ /\bhref="(http[^">]+)"/gi);
    foreach my $match (@match)
    {
        print $match, "\n";
        # Fetch the page; only the request matters here,
        # so the returned content is simply discarded.
        my $content = get($match);
    }
}

$folder->close;
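To trigger the script periodically, a crontab entry along these lines can be used. This is only a sketch: the path /root/bin/fightback.pl is an assumed location, so adjust it to wherever you saved the script (and make it executable with chmod +x).

```
# Run the fight-back script every night at 3:00 am
0 3 * * * /root/bin/fightback.pl
```

Running it once a day keeps the "dummy traffic" steady without hammering your own connection; a shorter interval would only increase the effect.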