At 2006-10-10 07:16 AM -0700, Pico is rumored to have said:
> Maybe instead of deleting the page by replacing the text with the word
> "delete" we could replace the text with "honeypot" or something like
> that. It might be nice if those pages displayed a warning to human
> authors that this page is a honeypot and if you edit it you will be
> automatically blocklisted.
I don't think that a "honeypot" (aka "spam trap" on mail servers) is a
winning proposition for a wiki (see reasons below).
However, if you are convinced that the spammers are robots and that they
ignore the existing page content, then why not make it completely
explicit? As in:

WARNING: anything you post on this page will be added to this site's
blocklist. This page is a trap for spammers.
I would not want to automate adding items posted on the trap page to the
blocklist, for a few reasons:

1) A malicious individual could simply post a series of valid URLs,
poisoning the blocklist.
2) A lot of spam includes valid URLs.
3) Blocking the posting IP is, in my experience, the least effective
blocking method, and it is prone to overkill if the address belongs to a
proxy.
4) The best method I have found is blocking on URL fragments, usually
produced by dropping the subdomain portion (see the sketch below).
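
To illustrate point 4, here is a minimal sketch of fragment-based URL
blocking in Python. The blocklist entries, function name, and example
URLs are all hypothetical, not taken from any particular wiki engine:

    import re

    # Hypothetical blocklist of URL fragments. Each entry is the domain
    # with the subdomain portion dropped, so "www.spam-example.com" and
    # "mail.spam-example.com" both match one "spam-example.com" entry.
    BLOCKED_FRAGMENTS = ["spam-example.com", "pills-example.net"]

    URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

    def post_is_blocked(text):
        """Return True if any URL in the post matches a blocked fragment."""
        for match in URL_RE.finditer(text):
            host = match.group(1).lower()
            for fragment in BLOCKED_FRAGMENTS:
                # Match at the end of the hostname so any subdomain is caught.
                if host == fragment or host.endswith("." + fragment):
                    return True
        return False

    # One fragment catches every subdomain variant the spammer rotates through:
    assert post_is_blocked("buy at http://www.spam-example.com/pills")
    assert post_is_blocked("buy at http://mail.spam-example.com/pills")
    assert not post_is_blocked("corporate info at http://www.eton.ca/")

The point of dropping the subdomain is that a single fragment keeps
matching while the spammer rotates through www., mail., shop., and so on.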
Spam traps and IP blocks do work on mail servers, for reasons that cannot
be duplicated in the wiki world:

1) The sending domain should have DNS MX records. If it does not, block
(see the sketch after this list).
2) Spam is often sent to large CC or BCC lists. If one of the addresses
in the list is a spam trap, refuse all the mail.
3) There are other "sanity checks" that can be done on the inbound mail:
SPF records, DomainKeys, etc.
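
As a sketch of check 1, here is how the MX lookup could be done with the
dnspython library (assuming dnspython 2.x; the domain and the reply text
are placeholders, and a real server would also handle lookup timeouts):

    import dns.resolver  # pip install dnspython

    def sender_domain_has_mx(domain):
        """Return True if the domain publishes at least one MX record."""
        try:
            answers = dns.resolver.resolve(domain, "MX")
            return len(answers) > 0
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer,
                dns.resolver.NoNameservers):
            return False

    # Policy from item 1 above: no MX record means the mail is refused.
    if not sender_domain_has_mx("example.com"):
        print("550 sender domain has no MX record - blocked")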
--
Neil Herber
Corporate info at http://www.eton.ca/