Wiki-Spammers (was: [Pmwiki-users] pmwiki.org - version 1 - vandalised)

Steffen Glückselig
Thu Jan 13 14:11:11 CST 2005


Hello,

> My idea was to combine a captcha with the X unapproved external links
> concept. So you'd only have to decipher the captcha if you post more
> than X links.
Then a page with many links would have to be approved on every edit?
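
For illustration, here is a minimal sketch of such a threshold check (in Python rather than PmWiki's PHP; the limit, the whitelist of approved hosts, and the function name are all assumptions for this sketch, not anything from PmWiki):

    import re

    LINK_LIMIT = 3                                   # assumed threshold X
    APPROVED_HOSTS = {"pmwiki.org", "example.org"}   # assumed whitelist

    URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

    def needs_captcha(text: str) -> bool:
        """Require a captcha only when the edit adds more than
        LINK_LIMIT links to hosts outside the approved list."""
        hosts = URL_RE.findall(text)
        unapproved = [h for h in hosts if h.lower() not in APPROVED_HOSTS]
        return len(unapproved) > LINK_LIMIT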

>> I think the solution using a blacklist is quite convenient for the
>> wiki-user (he does not notice it at all, usually) and works quite well.
> I don't like blacklisting, because the chance that you hit real users is
> rather high. E.g. I get a dynamic IP on every connect, and since
> I am connected to one of Germany's biggest providers I get a lot of
> different IPs.
I was not referring to blacklists of IPs but to blacklists of words that
must not appear in the text. Such a list could be extended with IPs,
though, just like in the Blacklist script I was referring to.
One could use more sophisticated mechanisms, such as scores over the
words in the blacklist.
Instead of, or in addition to, a blacklist one could use Bayesian
networks[1] that learn which texts can be allowed to be saved and which
should be rejected.
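
As a rough illustration of the learning idea, here is a tiny naive-Bayes-style
classifier; it is only a sketch, with add-one smoothing, whitespace
tokenization, and omitted class priors all chosen arbitrarily, not a
production spam filter:

    import math
    from collections import Counter

    def tokenize(text):
        return text.lower().split()

    class NaiveBayes:
        def __init__(self):
            self.counts = {"spam": Counter(), "ham": Counter()}
            self.totals = {"spam": 0, "ham": 0}

        def train(self, text, label):
            # Count token occurrences per class ("spam" or "ham").
            for token in tokenize(text):
                self.counts[label][token] += 1
                self.totals[label] += 1

        def spam_probability(self, text):
            # Log-space scores with add-one smoothing to avoid zero counts;
            # class priors are left out for brevity.
            scores = {}
            for label in ("spam", "ham"):
                total = self.totals[label] or 1
                vocab = len(self.counts[label]) + 1
                scores[label] = sum(
                    math.log((self.counts[label][t] + 1) / (total + vocab))
                    for t in tokenize(text))
            # Turn the two log scores into a probability.
            m = max(scores.values())
            exp = {k: math.exp(v - m) for k, v in scores.items()}
            return exp["spam"] / (exp["spam"] + exp["ham"])

Edits scoring above some probability could then be held back or rejected,
and the filter retrained as spam and legitimate edits accumulate.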

> In addition it might not even be too difficult to connect with some
> spoofed IPs from a somewhat different network...
You are right. That is why I would not rely too much on blacklisted IPs.
Such lists must be cleared regularly and are therefore only suitable for
short-term use.
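
One way to make such a list clear itself is to give every entry a
time-to-live; a small sketch (the one-week TTL is an arbitrary assumption):

    import time

    TTL_SECONDS = 7 * 24 * 3600  # assumed: entries expire after one week

    class ExpiringIPBlacklist:
        def __init__(self, ttl=TTL_SECONDS):
            self.ttl = ttl
            self.entries = {}  # ip -> time it was blacklisted

        def add(self, ip):
            self.entries[ip] = time.time()

        def is_blocked(self, ip):
            added = self.entries.get(ip)
            if added is None:
                return False
            if time.time() - added > self.ttl:
                del self.entries[ip]  # stale entry, drop it
                return False
            return True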


best regards
Steffen

[1] http://the.taoofmac.com/space/Bayesian


