[pmwiki-users] search robot indexing

Robin robin at kallisti.net.nz
Wed Jan 26 08:48:53 CST 2005


On Thursday 27 January 2005 03:29, Hans Bracker wrote:
> I found a bit of info in Cookbook-v1/MetaTag.
> i suppose it is to prevent search-bots from indexing edit and history
> pages. Is the following relevant for pmwiki 2?
afaik, PmWiki 2 puts the noindex,nofollow robots meta tag on the diff and edit 
pages automagically. I've seen one poorly behaved search engine ignore it, but 
that bot was badly set up anyway; it didn't even send a complete identifier.
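If you ever need to force it by hand (say, for other actions), something like 
this in local/config.php should do it. Note that $MetaRobots is the variable 
name I remember from the 2.0 betas, so check your pmwiki.php before relying on 
it:

  # assumed variable name -- verify that $MetaRobots exists in your pmwiki.php
  if ($action == 'edit' || $action == 'diff')
    $MetaRobots = 'noindex,nofollow';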

> I wonder too if there could be a page in the cookbook about
> PreventingWikiSpamming ? There seem to be various approaches, and
> nothing explaining the differences or a preferred route.
If you use PmWiki 2, there is the urlapprove script, which does two things:
# puts an upper limit on the number of links allowed in a post (most wiki spam 
is a huge number of links, although I have seen exceptions)
# keeps any URL that doesn't match an approved pattern from being automatically 
linked; the admin has to approve the link before it becomes clickable.
I have the first one set to 15, and I don't use the second one. I don't get 
enough spam for it to be a problem, and what I usually do get is dealt with 
by the first setting.
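In case it helps, turning it on looks roughly like this in local/config.php. 
The variable names below are from memory, so double-check them against 
scripts/urlapprove.php before copying anything:

  include_once("$FarmD/scripts/urlapprove.php");
  # reject posts with more than 15 not-yet-approved links
  # (variable name from memory -- check scripts/urlapprove.php)
  $UnapprovedLinkCountMax = 15;
  # patterns for URLs that get linked without needing approval;
  # example.org is just a placeholder here
  $WhiteUrlPatterns = array('http://(www\\.)?example\\.org/.*');

I haven't used $WhiteUrlPatterns myself (that's the second feature), so treat 
that last line as illustrative only.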
I don't think PmWiki 1 has an equivalent to the urlapprove script; PmWiki 1 
wasn't quite flexible enough for that.

-- 
Robin <robin at kallisti.net.nz>             JabberID: <eythian at jabber.org>

Hostes alienigeni me abduxerunt. Qui annus est?

PGP Key 0xA99CEB6D = 5957 6D23 8B16 EFAB FEF8  7175 14D3 6485 A99C EB6D