[pmwiki-users] Understanding Robots control

Tegan Dowling tmdowling at gmail.com
Wed Nov 14 08:15:02 CST 2007


From material way down in the Discussion area on
http://www.pmwiki.org/wiki/Cookbook/ControllingWebRobots, I gather
that the default treatment of off-site links is to instruct robots not
to follow them.  None of my wikis is open to editing by unauthorized
authors, so I have no problem with link-spamming (aside from an
occasional entry in a comment box).  I do want the sites linked to
from mine to get the search-engine benefit of those links, so I want
to be sure robots are allowed to follow them.
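
For context, here's what I understand the stock definition to be (I'm
going from memory of pmwiki.php, so treat the exact line as an
approximation rather than gospel):

<?php
# My best recollection of the core default; SDV() is PmWiki's
# set-default-value helper, redefined here only so the snippet
# stands alone outside a wiki install.
function SDV(&$var, $val) { if (!isset($var)) $var = $val; }

SDV($UrlLinkFmt,
    "<a class='urllink' href='\$LinkUrl' rel='nofollow'>\$LinkText</a>");
echo $UrlLinkFmt, "\n";   # prints the format, nofollow included

It's that rel='nofollow' attribute that tells search engines not to
give the linked site any credit for the link.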

Hagan Fox added the following to the page back in June:
---------------
For PmWiki 2.2, here's something you can use if you want to allow
robots to follow links to external sites and avoid wasting bandwidth
by having robots blindly follow links to unimportant wiki pages.

# Remove the default "rel='nofollow'" attribute for external links.
$UrlLinkFmt = "<a class='urllink' href='\$LinkUrl'>\$LinkText</a>";
# Eliminate forbidden ?action= values from page links returned to robots.
$EnableRobotCloakActions = 1;
---------------
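
I don't fully follow the second half, but my rough mental model of
what $EnableRobotCloakActions does is the standalone sketch below.
This is NOT the actual core code, and the real $RobotPattern list is
much longer; it's just how I picture the behavior:

<?php
# Sketch only: a shortened stand-in for PmWiki's $RobotPattern.
$RobotPattern = 'googlebot|slurp|msnbot|crawl|spider';
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (preg_match("!$RobotPattern!i", $ua)) {
  # Robots get page links with the ?action= part stripped, e.g.
  #   pmwiki.php?n=Main.HomePage&action=diff
  # becomes plain pmwiki.php?n=Main.HomePage, so crawlers don't
  # waste bandwidth on edit/diff/print views.
  $link = 'pmwiki.php?n=Main.HomePage&action=diff';
  echo preg_replace('/[&?]action=\w+/', '', $link), "\n";
} else {
  echo "not a robot: links served unmodified\n";
}

If that picture is wrong, corrections welcome.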

My questions:
1) I'm running 2.1.27.  Will Hagan's code work on my sites?
2) That page is a big mess.  Could someone who actually understands it
please rework it?  Removing the old Q&A, refactoring the answers into
the content, that sort of thing?


