[pmwiki-users] Google local site search
design at softflow.co.uk
Wed Dec 28 14:15:37 CST 2005
Tuesday, December 27, 2005, 12:48:20 AM, Patrick wrote:
> Newer versions of PmWiki (since 2.1.beta8) automatically return
> "403 Forbidden" errors to robots for any action other than
> ?action=browse, ?action=rss, or ?action=dc. However, over the
> years Google has built up a fairly large cache of PmWiki pages,
> so they'll likely continue to appear in Google's search results
> until they are somehow expired from Google's database.
Can this method be extended to parameters other than ?action= ?
It would be nice to have an array to which a skin author can add
skin-specific parameters, like ?setskin=, ?setcolor=, ?setlayout=,
so that any links carrying these parameters return the same
"403 Forbidden" error to robots as the undesired actions do.
And if $EnableRobotCloakActions is set, such links would be cloaked
in the same way, if that is possible.
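In the meantime, something close to this could probably be done from
local/config.php. The sketch below is only an illustration, not part of
PmWiki: the array name $RobotForbidParams is hypothetical, and it assumes
PmWiki's $RobotPattern variable is available for user-agent matching, as
in the core robot handling.

```php
<?php
## Hypothetical sketch for local/config.php -- not PmWiki core code.
## $RobotForbidParams is an invented name; skin authors would append
## their own parameters to it.
$RobotForbidParams = array('setskin', 'setcolor', 'setlayout');

## Assumes $RobotPattern (PmWiki's robot-detection regex) is set.
if (preg_match("!$RobotPattern!i", @$_SERVER['HTTP_USER_AGENT'])) {
  foreach ($RobotForbidParams as $p) {
    if (isset($_GET[$p])) {
      ## Turn robots away with the same status the action check uses.
      header("HTTP/1.1 403 Forbidden");
      exit();
    }
  }
}
```

A skin could then add its own parameters with something like
$RobotForbidParams[] = 'setwidth'; before this check runs.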
In fact, should robots not be allowed to follow only those links with
the specific permitted actions (?action=browse, ?action=rss, or
?action=dc), and be turned away from any other links carrying query
parameters, not just other actions (with the exception of ?n=pagename)?