[Pmwiki-users] robots
John Rankin
john.rankin
Thu Jan 8 16:24:22 CST 2004
A very helpful tip!
Are there any cases where one *doesn't* want this behaviour?
If not, I'd like to see the code added to PmWiki itself.
JR
On Friday, 9 January 2004 6:43 AM, Christian Ridderström <chr at home.se> wrote:
On Thu, 8 Jan 2004, bram brambring (zonnet) wrote:
> Hi,
>
>
> I have been looking into:
>
> http://www.pmichaud.com/wiki/PmWiki/Robots
>
> and used the rules described there to exclude the action and search links.
> However, some search engines still listed the ?action links in their results.
>
> I talked to the owner of one of them, and he pointed out that wildcards
> are not part of the robots exclusion standard. Google does support
> wildcards, but he is right.
>
> Each Disallow entry must be a complete path starting with '/'.
>
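> For illustration (just a sketch, the path is made up), a compliant
> entry can only be a literal URL prefix:
>
>     User-agent: *
>     Disallow: /pmwiki/pmwiki.php
>
> but a prefix like that also blocks the pages themselves, and there is
> no portable way to single out the ?action= part at the end of the URL.
>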
> I solved it in the local.php, are there nicer solutions?
>
> if ( $action and ( $action!='browse' ) ) {
> $robots="noindex,nofollow";
> } else {
> $robots="index,follow";
> }
>
> $HTMLTitleFmt = array (
> "<title>\$Titlespaced</title>",
> "<META CONTENT=\"$robots\" NAME=\"robots\">");
>
Thanks for the tip! (I reformatted it like this and it seems to work)
#
# Taking care of robots: for anything other than plain page browsing
# (edit, diff, search, ...) tell robots not to index or follow.
#
if ($action and $action != 'browse')
  $robots = "noindex,nofollow";
else
  $robots = "index,follow";

# Emit the robots meta tag right after the page title in the HTML head.
$HTMLTitleFmt = array("<title>\$Titlespaced</title>",
  "<META CONTENT=\"$robots\" NAME=\"robots\">");
I'd also noticed in the robots.txt documentation that wildcards aren't
allowed, but had forgotten about it.
/Christian
--
Dr. Christian Ridderström, +46-8-768 39 44 http://www.md.kth.se/~chr
_______________________________________________
Pmwiki-users mailing list
Pmwiki-users at pmichaud.com
http://pmichaud.com/mailman/listinfo/pmwiki-users_pmichaud.com