[pmwiki-users] Google: noindex, nofollow error 403

pmwiki at 911networks.com pmwiki at 911networks.com
Mon Apr 19 15:14:51 CDT 2010


I'm having a problem with a website: Google is reporting 591 errors across 250 pages,
and they are all similar. I display some SQL source code with GeShi,
and Google then gets an error 403 on the "get code" link attached to
each sourceblock, such as:


Is there a way of solving this, either through a nofollow on each of
these sourceblock links or by stopping the crawler at robots.txt?
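As a sketch of the robots.txt route: assuming the sourceblock links carry a
query parameter like ?action=sourceblock (hypothetical here, since the actual
link URL isn't shown above), something like the following might work. Note
that wildcard patterns are a Googlebot extension, not part of the original
robots.txt standard, so other crawlers may ignore it:

User-agent: *
Disallow: /*action=sourceblock

This only stops well-behaved robots from *requesting* those URLs; it doesn't
affect how they treat links they have already found.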

I have found an old email from PM saying:


For PmWiki, this means we cannot use the robots standards to advise a
robot "don't request pages with ?action=edit" or "don't request any
url that ends with 'RecentChanges'".  Thus, robots that are
"well-behaved" with respect to the above standards are still going to
make requests that we would prefer they avoid.  Thus the best we can
do is intercept the requests as quickly as possible when they do
occur and deny them before they consume more resources than they
already have.  This is what PmWiki does by default.

Now then, advising a robot what it may request, and advising a
robot what to do with the contents of a response we send back 
are two different things.  The best example of this is 
<meta name='robots' content='noindex' />, which tells
search engine spiders that they should not index the contents
of the current page, but it's okay to follow the links on
the page to other resources (unless excluded by robots.txt).


I can't add a <meta name='robots' content='noindex' /> for the
sourceblock, because it's not a separate page.
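Since the sourceblock isn't a page of its own, one alternative sketch is to
send the same advice as an HTTP response header (X-Robots-Tag) instead of a
meta tag, e.g. in an Apache .htaccess. This assumes Apache with mod_rewrite
and mod_headers enabled, and again assumes a ?action=sourceblock query
parameter, which is a guess since the real URL isn't shown:

# Mark sourceblock responses noindex/nofollow via an HTTP header
# (assumed query parameter: action=sourceblock)
RewriteEngine On
RewriteCond %{QUERY_STRING} action=sourceblock
RewriteRule .* - [E=NOINDEX_SB:1]
Header set X-Robots-Tag "noindex, nofollow" env=NOINDEX_SB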

Or, how can I globally disable the $[GetCode] link?

