search engine beware! Was: Re: [pmwiki-users] notice of current edit

Joachim Durchholz jo at
Fri Apr 15 12:20:05 CDT 2005

Radu wrote:

> At 10:03 AM 4/15/2005, Joachim Durchholz wrote:
>>> Since no sane individual can see two different pages in the same
>>> second,
>> Then count me among the insane. (Well, that might be accurate
>> actually *ggg*)
>> For example, sometimes I right-click large bundles of links for
>> "open in new tab"; the click rate is about 2 or 3 per second.
> Wow, you're fast. So you open them for editing first, then start
> editing later? Or simultaneously?

It's not that common that I edit all these pages, but it can happen.
Though I'd be hard pressed to actually start editing more than one page
per second - those were click rates for retrieving them, not editing.

Still, Pm's observation holds: sometimes, a bunch of packets is held up,
and edit requests started at different times come in at the same second
on the server. And I do occasionally edit multiple pages at a time - for
example if they all touch on the same subject.
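To make the same-second point concrete, here is a minimal sketch (hypothetical, not PmWiki's actual code): if the "this page is being edited" notice is keyed only on a one-second timestamp, two edit requests landing in the same second are indistinguishable, while adding the editor's name to the key keeps them apart.

```python
# Hypothetical sketch of a "being edited" notice key. The names
# notice_key_coarse / notice_key_fine are illustrative, not PmWiki APIs.
def notice_key_coarse(page, t):
    return (page, int(t))            # second granularity: same-second edits collide

def notice_key_fine(page, t, editor):
    return (page, int(t), editor)    # editor name disambiguates same-second edits

t = 1113578405.1                     # two requests 0.4 s apart, same second
assert notice_key_coarse("HomePage", t) == notice_key_coarse("HomePage", t + 0.4)
assert notice_key_fine("HomePage", t, "Radu") != notice_key_fine("HomePage", t + 0.4, "Joachim")
```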

> OK, then my idea won't work for you. As I mentioned this was for a
> Project Management context, where you probably know who's working and
> you could refrain from opening several pages per second.

Dunno - I can easily imagine scenarios where multiple pages need editing
at the same time. Not sure about a concrete scenario in project
management, but I'm pretty sure somebody will come up with one, given
enough time and practice.

What I'm pretty sure of is that there are occasions where people want to
*save* several pages, with as little latency between saves as possible.
Considerations like keeping a set of pages consistent play a role here.
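One way to keep a set of saves close together in time, sketched here as a hypothetical approach (not anything PmWiki does): write every new revision to a temporary file first, then rename them all in a tight loop, so the window in which readers can see a mixed old/new set shrinks to the rename loop itself.

```python
import os
import tempfile

# Hypothetical sketch: stage all page revisions, then publish them in a
# fast rename loop. os.rename is atomic per-file on POSIX, so only the
# loop itself separates the saves.
def save_pages_together(pages, directory):
    staged = []
    for name, text in pages.items():
        fd, tmp = tempfile.mkstemp(dir=directory)
        with os.fdopen(fd, "w") as f:
            f.write(text)                      # stage the new revision
        staged.append((tmp, os.path.join(directory, name)))
    for tmp, final in staged:                  # minimal latency between saves
        os.rename(tmp, final)
```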

>> Web crawlers already have counteracted that measure. wget, for 
>> example, has options to set arbitrary intervals when in 
>> "suck-the-site" mode.
> Hmmmm... That's a toughie. So you're right, no need to go to these 
> lengths. I suppose the project site would be passworded anyway, so 
> crawlers won't trip the "This page is being edited" note.

If it's password-protected, crawlers aren't an issue anyway.
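For reference, the wget behaviour mentioned above is its documented retrieval throttling; a sketch of a recursive "suck-the-site" run (example.org stands in for a real site):

```shell
# wget's recursive mode with throttled intervals. --wait pauses between
# retrievals; --random-wait varies that pause (roughly 0.5x to 1.5x of
# --wait) so the interval itself is not a fixed, detectable signature.
wget --recursive --level=2 --no-parent \
     --wait=5 --random-wait \
     http://example.org/wiki/
```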

In general, I think crawlers are far less of a problem than spammers.


More information about the pmwiki-users mailing list