[pmwiki-users] How much data can a wiki page take? (Was: Bibliographies)
wiemann at ddz.uni-duesseldorf.de
Wed Sep 13 03:07:03 CDT 2006
Patrick R. Michaud wrote:
> On Tue, Sep 12, 2006 at 04:27:05PM +0200, Bernd Wiemann wrote:
>> Hello Patrick,
>> is there a way to prevent the pageindex file from indexing the whole
>> page? Something like $pageindex=20000 or action?=search.summary?
> One can always turn the page index off completely, but I'm not
> sure that works for what you're wanting.
>> My wiki on localhost has about 3000 pages of articles and stories:
>> 60 MB of data plus 10 MB for the pageindex.
>> I ask because searches run out of time (only the first time, but
>> localhost has too many first times...)
> Once the pageindex is built it shouldn't take that long. Unless,
> of course, you're having to clear the pageindex a lot.
> At any rate, I'm not sure what sorts of limits you're wanting
> to impose on the pageindex; I'd think that if it's only indexing
> part of the page then it might not be of much use for searches and
> the like. But we could try to limit the amount that it indexes.
> Are you just trying to reduce the size of the pageindex file, or
> to increase the speed of searches, or ...?
Both - and, third, to solve the problem of homonyms.
AFAIK Google indexes only the first 50 KB of a page, whatever its
real size... In all likelihood there are no fundamentally new words
or topics worth searching for beyond that point.
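To illustrate the idea of a capped index, here is a minimal sketch of indexing only the first 50 KB of each page, the limit attributed to Google above. This is not PmWiki's actual pageindex code; the function names and the byte cap are illustrative assumptions.

```python
# Sketch: build a word index from only the first part of each page.
# The 50 KB cap mirrors the Google limit mentioned above; nothing
# here reflects PmWiki's real pageindex implementation.
import re

INDEX_LIMIT = 50 * 1024  # hypothetical cap, in bytes, per page


def index_terms(text, limit=INDEX_LIMIT):
    """Return the set of lowercase words found in the first `limit` bytes."""
    head = text.encode("utf-8")[:limit].decode("utf-8", errors="ignore")
    return set(re.findall(r"\w+", head.lower()))


# Words past the cap never enter the index, shrinking it for long pages.
pages = {"Main.Story1": "Once upon a time " * 10000}
index = {name: index_terms(body) for name, body in pages.items()}
```

A real implementation would also have to decide whether a search miss on a capped page counts as "not found" or triggers a fallback full-text scan.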
Within the limits of a short introduction, finding the right match is
much easier than in a complete article with all its filler.
The most important information is normally
- summarised in an abstract, or
- given in a list of keywords.
Analogous to (:linebreaks:) / (:nolinebreaks:), I could
imagine something like (:pageindex:) / (:nopageindex:).
This construct implies that searches are constrained
to the pageindex only.
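The proposed directive pair could work like this sketch, which strips everything between a (:nopageindex:) marker and the next (:pageindex:) marker before indexing. The directive names are the suggestion from this message, not real PmWiki markup, and the implementation is a hypothetical illustration.

```python
# Sketch: honour (:nopageindex:) ... (:pageindex:) regions when
# deciding what text gets indexed. Hypothetical - these directives
# do not exist in PmWiki; the names follow the proposal above.
import re


def indexable_text(markup):
    """Remove regions excluded from indexing.

    Text after (:nopageindex:) is dropped until (:pageindex:)
    re-enables indexing, or until the end of the page.
    """
    return re.sub(
        r"\(:nopageindex:\).*?(?:\(:pageindex:\)|\Z)",
        " ",
        markup,
        flags=re.S,
    )
```

With this rule, an author could wrap the abstract and keyword list in indexable text and exclude the body, which addresses both the index size and the homonym problem at once.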
BTW, PmWiki is a much better PIM than AskSam ever was,
but best of all is the possibility to discuss with the
mastermind behind it. Thanks so much.