[pmwiki-users] Trouble with .pageindex when too much _new_ data to index (+ sqlite)

ABClf languefrancaise at gmail.com
Fri Feb 27 15:38:58 CST 2015


Peter,

I'm still (desperately) fighting with PmWiki (I have tried MediaWiki,
which turns out to be a crazily inflationary monster: give it 100 MB of
data and it will turn them into 400 MB in its complicated database; a
heavy thing to work with, in the end!).

For information, my last attempt was to upload a fresh PmWiki, the
sqlite recipe and my configuration, then upload the sqlite database
(100 MB) to my website (1&1 shared host), and finally request the
Site.Reindex page.

I'm getting the following sqlite-recipe-related error at the very
beginning (just after the count was done; nothing was created in
wiki.d):

DEBUG: count=173687

Fatal error: Call to a member function fetchAll() on a non-object in
/homepages/18/d269604285/htdocs/dev4/cookbook/sqlite.php on line 403
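
For context: with PHP 5's default PDO error mode, a failed PDO::query()
returns false instead of a PDOStatement, and calling fetchAll() on that
false value produces exactly this fatal error. Below is a minimal sketch
of a defensive check, assuming the recipe talks to SQLite through PDO;
the query, the database file name and the Abort() error path are
illustrative guesses, not the recipe's actual code:

===(snip)===
<?php
# illustrative only -- not the actual cookbook/sqlite.php code
$sql = "SELECT name FROM pages";                 # hypothetical query/table
$db  = new PDO("sqlite:wiki.sqlite3");           # assumed database file
$res = $db->query($sql);
if ($res === false) {                            # query failed: no statement object
    list($state, , $msg) = $db->errorInfo();
    Abort("sqlite query failed ($state): $msg"); # Abort() is PmWiki's error helper
}
$rows = $res->fetchAll(PDO::FETCH_ASSOC);        # safe: $res is a PDOStatement here
===(snip)===

If line 403 of sqlite.php calls fetchAll() on the result of a query, a
check like this would at least surface the underlying SQLite error
message instead of dying with a fatal error.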

Gilles.



2015-01-29 14:03 GMT+01:00 Peter Bowers <pbowers at pobox.com>:

>
> On Wed, Jan 28, 2015 at 10:10 PM, ABClf <languefrancaise at gmail.com> wrote:
>
>> Is there anything that can be done with the native search engine to
>> keep it from failing each time the amount of new data is too big?
>
>
> Try reindexing via multiple reloads of the page Site.Reindex, using
> the following as the contents of Site.Reindex.php:
>
> ===(snip)===
> <?php
>
> # NOTE: For this to work, wiki.d/.reindex must be deleted before the
> # reindex is first started. Then simply reload this page repeatedly
> # until you get the message that the reindexing process is complete.
>
> #global $PageIndexTime, $WikiDir, $FarmD;
> include_once("$FarmD/scripts/stdconfig.php");
> include_once("$FarmD/scripts/pagelist.php");
>
> SDV($ReindexFile, "$WorkDir/.reindex");
> #echo "DEBUG: Attempting to delete $ReindexFile<br />\n";
> #unlink($ReindexFile);
>
> set_time_limit(120);   # allow this request up to 120 seconds
> $PageIndexTime = 60;   # let PageIndexUpdate() spend up to 60s indexing
> $fp = @fopen($ReindexFile, "r");
> if (!$fp) { // no .reindex worklist yet - build it from scratch
>     echo "DEBUG: A<br />\n";
>     $pagelist = $WikiDir->ls();   # every page in wiki.d/
>     sort($pagelist);
>     file_put_contents($ReindexFile, implode("\n", $pagelist)); # save work queue
>     fixperms($ReindexFile);
> } else { // we are assuming .pageindex has been created in order
>     echo "DEBUG: B<br />\n";
>     $pagelist = explode("\n", file_get_contents($ReindexFile));
>     $lastpage = '';
>     $ifp = @fopen($PageIndexFile, 'r');
>     if ($ifp) {
>         # each .pageindex line begins with "PageName:", so track the
>         # highest-sorting page already indexed
>         while (!feof($ifp)) {
>           $line = fgets($ifp, 4096);
>           while (substr($line, -1, 1) != "\n" && !feof($ifp))
>             $line .= fgets($ifp, 4096);
>           $i = strpos($line, ':');
>           if ($i === false) continue;
>           $n = substr($line, 0, $i);
>           if ($n > $lastpage) $lastpage = $n;
>           else break;
>         }
>         fclose($ifp);
>         for ($i = 0; $i < sizeof($pagelist); $i++)
>             if ($pagelist[$i] >= $lastpage) break;
>         # resume just after the last page already in .pageindex; if the
>         # loop ran off the end, everything is already indexed
>         if ($i < sizeof($pagelist) && $pagelist[$i] == $lastpage)
>             $pagelist = array_slice($pagelist, $i+1);
>         else
>             $pagelist = array_slice($pagelist, $i);
>     }
> }
> echo "DEBUG: count=".count($pagelist)."<br />\n";
> if (!count($pagelist)) {
>     echo "Indexing complete. Deleting $ReindexFile<br />\n";
>     if (file_exists($ReindexFile)) {
>         fixperms($ReindexFile);
>         unlink($ReindexFile); // for some reason this gives an error on Windows
>     }
> }
> PageIndexUpdate($pagelist);
> ===(snip)===
>
> I've only done a bare minimum of testing, so there will probably be
> issues. If you get it working, it may be worthwhile to package it as a
> recipe for others who may want an explicit reindex.
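>
> One way this could be wired up (a sketch only, untested, and not part
> of the instructions above): PmWiki reads local/config.php on every
> request, so the script could be included from there only when the
> Site.Reindex page is requested. ResolvePageName() is the standard
> PmWiki idiom for per-page conditionals in config.php; the file
> location is an assumption:
>
> ===(snip)===
> <?php
> # in local/config.php -- load the reindex script only for Site.Reindex
> $pagename = ResolvePageName($pagename);      # standard PmWiki idiom
> if ($pagename == 'Site.Reindex')
>     include_once("local/Site.Reindex.php");  # assumed script location
> ===(snip)===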
>
> -Peter
>



-- 

---------------------------------------
| A | de la langue française
| B | http://www.languefrancaise.net
| C | languefrancaise at gmail.com
---------------------------------------
       @bobmonamour
---------------------------------------