[pmwiki-users] Include an external pmwiki page that is accessible via a URL.

Bo Peng ben.bob at gmail.com
Sat Jan 17 11:13:13 CST 2009

On Sat, Jan 17, 2009 at 10:16 AM, Bo Peng <ben.bob at gmail.com> wrote:
>> 1.  Set up a svn repository for the core documents.
>> 2.  On the PmWiki site, install the "ImportText" recipe.
>> 3.  On the PmWiki site, check out a copy of the svn repository
>>    and set the ImportText recipe to import from that directory.
>> 4.  Set a cron job or some other mechanism to have the svn
>>    directory perform "svn update" and retrieve the latest
>>    pages from the svn repository.
> The only problem is that sourceforge.net does not seem to support cron
> jobs. Maybe pmwiki can do something like wget/rsync/svn update for me?
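One way around the missing cron support might be a lazy update, run from
ordinary page views instead of a scheduled job. A minimal sketch, assuming
a checked-out svn directory on the server; MaybeSvnUpdate and the
timestamp-file convention are made-up names, not part of PmWiki:

```php
<?php
// Hypothetical sketch: run "svn update" at most once per $freq seconds,
// triggered by page views rather than cron. Not an existing PmWiki API.
function MaybeSvnUpdate($dir, $freq = 3600) {
  $stamp = "$dir/.last-update";                 // records last attempt time
  $last = file_exists($stamp) ? (int)file_get_contents($stamp) : 0;
  if (time() - $last < $freq) return false;     // updated recently enough
  file_put_contents($stamp, time());            // stamp first, then update
  shell_exec('svn update ' . escapeshellarg($dir));
  return true;
}
```

A recipe could call this once per request; at most one visitor per hour
would pay the cost of the svn update.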

I have very little experience with php/pmwiki. After reading the
ImportText recipe
(http://www.pmwiki.org/pmwiki/uploads/Cookbook/import.php), I wonder
whether something like the following could be implemented:
# place to put cached pages
$importCache = 'importedURL';
# how often (in seconds) to re-read these URLs
$URLImportFreq = 3600;

The recipe would keep track of the content of each URL and the last
time each URL was imported, as ImportText does.

In a page, something like
(:importURL http://blah1.net/page1.txt:)

would trigger an update, retrieving the content of the URL whenever
the cached page does not exist or is out of date.
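The caching logic described above could look roughly like this. This is
only a sketch of the idea, not working recipe code; ImportURL and the
md5-named cache files are assumptions of mine, and the wiring into
PmWiki's markup engine is omitted:

```php
<?php
// Hypothetical sketch of the (:importURL ...:) idea.
$importCache = 'importedURL';   // directory for cached copies
$URLImportFreq = 3600;          // re-read a URL at most once per hour

// Fetch $url into the cache, re-reading only when the cached copy is
// missing or older than $freq seconds; return the cached content.
function ImportURL($url, $cacheDir, $freq) {
  @mkdir($cacheDir, 0755, true);
  $file = $cacheDir . '/' . md5($url);          // one cache file per URL
  if (!file_exists($file) || time() - filemtime($file) >= $freq) {
    $text = @file_get_contents($url);           // wget-like retrieval
    if ($text !== false) file_put_contents($file, $text);
  }
  return file_exists($file) ? file_get_contents($file) : '';
}
```

If the remote server is unreachable, the stale cached copy is served
rather than nothing, which seems like the right behavior for this use.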

Is this conceptually doable, and potentially useful?


More information about the pmwiki-users mailing list