pmwiki-2.0.13: [pmwiki-users] Problem with indirect download of large files
scheibi at gmail.com
Fri Nov 11 01:57:07 CST 2005
2005/11/9, Daniel Scheibler <scheibi at gmail.com>:
> Hello Patrick,
> 2005/11/9, Patrick R. Michaud <pmichaud at pobox.com>:
> > On Fri, Nov 04, 2005 at 01:35:46PM +0100, Daniel Scheibler wrote:
> > > Hello,
> > >
> > > I've found a little problem with indirect downloads.
> > >
> > > I uploaded a large file (>50 MB), say xyz.zip, via FTP into the
> > > upload directory of a wiki page. Then I write Attach:xyz.zip into this
> > > wiki page.
> > >
> > > Clicking the resulting link causes an error message in my webserver's log file like:
> > > PHP Fatal error: Allowed memory size of 16777216 bytes exhausted
> > > (tried to allocate 10240 bytes) in /path/to/pmwiki/scripts/upload.php
> > > on line 169
> > There are times when PHP really stinks. Apparently PHP's readfile
> > function is reading the entire attachment file into memory before
> > sending it to the browser. This is horribly wasteful of memory and
> > system resources (and, as you've discovered, it breaks when the
> > file size exceeds the memory limit).
> I think so, too.
> > I'll rewrite PmWiki to read and send the file in smaller chunks,
> > which will resolve this problem.
> Sounds like a good solution.
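The chunked approach Patrick describes could look roughly like this. This is only a minimal sketch, not the actual code in scripts/upload.php; the function name SendFileInChunks and the 4096-byte chunk size are assumptions for illustration:

```php
<?php
// Sketch: send a file to the client in small chunks instead of calling
// readfile(), which can load the whole file into memory at once.
// SendFileInChunks is a hypothetical name, not PmWiki's actual function.
function SendFileInChunks($filepath, $chunksize = 4096) {
  $fp = @fopen($filepath, 'rb');   // open for binary reading
  if (!$fp) return false;
  while (!feof($fp)) {
    echo fread($fp, $chunksize);   // emit one chunk at a time
  }
  fclose($fp);
  return true;
}
```

With this shape, only $chunksize bytes need to be held in memory per iteration, so the download is no longer limited by PHP's memory_limit.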
After installing pmwiki-2.0.13 and testing the download of a 24 MB file, I receive
[Fri Nov 11 08:45:17 2005] [error] [client >>IP<<] PHP Fatal error:
Allowed memory size of 16777216 bytes exhausted (tried to allocate
10240 bytes) in /path/to/pmwiki/scripts/upload.php on line 171,
in my webserver's error log file.
The new approach of reading the file in 4096-byte chunks doesn't solve the
problem, because the whole file still accumulates in the output ("echo") buffer.
Adding a flush() inside the while loop sends me a 0-byte file instead.
At the moment I don't have time to look for a solution; in a few weeks
that will be better ;-)
New ideas are welcome.
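One possible direction, offered only as a sketch: if PHP's output buffering is what accumulates the whole file, then disabling any active output buffers before streaming, and flushing the system buffer after each chunk, might avoid both symptoms. The function name StreamFileFlushing is hypothetical; ob_get_level(), ob_end_flush(), and flush() are standard PHP functions:

```php
<?php
// Sketch: stream a file while keeping PHP's output buffers empty, so the
// whole file never accumulates in memory. StreamFileFlushing is a
// hypothetical name, not code from upload.php.
function StreamFileFlushing($filepath, $chunksize = 4096) {
  // End any active output buffers first; otherwise echo just fills them.
  while (ob_get_level() > 0) ob_end_flush();
  $fp = @fopen($filepath, 'rb');
  if (!$fp) return false;
  while (!feof($fp)) {
    echo fread($fp, $chunksize);
    flush();                       // push this chunk to the client now
  }
  fclose($fp);
  return true;
}
```

Whether this interacts cleanly with the webserver's own buffering (e.g. Apache with mod_gzip/mod_deflate) would still need testing.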
Daniel Scheibler ========:}    student at BTU Cottbus/Germany
eMail: scheibi at gmail.com