[pmwiki-users] Too many pmwiki.php processes
ABClf
languefrancaise at gmail.com
Wed Feb 24 08:27:38 CST 2016
From my naive point of view, 2500 visitors (or page views) a day,
serving light pages, should be easily handled by a PmWiki
website.
Of course, a robot requesting too many pages in a short time might be a
problem. For robots, you can try to adjust robots.txt (Crawl-delay?).
You might try to adjust the REVISIT-AFTER meta tag as well.
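For illustration only (the values are placeholders, and many crawlers ignore
both hints), the crawl delay goes in robots.txt and the revisit hint in the
skin's HTML head:

  # robots.txt -- ask compliant bots to pause between requests
  User-agent: *
  Crawl-delay: 10

  <!-- in the page <head>; advisory only, most engines ignore it -->
  <meta name="revisit-after" content="7 days">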
Are your websites getting more visitors now than before? Have you
reached some limit? Is your host restrictive?
No matter what, to my mind, FastCache is still a relevant extension to
set up. Its purpose is to relieve the processor and speed up page
delivery. Very useful if most of your visitors are not editors.
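As a rough sketch only (the file name and the cache-directory variable below
are assumptions, not the recipe's documented settings; see the FastCache page
on the PmWiki Cookbook), the recipe is enabled from local/config.php like any
other:

  <?php
  // local/config.php -- sketch; the names below are assumptions
  $FastCacheDir = "$WorkDir/fastcache";            // assumed: where cached HTML is written
  include_once("$FarmD/cookbook/fastcache.php");   // assumed recipe file name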
Gilles.
2016-02-24 14:23 GMT+01:00 Rick Cook <rick at rickcook.name>:
> They were complaining about too many processes and claiming it was putting too much load on their server.
>
> The PmWiki sites involved are very simple with no dynamic content.
>
> Most of the "visitors" are robots of various kinds. The highest number of access_log entries for any single day this year, across my whole account (~10 PmWiki sites and 2 WordPress sites), was about 9200. Typically, it was more like 2500 a day. Since this all started, I have put a more restrictive robots.txt file in all of the URL roots and added 192.168 to all of the appropriate .htaccess files (somehow, the non-routable address 192.168.151.0 was one of the more frequent IP addresses in my access_log files). I think it is running more like 1000 per day now.
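> For what it's worth, a block of that sort looks roughly like this in .htaccess (Apache 2.2 syntax shown; Apache 2.4 uses Require directives instead):
>
>   Order Allow,Deny
>   Allow from all
>   Deny from 192.168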
>
> They suggested several IP addresses to block. One was the dynamic IP assigned to my home connection and another was their site monitoring service. Most of the other addresses were attached to well-known sites like Google.
>
>
> Rick Cook
>
>> On Feb 24, 2016, at 05:59, ABClf <languefrancaise at gmail.com> wrote:
>>
>> Why are they complaining: because of "too many" (quantitative) or
>> because of "too heavy" (your sites cannibalize the server)?
>>
>> Maybe you use pagelists that are too heavy? What I mean is that X processes
>> doesn't tell us how CPU-hungry they are (serving a simple page vs.
>> building a complex page from a multi-parameter pagelist). How many users
>> visit your sites in the busy hours?
>>
>> In that case, if possible, you might be happy with the FastCache recipe.
>> That's a must-have for me as I use pagelists and PTVs a lot. It works
>> nicely (40 visitors in the busy hours).
>> I never delete all cached pages; I run a cron job which deletes the 300
>> oldest cached pages every day.
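>> As a sketch (the cache path is a placeholder, it assumes file names without
>> spaces, and it removes everything if fewer than 300 files exist), such a
>> cron entry could be something like:
>>
>>   # run daily at 03:00: remove the 300 oldest files from the cache directory
>>   0 3 * * * cd /path/to/fastcache && ls -1t | tail -n 300 | xargs -r rm -f --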
>>
>> I have been in trouble with my host recently, when my .pageindex was
>> lost for an unknown reason: it is hard to rebuild, and the process was
>> taking all the CPU for minutes. So they blocked my site for a few hours.
>> (Shared host, "premium option", 2 cores, 2 GB RAM, 20 GB SSD...).
>>
>> My host didn't give me any useful information I could read to try to
>> understand what was going wrong; they just told me: run htop to see the
>> processes and CPU/memory usage.
>>
>> Gilles.
>>
>> 2016-02-14 11:13 GMT+01:00 Rick Cook <rick at rickcook.name>:
>>> All,
>>>
>>> My hosting provider is complaining about "too many pmwiki.php processes" running from my PmWiki sites. By "too many", they meant more than 100. I have 10 or so sites active with this provider. With that many sites, having 100 or more pmwiki.php processes doesn't seem excessive.
>>>
>>> Has anyone else had this type complaint from their hosting service?
>>>
>>>
>>> Thanks,
>>>
>>> Rick Cook
>>> _______________________________________________
>>> pmwiki-users mailing list
>>> pmwiki-users at pmichaud.com
>>> http://www.pmichaud.com/mailman/listinfo/pmwiki-users