Closed: mary-kate closed this issue 8 years ago
I don't think any maintenance scripts run automatically the way they should. I set up a cron job for updateSpecialPages.php to handle that, but I've never had to do this on other wiki installations; they updated automatically. Something to investigate.
We probably disabled job execution on page loads, so the jobs never get run, or something like that.
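If that's the cause, it would just be the job run rate in LocalSettings.php. A minimal sketch, assuming the "jobs on pageload" setting in question is MediaWiki's $wgJobRunRate (I haven't actually checked what our config does):

// LocalSettings.php
// 0 = never run jobs during web requests; the queue then only drains
// when runJobs.php is invoked explicitly (e.g. from cron).
$wgJobRunRate = 0;

// MediaWiki's default is 1 (roughly one job per page view), which would
// explain why other installs keep their queues down on their own.
// $wgJobRunRate = 1;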
runJobs.php eventually finished, after I had to restart it at least N times a day, and now the job queue length is somewhat more acceptable:
jack@brickimedia:~$ WIKI=en php /var/www/core/maintenance/showJobs.php
PHP: syntax error, unexpected $end, expecting ']' in /etc/php5/cli/conf.d/browscap.ini on line 57
PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20100525+lfs/memcached.so' - libmemcached.so.10: cannot open shared object file: No such file or directory in Unknown on line 0
1
For good measure, I also cleared out all the other wikis' job queues... but really, let's set up a cron job or something for this. I dunno if a daily one would be too much, but at least a weekly one should be doable.
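Something like the following crontab entry might do it. The path mirrors the showJobs.php command above; the schedule and flags are only a suggestion, and you'd repeat (or loop over) the other wiki IDs as needed:

# Hypothetical weekly run, Sundays at 03:00, for the English wiki;
# --maxjobs caps a single run, --quiet suppresses per-job output.
0 3 * * 0  WIKI=en php /var/www/core/maintenance/runJobs.php --maxjobs=5000 --quiet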
Ran showJobs.php on en, was not amused to see over 166k entries queued (it seems that all of the queued jobs are of the following types: EchoNotificationDeleteJob, refreshLinks, SMW\UpdateJob, htmlCacheUpdate). Now running runJobs.php to try to clear it up a bit, but really, it should be way more automated.
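In case it saves a restart marathon next time, here is roughly how I'd inspect and drain a backlog that size in bounded batches. This is only a sketch: --group and --maxjobs are standard showJobs.php/runJobs.php options, but the batch size and the loop itself are made up:

# Per-type counts, to confirm what's actually piling up:
WIKI=en php /var/www/core/maintenance/showJobs.php --group

# Drain the queue in batches so a crash only loses the current batch;
# tail -n 1 skips any PHP startup warnings printed before the count.
while true; do
    queued=$(WIKI=en php /var/www/core/maintenance/showJobs.php | tail -n 1)
    [ "$queued" -gt 0 ] || break
    WIKI=en php /var/www/core/maintenance/runJobs.php --maxjobs=1000
done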