Is it still an open issue?
One of our servers has tons of files in its `data/jobs/*` directories. Not only logs, but also entire books with mp3 files. Fortunately, only failed jobs are kept. Is it configurable, though?
Probably setting up a script in the server's crontab would be the easiest approach... you can also check `dp2 size -l` (the output format is a bit messy, but it should be easy to parse with `cut`).
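Until then, a rough sketch of the crontab approach (the data path and the 30-day retention window are just assumptions for illustration; this only removes files on disk, not DB entries):

```sh
#!/bin/sh
# Hypothetical cleanup script: prune job data older than 30 days.
# The data directory and the retention window are assumptions --
# adjust both to your installation. Files only; DB entries remain.
PIPELINE_JOBS=/opt/daisy-pipeline2/data/jobs   # assumed location

find "$PIPELINE_JOBS" -mindepth 1 -maxdepth 1 -type d -mtime +30 \
  -exec rm -rf {} +

# Illustrative crontab entry (crontab -e) to run it nightly at 03:00:
# 0 3 * * * /usr/local/bin/pipeline-cleanup.sh
```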
I'll add a `clean` command to get rid of errored and done jobs.
great :+1:
Done in https://github.com/daisy/pipeline-cli-go/pull/8. The new command looks like this:

```
18:12 javi at disorder clean ✔
~/src/golibs/src/github.com/daisy/pipeline-cli-go/dp2 $ ./dp2 help clean
Usage: dp2 [GLOBAL_OPTIONS] clean [OPTIONS]

Removes the jobs with an ERROR status

Options:
  -d,--done   Removes also the jobs with a DONE status
```
enjoy!
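For anyone finding this later, usage based on the help text above (the crontab line is just an illustration, assuming `dp2` is on cron's PATH):

```sh
# Remove jobs with an ERROR status:
dp2 clean

# Also remove jobs with a DONE status:
dp2 clean --done

# Illustrative crontab entry to run the cleanup nightly at 03:00:
# 0 3 * * * dp2 clean --done
```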
From rdeltour@gmail.com on September 25, 2012 15:15:30
As jobs' data accumulates in the data directory, it would be helpful to support user-configurable "cleanup" cron jobs to regularly remove old data (files + DB entries).
Original issue: http://code.google.com/p/daisy-pipeline/issues/detail?id=218