vpg / disturb

Distributed Workflow Processor
MIT License

refs #97 Adding first Worker UT #104

Closed: jrmbrgs closed this pull request 6 years ago.
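
The pull request adds the first unit test for the Worker layer. As a rough illustration of the kind of behaviour such a test can pin down, here is a minimal PHPUnit sketch built around the job-reservation rule visible in the Elasticsearch update scripts quoted in the build logs below ("if the job is already reserved, do nothing"). `DummyWorker`, its methods and the test class are hypothetical stand-ins for illustration only, not the actual Vpg\Disturb API.

```php
<?php
// Sketch only: DummyWorker is a hypothetical stand-in for a disturb worker.
// It mirrors the "already reserved => noop" rule seen in the update scripts
// quoted in the Travis logs further down in this thread.
namespace Tests\Library\Worker;

use PHPUnit\Framework\TestCase;

class DummyWorker
{
    private $jobList = [];

    public function reserveJob($jobId, $workerCode)
    {
        if (isset($this->jobList[$jobId]['reservedBy'])) {
            return false; // already reserved by another worker: do nothing
        }
        $this->jobList[$jobId] = [
            'reservedBy' => $workerCode,
            'executedOn' => gethostname(),
        ];
        return true;
    }
}

class DummyWorkerTest extends TestCase
{
    public function testAJobCanOnlyBeReservedOnce()
    {
        $worker = new DummyWorker();
        $this->assertTrue($worker->reserveJob(0, 'worker-01'));
        $this->assertFalse($worker->reserveJob(0, 'worker-02'));
    }
}
```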

coveralls commented 6 years ago

Coverage Status

Coverage increased (+12.09%) to 75.0% when pulling 1676988c78355435a843ec2d9882670ddf6dc367 on worker_ut into 5af35d6cd7a5cf626a74065d6521554133369594 on alpha.

TravisBuddy commented 6 years ago

Travis tests have failed

Hey Jérôme Bourgeais, please read the following log to understand the failure reason. It'll be awesome if you fix what's wrong and commit the changes.

1st Build

vendor/phpunit/phpunit/phpunit -c Tests/phpunit.xml
```
PHPUnit 6.5.5 by Sebastian Bergmann and contributors.

Runtime:       PHP 7.1.11 with Xdebug 2.5.5
Configuration: /home/travis/build/vpg/disturb/Tests/phpunit.xml

2018-01-18 10:26:51 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Client/config.json'
2018-01-18 10:26:51 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:51 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:52 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.....2018-01-18 10:26:52 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-18 10:26:52 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:52 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
......2018-01-18 10:26:52 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
...2018-01-18 10:26:52 [INFO] Connecting to Elastic "https:\/\/badhost"
2018-01-18 10:26:52 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
......2018-01-18 10:26:53 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Core/Storage/Config/validWorkflowConfig.json'
2018-01-18 10:26:53 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:53 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/InvalidWorkflowConfig-MissingStorageAdapter.json'
.2018-01-18 10:26:53 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/InvalidWorkflowConfig-WrongStorageAdapter.json'
.2018-01-18 10:26:53 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/InvalidWorkflowConfig-WrongStorageAdapterConfig.json'
.2018-01-18 10:26:53 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Core/Storage/Config/validWorkflowConfig.json'
.......2018-01-18 10:26:53 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Monitoring/config.json'
2018-01-18 10:26:53 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:54 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Monitoring/config.json'
2018-01-18 10:26:54 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:54 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Monitoring/config.json'
2018-01-18 10:26:54 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
...2018-01-18 10:26:54 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-18 10:26:54 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/parallelized.json'
2018-01-18 10:26:54 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:54 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:54 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:54 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:54 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:54 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
[2018-01-18T10:26:55,058][WARN ][o.e.a.u.UpdateHelper ] [yzbuC3w] Used upsert operation [noop] for script [ def nbStep = ctx._source.steps.size(); def jobHash = ['reservedBy':params.workerCode, 'executedOn':params.workerHostname]; // loop over steps for (int stepIndex = 0; stepIndex < nbStep; stepIndex++) { def step = ctx._source.steps[stepIndex]; // if its a parrallelized steps node, loop over each if (step instanceof List) { def nbParallelizedStep = step.size(); for (int parallelizedStepIndex= 0; parallelizedStepIndex< nbParallelizedStep; parallelizedStepIndex++) { // if the given step is found, look for the given job if (step[parallelizedStepIndex].name == params.stepCode) { def nbJob = step[parallelizedStepIndex]['jobList'].size(); for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) { def job = step[parallelizedStepIndex]['jobList'][jobIndex]; if (job.id == params.jobId) { // if job's already reserved : noop if (job.containsKey('reservedBy')) { ctx.op = 'noop'; break; } ctx._source.steps[stepIndex][parallelizedStepIndex]['jobList'][jobIndex] .putAll(jobHash); break; } } break; } } } else if (step.name == params.stepCode) { def nbJob = step.jobList.size(); for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) { def job = ctx._source.steps[stepIndex]['jobList'][jobIndex]; if (job.id == params.jobId) { // if job's already reserved : noop if (job.containsKey('reservedBy')) { ctx.op = 'noop'; break; } ctx._source.steps[stepIndex]['jobList'][jobIndex].putAll(jobHash); break; } } } }], doing nothing...
.2018-01-18 10:26:55 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:55 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:55 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
[2018-01-18T10:26:55,435][WARN ][o.e.a.u.UpdateHelper ] [yzbuC3w] Used upsert operation [noop] for script [ int nbStep = ctx._source.steps.size(); def jobHash = ['status':params.jobStatus, 'finishedAt':params.jobFinishedAt, 'result':params.jobResult]; // loop over steps for (int stepIndex = 0; stepIndex < nbStep; stepIndex++) { def step = ctx._source.steps[stepIndex]; // if its a parrallelized steps node, loop over each if (step instanceof List) { int nbParallelizedStep = step.size(); for (int parallelizedStepIndex= 0; parallelizedStepIndex< nbParallelizedStep; parallelizedStepIndex++) { // if the given step is found, look for the given job if (step[parallelizedStepIndex].name == params.stepCode) { def nbJob = step[parallelizedStepIndex]['jobList'].size(); for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) { def job = step[parallelizedStepIndex]['jobList'][jobIndex]; if (job.id == params.jobId) { // if job's already finalized : noop if (job.containsKey('finishedAt')) { ctx.op = 'noop'; break; } ctx._source.steps[stepIndex][parallelizedStepIndex]['jobList'][jobIndex] .putAll(jobHash); break; } } break; } } } else if (step.name == params.stepCode) { int nbJob = step.jobList.size(); for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) { def job = ctx._source.steps[stepIndex]['jobList'][jobIndex]; if (job.id == params.jobId) { // if job's already finalized : noop if (job.containsKey('finishedAt')) { ctx.op = 'noop'; break; } ctx._source.steps[stepIndex]['jobList'][jobIndex].putAll(jobHash); break; } } } }], doing nothing...
.2018-01-18 10:26:55 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:55 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:55 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:55 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:55 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:55 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:26:56 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:26:57 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-18 10:26:57 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
EE...... 53 / 53 (100%)

Time: 5.48 seconds, Memory: 12.00MB

There were 2 errors:

1) Tests\Library\Context\ManagerWorkerTest::testStartWorkflow
Error: Class 'Vpg\Disturb\Core\Worker\Cli\Console' not found

/home/travis/build/vpg/disturb/Library/Core/Worker/AbstractWorker.php:113
/home/travis/build/vpg/disturb/Tests/Library/Workflow/ManagerWorkerTest.php:64

2) Tests\Library\Context\ManagerWorkerTest::testBadInitClientClass
Error: Class 'Vpg\Disturb\Core\Worker\Cli\Console' not found

/home/travis/build/vpg/disturb/Library/Core/Worker/AbstractWorker.php:113
/home/travis/build/vpg/disturb/Tests/Library/Workflow/ManagerWorkerTest.php:113

ERRORS!
Tests: 53, Assertions: 97, Errors: 2.

Generating code coverage report in Clover XML format ... done
```
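
The WARN entries above embed the Painless script that reserves a job for a worker, collapsed onto a single log line. For readability, here is a hedged sketch of how such an update could be issued with the official elasticsearch-php client, with the script body reformatted from the log; the index, type, document id and parameter values are made-up placeholders, and this is not the project's actual storage-adapter code.

```php
<?php
// Sketch only: sends the job-reservation upsert seen in the Travis log via
// the elasticsearch-php client. Index/type/id and params values are made up.
require __DIR__ . '/vendor/autoload.php';

$client = Elasticsearch\ClientBuilder::create()
    ->setHosts(['http://127.0.0.1:9200'])
    ->build();

// Painless script copied from the WARN entry above, reformatted for readability.
$reserveJobScript = <<<'PAINLESS'
def nbStep = ctx._source.steps.size();
def jobHash = ['reservedBy': params.workerCode, 'executedOn': params.workerHostname];
// loop over steps
for (int stepIndex = 0; stepIndex < nbStep; stepIndex++) {
    def step = ctx._source.steps[stepIndex];
    // if it's a parallelized steps node, loop over each branch
    if (step instanceof List) {
        def nbParallelizedStep = step.size();
        for (int parallelizedStepIndex = 0; parallelizedStepIndex < nbParallelizedStep; parallelizedStepIndex++) {
            // if the given step is found, look for the given job
            if (step[parallelizedStepIndex].name == params.stepCode) {
                def nbJob = step[parallelizedStepIndex]['jobList'].size();
                for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) {
                    def job = step[parallelizedStepIndex]['jobList'][jobIndex];
                    if (job.id == params.jobId) {
                        // if the job is already reserved: noop
                        if (job.containsKey('reservedBy')) { ctx.op = 'noop'; break; }
                        ctx._source.steps[stepIndex][parallelizedStepIndex]['jobList'][jobIndex].putAll(jobHash);
                        break;
                    }
                }
                break;
            }
        }
    } else if (step.name == params.stepCode) {
        def nbJob = step.jobList.size();
        for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) {
            def job = ctx._source.steps[stepIndex]['jobList'][jobIndex];
            if (job.id == params.jobId) {
                // if the job is already reserved: noop
                if (job.containsKey('reservedBy')) { ctx.op = 'noop'; break; }
                ctx._source.steps[stepIndex]['jobList'][jobIndex].putAll(jobHash);
                break;
            }
        }
    }
}
PAINLESS;

$client->update([
    'index' => 'disturb_context',          // assumption
    'type'  => 'workflow',                 // assumption (ES 5.x-style mapping type)
    'id'    => 'some-workflow-context-id', // assumption
    'body'  => [
        'script' => [
            'inline' => $reserveJobScript, // use 'source' on Elasticsearch 6.x and later
            'lang'   => 'painless',
            'params' => [
                'stepCode'       => 'someStep',
                'jobId'          => 0,
                'workerCode'     => 'worker-01',
                'workerHostname' => gethostname(),
            ],
        ],
    ],
]);
```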


TravisBuddy commented 6 years ago

Travis tests have failed

Hey Jérôme Bourgeais, please read the following log to understand the failure reason. It'll be awesome if you fix what's wrong and commit the changes.

1st Build

vendor/phpunit/phpunit/phpunit -c Tests/phpunit.xml
```
PHPUnit 6.5.5 by Sebastian Bergmann and contributors.

Runtime:       PHP 7.1.11 with Xdebug 2.5.5
Configuration: /home/travis/build/vpg/disturb/Tests/phpunit.xml

2018-01-18 10:34:36 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Client/config.json'
2018-01-18 10:34:36 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:36 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:37 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.....2018-01-18 10:34:37 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-18 10:34:37 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:37 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
......2018-01-18 10:34:37 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
...2018-01-18 10:34:37 [INFO] Connecting to Elastic "https:\/\/badhost"
2018-01-18 10:34:37 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
......2018-01-18 10:34:37 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Core/Storage/Config/validWorkflowConfig.json'
2018-01-18 10:34:37 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:37 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/InvalidWorkflowConfig-MissingStorageAdapter.json'
.2018-01-18 10:34:37 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/InvalidWorkflowConfig-WrongStorageAdapter.json'
.2018-01-18 10:34:37 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/InvalidWorkflowConfig-WrongStorageAdapterConfig.json'
.2018-01-18 10:34:37 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Core/Storage/Config/validWorkflowConfig.json'
.......2018-01-18 10:34:37 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Monitoring/config.json'
2018-01-18 10:34:37 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:39 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Monitoring/config.json'
2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:39 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Monitoring/config.json'
2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
...2018-01-18 10:34:39 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-18 10:34:39 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/parallelized.json'
2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
[2018-01-18T10:34:39,886][WARN ][o.e.a.u.UpdateHelper ] [HucpFd0] Used upsert operation [noop] for script [ def nbStep = ctx._source.steps.size(); def jobHash = ['reservedBy':params.workerCode, 'executedOn':params.workerHostname]; // loop over steps for (int stepIndex = 0; stepIndex < nbStep; stepIndex++) { def step = ctx._source.steps[stepIndex]; // if its a parrallelized steps node, loop over each if (step instanceof List) { def nbParallelizedStep = step.size(); for (int parallelizedStepIndex= 0; parallelizedStepIndex< nbParallelizedStep; parallelizedStepIndex++) { // if the given step is found, look for the given job if (step[parallelizedStepIndex].name == params.stepCode) { def nbJob = step[parallelizedStepIndex]['jobList'].size(); for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) { def job = step[parallelizedStepIndex]['jobList'][jobIndex]; if (job.id == params.jobId) { // if job's already reserved : noop if (job.containsKey('reservedBy')) { ctx.op = 'noop'; break; } ctx._source.steps[stepIndex][parallelizedStepIndex]['jobList'][jobIndex] .putAll(jobHash); break; } } break; } } } else if (step.name == params.stepCode) { def nbJob = step.jobList.size(); for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) { def job = ctx._source.steps[stepIndex]['jobList'][jobIndex]; if (job.id == params.jobId) { // if job's already reserved : noop if (job.containsKey('reservedBy')) { ctx.op = 'noop'; break; } ctx._source.steps[stepIndex]['jobList'][jobIndex].putAll(jobHash); break; } } } }], doing nothing...
.2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:39 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
[2018-01-18T10:34:40,301][WARN ][o.e.a.u.UpdateHelper ] [HucpFd0] Used upsert operation [noop] for script [ int nbStep = ctx._source.steps.size(); def jobHash = ['status':params.jobStatus, 'finishedAt':params.jobFinishedAt, 'result':params.jobResult]; // loop over steps for (int stepIndex = 0; stepIndex < nbStep; stepIndex++) { def step = ctx._source.steps[stepIndex]; // if its a parrallelized steps node, loop over each if (step instanceof List) { int nbParallelizedStep = step.size(); for (int parallelizedStepIndex= 0; parallelizedStepIndex< nbParallelizedStep; parallelizedStepIndex++) { // if the given step is found, look for the given job if (step[parallelizedStepIndex].name == params.stepCode) { def nbJob = step[parallelizedStepIndex]['jobList'].size(); for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) { def job = step[parallelizedStepIndex]['jobList'][jobIndex]; if (job.id == params.jobId) { // if job's already finalized : noop if (job.containsKey('finishedAt')) { ctx.op = 'noop'; break; } ctx._source.steps[stepIndex][parallelizedStepIndex]['jobList'][jobIndex] .putAll(jobHash); break; } } break; } } } else if (step.name == params.stepCode) { int nbJob = step.jobList.size(); for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) { def job = ctx._source.steps[stepIndex]['jobList'][jobIndex]; if (job.id == params.jobId) { // if job's already finalized : noop if (job.containsKey('finishedAt')) { ctx.op = 'noop'; break; } ctx._source.steps[stepIndex]['jobList'][jobIndex].putAll(jobHash); break; } } } }], doing nothing...
.2018-01-18 10:34:40 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:40 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:40 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:40 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:40 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:40 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-18 10:34:41 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-18 10:34:42 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-18 10:34:42 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
EE...... 53 / 53 (100%)

Time: 5.16 seconds, Memory: 12.00MB

There were 2 errors:

1) Tests\Library\Context\ManagerWorkerTest::testStartWorkflow
Error: Class 'Vpg\Disturb\Core\Worker\Cli\Console' not found

/home/travis/build/vpg/disturb/Library/Core/Worker/AbstractWorker.php:113
/home/travis/build/vpg/disturb/Tests/Library/Workflow/ManagerWorkerTest.php:64

2) Tests\Library\Context\ManagerWorkerTest::testBadInitClientClass
Error: Class 'Vpg\Disturb\Core\Worker\Cli\Console' not found

/home/travis/build/vpg/disturb/Library/Core/Worker/AbstractWorker.php:113
/home/travis/build/vpg/disturb/Tests/Library/Workflow/ManagerWorkerTest.php:113

ERRORS!
Tests: 53, Assertions: 97, Errors: 2.

Generating code coverage report in Clover XML format ... done
```
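
Both builds fail the same way: `Vpg\Disturb\Core\Worker\Cli\Console` cannot be autoloaded when `AbstractWorker.php:113` runs under PHPUnit. One possible way to rule out a test-bootstrap autoloading gap is sketched below; the PSR-4 prefix and paths are assumptions about this repository's layout, and the actual fix applied in the PR may well be different.

```php
<?php
// Hypothetical Tests/bootstrap.php sketch: make sure the Vpg\Disturb\
// namespace (including Core\Worker\Cli\Console) resolves from the test suite.
// The PSR-4 prefix and directory layout are assumptions, not repo facts.
$loader = require __DIR__ . '/../vendor/autoload.php';

// Map the project namespace onto Library/ in case it is not already
// declared in composer.json's autoload section.
$loader->addPsr4('Vpg\\Disturb\\', __DIR__ . '/../Library/');
```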


coveralls commented 6 years ago

Coverage Status

Coverage increased (+12.09%) to 75.0% when pulling 232170ad093ddffd605255bbc3c65fe1296e8816 on worker_ut into 342ab55fbbf018ea0ff330ca90822d1a598ad65d on alpha.

coveralls commented 6 years ago

Coverage Status

Coverage increased (+13.2%) to 76.123% when pulling e2797fe52332dec84d3720444244d4efae54f5f8 on worker_ut into 342ab55fbbf018ea0ff330ca90822d1a598ad65d on alpha.

coveralls commented 6 years ago

Coverage Status

Coverage increased (+14.04%) to 76.95% when pulling eb2bbf46d178402d322f58832ebd050dbb8f6e7a on worker_ut into 342ab55fbbf018ea0ff330ca90822d1a598ad65d on alpha.

coveralls commented 6 years ago

Coverage Status

Coverage increased (+19.8%) to 82.742% when pulling 04d1e541f323c94909164e0a6ed2e6c9a150436c on worker_ut into 342ab55fbbf018ea0ff330ca90822d1a598ad65d on alpha.

coveralls commented 6 years ago

Coverage Status

Coverage increased (+19.8%) to 82.742% when pulling e091fdc962adc4af84483a4e3736bb58e9a62cb1 on worker_ut into 342ab55fbbf018ea0ff330ca90822d1a598ad65d on alpha.