vpg / disturb

Distributed Workflow Processor
MIT License

refs #97 fixing DTO validator (#107)

Closed: jrmbrgs closed this pull request 6 years ago

TravisBuddy commented 6 years ago

Travis tests have failed

Hey Jérôme Bourgeais, Please read the following log in order to understand the failure reason. It'll be awesome if you fix what's wrong and commit the changes.

1st Build

vendor/phpunit/phpunit/phpunit -c Tests/phpunit.xml

```
PHPUnit 6.5.5 by Sebastian Bergmann and contributors.

Runtime: PHP 7.1.11 with Xdebug 2.5.5
Configuration: /home/travis/build/vpg/disturb/Tests/phpunit.xml

2018-01-19 08:26:04 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Client/config.json'
2018-01-19 08:26:04 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:04 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:05 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.....2018-01-19 08:26:05 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-19 08:26:05 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:05 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
......2018-01-19 08:26:05 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
...2018-01-19 08:26:05 [INFO] Connecting to Elastic "https:\/\/badhost"
2018-01-19 08:26:05 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
......2018-01-19 08:26:05 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Core/Storage/Config/validWorkflowConfig.json'
2018-01-19 08:26:05 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:05 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/InvalidWorkflowConfig-MissingStorageAdapter.json'
.2018-01-19 08:26:05 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/InvalidWorkflowConfig-WrongStorageAdapter.json'
.2018-01-19 08:26:05 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/InvalidWorkflowConfig-WrongStorageAdapterConfig.json'
.2018-01-19 08:26:05 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Core/Storage/Config/validWorkflowConfig.json'
.......2018-01-19 08:26:05 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Monitoring/config.json'
2018-01-19 08:26:05 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:06 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Monitoring/config.json'
2018-01-19 08:26:06 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:06 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Library/Monitoring/config.json'
2018-01-19 08:26:06 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:06 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-19 08:26:06 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:06 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serieWrongClientClass.json'
2018-01-19 08:26:06 [INFO] Setting consumer group to badfoo
.2018-01-19 08:26:07 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-19 08:26:07 [INFO] Setting consumer group to manager
2018-01-19 08:26:07 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:07 [INFO] 🚀 Starting workflow test0.112923001516350367
2018-01-19 08:26:07 [INFO] Nb job(s) to run for foo : 1
2018-01-19 08:26:08 [INFO] Ask job #0 for test0.112923001516350367 : foo
2018-01-19 08:26:08 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-19 08:26:08 [INFO] Setting consumer group to foo
2018-01-19 08:26:08 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:08 [INFO] messageDto : {"id":"test0.112923001516350367","type":"STEP-CTRL","stepCode":"foo","jobId":"1","action":"start","payload":{"foo":"bar0"}}
...2018-01-19 08:26:10 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-19 08:26:10 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/parallelized.json'
2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
[2018-01-19T08:26:10,647][WARN ][o.e.a.u.UpdateHelper ] [tdYtD2a] Used upsert operation [noop] for script [
    def nbStep = ctx._source.steps.size();
    def jobHash = ['reservedBy':params.workerCode, 'executedOn':params.workerHostname];
    // loop over steps
    for (int stepIndex = 0; stepIndex < nbStep; stepIndex++) {
        def step = ctx._source.steps[stepIndex];
        // if its a parrallelized steps node, loop over each
        if (step instanceof List) {
            def nbParallelizedStep = step.size();
            for (int parallelizedStepIndex = 0; parallelizedStepIndex < nbParallelizedStep; parallelizedStepIndex++) {
                // if the given step is found, look for the given job
                if (step[parallelizedStepIndex].name == params.stepCode) {
                    def nbJob = step[parallelizedStepIndex]['jobList'].size();
                    for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) {
                        def job = step[parallelizedStepIndex]['jobList'][jobIndex];
                        if (job.id == params.jobId) {
                            // if job's already reserved : noop
                            if (job.containsKey('reservedBy')) {
                                ctx.op = 'noop';
                                break;
                            }
                            ctx._source.steps[stepIndex][parallelizedStepIndex]['jobList'][jobIndex].putAll(jobHash);
                            break;
                        }
                    }
                    break;
                }
            }
        } else if (step.name == params.stepCode) {
            def nbJob = step.jobList.size();
            for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) {
                def job = ctx._source.steps[stepIndex]['jobList'][jobIndex];
                if (job.id == params.jobId) {
                    // if job's already reserved : noop
                    if (job.containsKey('reservedBy')) {
                        ctx.op = 'noop';
                        break;
                    }
                    ctx._source.steps[stepIndex]['jobList'][jobIndex].putAll(jobHash);
                    break;
                }
            }
        }
    }
], doing nothing...
.2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
[2018-01-19T08:26:10,935][WARN ][o.e.a.u.UpdateHelper ] [tdYtD2a] Used upsert operation [noop] for script [
    int nbStep = ctx._source.steps.size();
    def jobHash = ['status':params.jobStatus, 'finishedAt':params.jobFinishedAt, 'result':params.jobResult];
    // loop over steps
    for (int stepIndex = 0; stepIndex < nbStep; stepIndex++) {
        def step = ctx._source.steps[stepIndex];
        // if its a parrallelized steps node, loop over each
        if (step instanceof List) {
            int nbParallelizedStep = step.size();
            for (int parallelizedStepIndex = 0; parallelizedStepIndex < nbParallelizedStep; parallelizedStepIndex++) {
                // if the given step is found, look for the given job
                if (step[parallelizedStepIndex].name == params.stepCode) {
                    def nbJob = step[parallelizedStepIndex]['jobList'].size();
                    for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) {
                        def job = step[parallelizedStepIndex]['jobList'][jobIndex];
                        if (job.id == params.jobId) {
                            // if job's already finalized : noop
                            if (job.containsKey('finishedAt')) {
                                ctx.op = 'noop';
                                break;
                            }
                            ctx._source.steps[stepIndex][parallelizedStepIndex]['jobList'][jobIndex].putAll(jobHash);
                            break;
                        }
                    }
                    break;
                }
            }
        } else if (step.name == params.stepCode) {
            int nbJob = step.jobList.size();
            for (int jobIndex = 0; jobIndex < nbJob; jobIndex++) {
                def job = ctx._source.steps[stepIndex]['jobList'][jobIndex];
                if (job.id == params.jobId) {
                    // if job's already finalized : noop
                    if (job.containsKey('finishedAt')) {
                        ctx.op = 'noop';
                        break;
                    }
                    ctx._source.steps[stepIndex]['jobList'][jobIndex].putAll(jobHash);
                    break;
                }
            }
        }
    }
], doing nothing...
.2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:10 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:11 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:11 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:11 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:11 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:11 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:11 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:11 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:11 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:11 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:12 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:12 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:12 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:12 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:12 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:12 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
.2018-01-19 08:26:12 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-19 08:26:12 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:12 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-19 08:26:12 [INFO] Setting consumer group to manager
2018-01-19 08:26:12 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:12 [INFO] 🚀 Starting workflow test0.489907001516350372
2018-01-19 08:26:12 [INFO] Nb job(s) to run for foo : 1
2018-01-19 08:26:12 [INFO] Ask job #0 for test0.489907001516350372 : foo
.2018-01-19 08:26:14 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serie.json'
2018-01-19 08:26:14 [INFO] Setting consumer group to manager
2018-01-19 08:26:14 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
2018-01-19 08:26:14 [INFO] 🚀 Starting workflow test0.616312001516350374
2018-01-19 08:26:14 [INFO] Nb job(s) to run for foo : 1
2018-01-19 08:26:14 [INFO] Ask job #0 for test0.616312001516350374 : foo
2018-01-19 08:26:14 [INFO] 🚀 Starting workflow test0.616312001516350374
2018-01-19 08:26:14 [ERROR] Failed to start workflow : Vpg\Disturb\Workflow\ManagerService::init : Failed to init workflow 'test0.616312001516350374' : existing context
..2018-01-19 08:26:15 [INFO] Loading Workflow config from '/home/travis/build/vpg/disturb/Tests/Config/serieWrongClientClass.json'
2018-01-19 08:26:15 [INFO] Setting consumer group to manager
2018-01-19 08:26:15 [INFO] Connecting to Elastic "http:\/\/127.0.0.1:9200"
....F.. 58 / 58 (100%)

Time: 12.34 seconds, Memory: 12.00MB

There was 1 failure:

1) Tests\Library\Workflow\WorkflowConfigDtoFactoryTest::testDTOMissingProps
Failed asserting that exception of type "Vpg\Disturb\Workflow\InvalidWorkflowConfigException" is thrown.

FAILURES!
Tests: 58, Assertions: 106, Failures: 1.

Generating code coverage report in Clover XML format ... done
```


coveralls commented 6 years ago

Coverage Status

Coverage remained the same at 82.84% when pulling bc66c95df4b796f6183e7a00f798584ec0f79f98 on worker_ut into 9dfc707b268eb0df7a1388db47c3e400a0597b25 on alpha.