Closed: ajtucker closed this issue 5 years ago
Build 'GSS_data/Migration/Scotland-overseas' is failing!
Last 50 lines of build output:
[...truncated 156 lines...]
------------------
---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
~var/jenkins_home/workspace/GSS_data/Migration/Scotland-overseas/migration-by-age-2001-to-2017-females.ipynb in <module>
----> 1 tab = [t for t in tabs if t.name == 'SYOA Feales (2001-)'][0]
      2 cell = tab.filter('Year')
      3 age = cell.fill(RIGHT).is_not_blank().is_not_whitespace() | cell.shift(0,1).fill(RIGHT).is_not_blank().is_not_whitespace()
      4 year = cell.shift(0,1).expand(DOWN).is_not_blank().is_not_whitespace()
      5 flow = tab.filter(contains_string('migration')).is_not_blank().is_not_whitespace()

IndexError: list index out of range
IndexError: list index out of range
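
Note: the IndexError comes from taking [0] of an empty list: no tab in the workbook matches the name 'SYOA Feales (2001-)', which looks like a typo for 'Females' (this is the females transform notebook). Below is a minimal sketch of a more defensive tab lookup, assuming databaker-style tabs with a .name attribute; find_tab and the corrected sheet name are illustrative, not the notebook's actual code.

# Sketch only: `tabs` is assumed to be the list of sheets loaded by databaker;
# the sheet name 'SYOA Females (2001-)' is a guess at the intended (typo-free) name.
def find_tab(tabs, name):
    """Return the single tab with the given name, or fail listing what is available."""
    matches = [t for t in tabs if t.name == name]
    if len(matches) != 1:
        available = ", ".join(t.name for t in tabs)
        raise ValueError(f"Expected exactly one tab named {name!r}, found {len(matches)}; "
                         f"available tabs: {available}")
    return matches[0]

tab = find_tab(tabs, 'SYOA Females (2001-)')

Failing this way surfaces the mismatched sheet name directly in the build log instead of an opaque "list index out of range".
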
[Pipeline] }
[Pipeline] // ansiColor
[Pipeline] }
[Pipeline] // script
[Pipeline] }
$ docker stop --time=1 af447428204334f1d6787fa76df7fde30e40604ce0114fddb4469ea3b9a8433d
$ docker rm -f af447428204334f1d6787fa76df7fde30e40604ce0114fddb4469ea3b9a8433d
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Validate CSV)
Stage "Validate CSV" skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Upload Tidy Data)
Stage "Upload Tidy Data" skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test draft dataset)
Stage "Test draft dataset" skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Publish)
Stage "Publish" skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] step
Changes since last successful build:
[Alex Tucker] d304a5acc8766e52976306021fa62c6ab7bae04c - Add template transformation pipeline.
[Alex Tucker] 7eda5c2f5f1a251462cd0a2820b201263fd36e11 - Add getDraftsetEndpoint
[Alex Tucker] 7c9c4e98b4ee0c634379ab5253c3293b426e7301 - Add SPARQL tests step and publish results.
[Alex Tucker] 806951bf2ee882eb722ed68d14041b7f0e928fea - SPARQL tests need union-with-live=true
[Alex Tucker] d747d17b909dea7f689436a623263ff531631c94 - Don't trigger GDP-tests as they're now run on the unpublished draftset.
[Alex Tucker] 57b7cae64f2422e744b548d8533c87ea96b0c7b0 - Template for multiple datasets.
[Alex Tucker] 554340c2f38e69909c77daea277188852640b339 - Disambiguate use of 'dataset'
[Alex Tucker] 1452178f3d07debcb12de21cb3782810370ff2c0 - Use $refFamily
[Alex Tucker] 0ab4797047cdc3a26e01fe2cfd6da3da461c57f3 - Try to always update Trello card.
[Alex Tucker] 2c3a2d2dcad68f95080061335209f3d7241b6af9 - If main.py exists, convert to notebook using jupytext.
[Alex Tucker] 1acd29462408540f9fbc087e739c4dd458b20af7 - Convert any .py files to .ipynb using jupytext.
No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes
[vtula2000] 2534cb329ceffb69653ada0a472812ca2ccb10e2 - Data Baker errors
[vtula2000] a07e390200b402772a26040edf55b3a328f3c17f - Error
[vtula2000] 51b6e97a17334f84f7452df5a77e3912aa03e051 - Errors
[Alex Tucker] 0ab9caa46fd9a9ce37f92f6ff1886ef7d0fad336 - Use GitHub issues rather than updating a Trello card.
[Alex Tucker] b57d083813b089ab8107b2342f26cf94bf589f8b - Use gsscogs Docker images.
No changes
[Alex Tucker] cc2bca87d857753755272fa5aaa08034ec2428f6 - Add template for ref_* pipelines.
[Alex Tucker] fd491cc35f5f1a690f227435207d77378bce174e - Also validate columns.csv and components.csv.
No changes
[Alex Tucker] 5f5d4b46d7d338090ecfa2d02ede7ae8c0a498da - Ignore RequestAborted exceptions in empty_cache/sync_search calls.
[Alex Tucker] ef905fd5f2db85565dc4eb0bde049534e4846b0f - Use --no-verbose to cut down line length of csvlint output, so it can be
[mikelovesbooks] e5c21ac5739e95253ebfc199f10a5e6b87f2e024 - update to new website taxonomy, update explicit row numbers
No changes No changes No changes No changes
[vtula2000] 8650cfd5df2eb4c43840368b4534201206f95d84 - .py and schema update
[mikelovesbooks] 60a9955031bd2846e02ccde85223f07ac1eca41d - moved to databaker to deal with duplication from explicit row slicing
[mikelovesbooks] 05642c0ba31280fe2f68a024000c25b1cdae8f60 - add source column, add temporary fix for precision differnces
Build 'GSS_data/Migration/Scotland-overseas' is failing!
Last 50 lines of build output:
[...truncated 231 lines...]
------------------
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-27-de2f78ef2b19> in <module>
     42 else:
     43     raise ValueError("Aborting. We have duplicate numbers that appear to be differentiated by "
---> 44                      "more than just rounding off, these: {}".format(",".join([str(x) for x in both_values])))
     45
     46 # Now we compare precision and drop the least precise

ValueError: Aborting. We have duplicate numbers that appear to be differentiated by more than just rounding off, these: 12000,11985
ValueError: Aborting. We have duplicate numbers that appear to be differentiated by more than just rounding off, these: 12000,11985
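
Note: this second failure is the notebook's own duplicate-handling guard firing: two values for what should be the same observation (12000 and 11985) differ by more than a rounding difference, so the transform aborts rather than silently picking one. Below is a minimal sketch of that kind of guard, assuming "rounding off" means agreement after rounding to whole numbers and that the candidate pair is held in both_values; both are assumptions, not the notebook's actual code.

# Sketch only: both_values and the rounding rule are assumptions for illustration.
def differ_only_by_rounding(a, b, decimals=0):
    """True if the two figures agree once rounded to the given number of decimal places."""
    return round(a, decimals) == round(b, decimals)

both_values = [12000, 11985]
if differ_only_by_rounding(*both_values):
    # keep the more precise figure and drop the other
    pass
else:
    raise ValueError("Aborting. We have duplicate numbers that appear to be differentiated by "
                     "more than just rounding off, these: {}".format(",".join(str(x) for x in both_values)))

Under that rule 12000 and 11985 still disagree, which is why the run aborts here; resolving the duplicate in the source data (or deliberately loosening the rule) would unblock the pipeline.
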
[Pipeline] }
[Pipeline] // ansiColor
[Pipeline] }
[Pipeline] // script
[Pipeline] }
$ docker stop --time=1 4de491a9d5e255a6e843e4706cf48ca1268a795876ef2dd5b64757e60f180718
$ docker rm -f 4de491a9d5e255a6e843e4706cf48ca1268a795876ef2dd5b64757e60f180718
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Validate CSV)
Stage "Validate CSV" skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Upload Tidy Data)
Stage "Upload Tidy Data" skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test draft dataset)
Stage "Test draft dataset" skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Publish)
Stage "Publish" skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] step
Changes since last successful build:
[Alex Tucker] d304a5acc8766e52976306021fa62c6ab7bae04c - Add template transformation pipeline.
[Alex Tucker] 7eda5c2f5f1a251462cd0a2820b201263fd36e11 - Add getDraftsetEndpoint
[Alex Tucker] 7c9c4e98b4ee0c634379ab5253c3293b426e7301 - Add SPARQL tests step and publish results.
[Alex Tucker] 806951bf2ee882eb722ed68d14041b7f0e928fea - SPARQL tests need union-with-live=true
[Alex Tucker] d747d17b909dea7f689436a623263ff531631c94 - Don't trigger GDP-tests as they're now run on the unpublished draftset.
[Alex Tucker] 57b7cae64f2422e744b548d8533c87ea96b0c7b0 - Template for multiple datasets.
[Alex Tucker] 554340c2f38e69909c77daea277188852640b339 - Disambiguate use of 'dataset'
[Alex Tucker] 1452178f3d07debcb12de21cb3782810370ff2c0 - Use $refFamily
[Alex Tucker] 0ab4797047cdc3a26e01fe2cfd6da3da461c57f3 - Try to always update Trello card.
[Alex Tucker] 2c3a2d2dcad68f95080061335209f3d7241b6af9 - If main.py exists, convert to notebook using jupytext.
[Alex Tucker] 1acd29462408540f9fbc087e739c4dd458b20af7 - Convert any .py files to .ipynb using jupytext.
No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes
[vtula2000] 2534cb329ceffb69653ada0a472812ca2ccb10e2 - Data Baker errors
[vtula2000] a07e390200b402772a26040edf55b3a328f3c17f - Error
[vtula2000] 51b6e97a17334f84f7452df5a77e3912aa03e051 - Errors
[Alex Tucker] 0ab9caa46fd9a9ce37f92f6ff1886ef7d0fad336 - Use GitHub issues rather than updating a Trello card.
[Alex Tucker] b57d083813b089ab8107b2342f26cf94bf589f8b - Use gsscogs Docker images.
No changes
[Alex Tucker] cc2bca87d857753755272fa5aaa08034ec2428f6 - Add template for ref_* pipelines.
[Alex Tucker] fd491cc35f5f1a690f227435207d77378bce174e - Also validate columns.csv and components.csv.
No changes
[Alex Tucker] 5f5d4b46d7d338090ecfa2d02ede7ae8c0a498da - Ignore RequestAborted exceptions in empty_cache/sync_search calls.
[Alex Tucker] ef905fd5f2db85565dc4eb0bde049534e4846b0f - Use --no-verbose to cut down line length of csvlint output, so it can be
[mikelovesbooks] e5c21ac5739e95253ebfc199f10a5e6b87f2e024 - update to new website taxonomy, update explicit row numbers
No changes No changes No changes No changes
[vtula2000] 8650cfd5df2eb4c43840368b4534201206f95d84 - .py and schema update
[mikelovesbooks] 60a9955031bd2846e02ccde85223f07ac1eca41d - moved to databaker to deal with duplication from explicit row slicing
[mikelovesbooks] 05642c0ba31280fe2f68a024000c25b1cdae8f60 - add source column, add temporary fix for precision differnces
[mikelovesbooks] 1f5bf662ed200a206cee9f7fd2d29ee720decf83 - fix typo
Build was fixed!
Build 'GSS_data/Migration/Scotland-overseas' is failing!
Last 50 lines of build output:
Changes since last successful build:
[Alex Tucker] d304a5acc8766e52976306021fa62c6ab7bae04c - Add template transformation pipeline.
[Alex Tucker] 7eda5c2f5f1a251462cd0a2820b201263fd36e11 - Add getDraftsetEndpoint
[Alex Tucker] 7c9c4e98b4ee0c634379ab5253c3293b426e7301 - Add SPARQL tests step and publish results.
[Alex Tucker] 806951bf2ee882eb722ed68d14041b7f0e928fea - SPARQL tests need union-with-live=true
[Alex Tucker] d747d17b909dea7f689436a623263ff531631c94 - Don't trigger GDP-tests as they're now run on the unpublished draftset.
[Alex Tucker] 57b7cae64f2422e744b548d8533c87ea96b0c7b0 - Template for multiple datasets.
[Alex Tucker] 554340c2f38e69909c77daea277188852640b339 - Disambiguate use of 'dataset'
[Alex Tucker] 1452178f3d07debcb12de21cb3782810370ff2c0 - Use $refFamily
[Alex Tucker] 0ab4797047cdc3a26e01fe2cfd6da3da461c57f3 - Try to always update Trello card.
[Alex Tucker] 2c3a2d2dcad68f95080061335209f3d7241b6af9 - If main.py exists, convert to notebook using jupytext.
[Alex Tucker] 1acd29462408540f9fbc087e739c4dd458b20af7 - Convert any .py files to .ipynb using jupytext.
No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes No changes
[vtula2000] 2534cb329ceffb69653ada0a472812ca2ccb10e2 - Data Baker errors
[vtula2000] a07e390200b402772a26040edf55b3a328f3c17f - Error
[vtula2000] 51b6e97a17334f84f7452df5a77e3912aa03e051 - Errors
[Alex Tucker] 0ab9caa46fd9a9ce37f92f6ff1886ef7d0fad336 - Use GitHub issues rather than updating a Trello card.
[Alex Tucker] b57d083813b089ab8107b2342f26cf94bf589f8b - Use gsscogs Docker images.
No changes
[Alex Tucker] cc2bca87d857753755272fa5aaa08034ec2428f6 - Add template for ref_* pipelines.
[Alex Tucker] fd491cc35f5f1a690f227435207d77378bce174e - Also validate columns.csv and components.csv.
No changes
[Alex Tucker] 5f5d4b46d7d338090ecfa2d02ede7ae8c0a498da - Ignore RequestAborted exceptions in empty_cache/sync_search calls.
[Alex Tucker] ef905fd5f2db85565dc4eb0bde049534e4846b0f - Use --no-verbose to cut down line length of csvlint output, so it can be
[mikelovesbooks] e5c21ac5739e95253ebfc199f10a5e6b87f2e024 - update to new website taxonomy, update explicit row numbers
No changes No changes No changes No changes
View full output