Open · jvivian opened 7 years ago
@fnothaft — In spark.py I added the subprocess. prefix to some invocations of check_call and check_output, if you want me to revert that let me know.
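A minimal sketch of the change described above: qualifying the bare `check_call`/`check_output` names with the `subprocess` module rather than importing them directly (the commands shown are illustrative, not from spark.py):

```python
import subprocess

# Before, spark.py relied on `from subprocess import check_call, check_output`.
# After the fix, calls are module-qualified:
subprocess.check_call(['true'])                      # raises CalledProcessError on nonzero exit
output = subprocess.check_output(['echo', 'hello'])  # captures stdout as bytes
```

Module-qualified calls make it obvious at the call site where the function comes from and avoid shadowing by local names.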
No, that looks correct to me. I think I have that exact fix sitting on a branch somewhere. Anywho, if it winds up causing an issue, I'll back it out when I'm retesting these soon.
@fnothaft — The only failing test is sparktest, due to a service deadlock. I don't have much experience with that, so could you take a look? If you don't have the bandwidth, let me know!
@jvivian mind if I take a look next week? I'm slammed today.
@fnothaft — no problem, I don't think there's any rush on this.
@fnothaft — We're going to make a new stable release for toil-lib after this is merged which we'll need by the end of the week, so let me know if you can't take a look before then.
@jvivian sounds good; I will give it a looksee tonight.
Sorry for the delay; looking at the service deadlock now.
Thanks mate!
No prob; sorry for the delay; this is a crazy week. BTW, I am getting test failures due to a 403 error when downloading s3://cgl-pipeline-inputs/exome/ci/chr6.normal.bam which is needed for the tests. I don't think this file is accessible to folks outside of UCSC.
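One quick way to confirm the 403 is a permissions problem (a hypothetical diagnostic, not part of toil-lib) is to split the s3:// URL into bucket and key and issue a HEAD request with boto3, which raises a ClientError with a 403 status when the caller lacks read access:

```python
def split_s3_url(url):
    """Split 's3://bucket/key' into (bucket, key)."""
    assert url.startswith('s3://')
    bucket, _, key = url[len('s3://'):].partition('/')
    return bucket, key

bucket, key = split_s3_url('s3://cgl-pipeline-inputs/exome/ci/chr6.normal.bam')
# Requires AWS credentials and network access; a 403 here means no read permission:
# import boto3
# boto3.client('s3').head_object(Bucket=bucket, Key=key)
```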
Also, looks good on my side:
(toil_lib)dhcp-46-233:toil-lib fnothaft$ git status
HEAD detached at upstream/issues/68-remove-dockercall
Untracked files:
(use "git add <file>..." to include in what will be committed)
Makefile~
src/toil_lib/programs.py~
src/toil_lib/spark.py~
src/toil_lib/test/__init__.py~
src/toil_lib/test/test_programs.py~
src/toil_lib/test/test_spark.py~
toil_lib/
nothing added to commit but untracked files present (use "git add" to track)
(toil_lib)dhcp-46-233:toil-lib fnothaft$ git log | head -n 10
commit 443608d6bbd812956aaee5bc62fe96b7ad9d84dc
Author: John Vivian <jtvivian@gmail.com>
Date: Fri Jan 20 01:14:00 2017 -0800
Fix test_urls.py
commit 8a827aef2cbdb9afb49d668fe8ddb21cd8700e81
Author: John Vivian <jtvivian@gmail.com>
Date: Fri Jan 20 01:06:19 2017 -0800
(toil_lib)dhcp-46-233:toil-lib fnothaft$ make test
PATH=$PATH:/Users/fnothaft/IdeaProjects/toil-lib/bin python2.7 -m pytest -vv --junitxml test-report.xml src
============================================================================================== test session starts ==============================================================================================
platform darwin -- Python 2.7.10, pytest-2.8.3, py-1.4.31, pluggy-0.3.1 -- /Users/fnothaft/IdeaProjects/toil-lib/toil_lib/bin/python2.7
cachedir: .cache
rootdir: /Users/fnothaft/IdeaProjects/toil-lib, inifile:
collected 14 items
src/toil_lib/test/test_files.py::test_tarball_files PASSED
src/toil_lib/test/test_files.py::test_copy_files PASSED
src/toil_lib/test/test_files.py::test_consolidate_tarballs_job PASSED
src/toil_lib/test/test_files.py::test_generate_file PASSED
src/toil_lib/test/test_jobs.py::test_map_job PASSED
src/toil_lib/test/test_lib.py::test_flatten PASSED
src/toil_lib/test/test_lib.py::test_partitions PASSED
src/toil_lib/test/test_spark.py::SparkTest::testSparkLocal PASSED
:p
Let me look at the Jenkins logs and see if I can divine what is failing.
@benedictpaten — You're the most familiar with the service code; could you take a quick look at the Jenkins log?
@benedictpaten — Any chance you can take a look at this?
@jvivian the deadlock fix has been merged into Toil so this should be unblocked
@cket the tests here fail due to a bucket permission problem. The relevant S3 path is s3://cgl-driver-projects/test/b081862b-a069-424c-8b65-9529c4ab064d/upload_file. Do we have a new bucket to use?
@fnothaft I just gave proper permissions to the toil-dev account. Jenkins, test this please.
There's some repetition in the job functions that use JAVA_OPTS or JAVA_OPTIONS, since I have to pass them in dockerParameters, so I can refactor that out and import it if people want.
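A minimal sketch of the refactor mentioned above, assuming the repeated code builds `-e` environment entries for dockerParameters (the helper name, signature, and memory flag are all hypothetical):

```python
def java_opts_parameters(memory_mb):
    """Hypothetical shared helper: build the JAVA_OPTS/JAVA_OPTIONS
    dockerParameters entries once, instead of repeating them in every
    job function."""
    opts = '-Xmx{}m'.format(memory_mb)
    return ['-e', 'JAVA_OPTS={}'.format(opts),
            '-e', 'JAVA_OPTIONS={}'.format(opts)]

# Each job function could then import and reuse it:
params = java_opts_parameters(4096)
```

Centralizing the construction means a future change to the JVM flags only touches one place.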