NYU-Molecular-Pathology / NGS580-nf

Target exome sequencing analysis for NYU NGS580 gene panel
GNU General Public License v3.0

docker build error #8

Closed hmkim closed 5 years ago

hmkim commented 5 years ago

Hi!

Thank you for your code. (NGS580-nf).

When I try to build the Docker containers, I get the error below.

$ git clone https://github.com/NYU-Molecular-Pathology/NGS580-nf.git
$ cd NGS580-nf/containers
$ make build-all-Docker

[screenshot: build error, 2019-01-03 09:34:28]

How can I solve this issue? Please take a look.

stevekm commented 5 years ago

Hi @hmkim, thanks for reporting this issue. I should preface this by noting that I have not actively worked on or used the Docker configs in a while, since my production system is running Singularity, so it's possible that things have gotten out of date without my noticing.

This is an error I have never seen before. Would you by chance know which Docker container caused it? It appears to be one of the ones that use conda with R packages, maybe one of the variant-calling or R or reporting containers?

From my experience building these containers with Docker and Singularity, I suspect there might be an issue with your hard disk configuration, especially in the amount of storage space available. On my Mac here, I have Docker configured for a 64GB Virtual disk image size, with 36GB used so far. I think this might be relevant because when system items mysteriously go missing like this inside the containers, it has often been due to the container silently running out of space. Singularity via Vagrant has been bad about this, though I would have expected Docker to throw a more explicit error if this were the case.
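A quick way to check whether disk space is the culprit (the mount point here is an assumption; on Linux, Docker's data lives under /var/lib/docker by default, while Docker Desktop keeps it inside a VM disk image):

```shell
# Show free space on the root filesystem (adjust the path for your setup).
df -h / | tail -n 1

# Docker can also report its own per-object disk usage:
# docker system df
```

If the filesystem backing Docker is nearly full, increasing the disk image size (or pruning unused images) is the first thing to try.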

[screenshot: Docker Desktop disk image settings, 2019-01-02 9:30 PM]

If you are able to, check this setting on your system, increase it if possible, and try again. If that does not fix it, then it would help to know which exact container is causing the issue, so try building them individually with the make docker-build VAR=container-dir-name command until you hit the error.
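The build-them-individually suggestion can be sketched as a shell loop over the container directories. This is a runnable sketch, not the repo's actual Makefile logic: the directory names are stand-ins created in a temp dir (only deconstructSigs-1.8.0 appears later in this thread; example-container is made up), and the real make invocation is left as a comment so the loop itself can run anywhere.

```shell
#!/bin/sh
set -e

# Simulated containers/ tree so the loop can be demonstrated;
# in the real repo you would cd into NGS580-nf/containers instead.
workdir=$(mktemp -d)
mkdir -p "$workdir/containers/deconstructSigs-1.8.0" \
         "$workdir/containers/example-container"

built=""
for dir in "$workdir"/containers/*/ ; do
    name=$(basename "$dir")
    echo "building: $name"
    # Real invocation would be:
    #   make docker-build VAR="$name" || { echo "FAILED: $name"; break; }
    built="$built $name"
done

rm -rf "$workdir"
```

Stopping at the first failure (the commented `break`) makes it obvious which container directory is the problem.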

I am running the full make build-all-Docker command right now on my desktop but it might take a while to finish.

hmkim commented 5 years ago

Error with make docker-build VAR=deconstructSigs-1.8.0

[screenshot: build error, 2019-01-03 12:54:52]


I think the other container directories are OK.

I can see the list of images built successfully from the containers directory.

[screenshot: docker images list, 2019-01-03 13:17:51]

Thank you for your support.

stevekm commented 5 years ago

It looks like those are actually two different errors. The latter appears to be a network error, since the build was unable to source the Bioconductor installer over the internet. I am not sure whether network issues could have caused the first error, though. You might simply retry a few times and see whether it fails the same way repeatedly or starts working. However, I saw that your location is listed as Korea, and I believe I hard-coded the USA CRAN mirror in the install.R script there, so that might also cause trouble for you.
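The "retry a few times" approach for a flaky network step can be sketched as a bounded retry loop. This is a demonstration, not project code: the `build` function here is a stub standing in for the real make docker-build VAR=deconstructSigs-1.8.0 call, rigged to fail twice and then succeed purely so the loop can be exercised.

```shell
#!/bin/sh
attempt=1
max_attempts=3

build() {
    # Stand-in for: make docker-build VAR=deconstructSigs-1.8.0
    # Fails on attempts 1 and 2, succeeds on attempt 3.
    [ "$attempt" -ge 3 ]
}

until build; do
    if [ "$attempt" -ge "$max_attempts" ]; then
        echo "giving up after $max_attempts attempts"
        break
    fi
    attempt=$((attempt + 1))
    echo "retry attempt $attempt"
done
echo "finished after $attempt attempt(s)"
```

Capping the attempts matters: a transient network failure usually clears within a retry or two, while a hard-coded unreachable mirror will fail every time.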

I have uploaded an alternative version of this container, based on the current Singularity container. You can try it out by switching to the new branch:

git pull
git checkout deconstructSigs-fix

Let me know if this works better. It should be functionally identical to the old one, but it installs everything from conda instead of CRAN and Bioconductor.

stevekm commented 5 years ago

Also, I was unable to replicate the errors you reported with either the new or the old version of that Docker container; they both build fine for me. So that also makes me wonder whether it has something to do with your system, my system, or a locality difference. Let me know if there are any updates.

hmkim commented 5 years ago

Thank you very much for your thoughtful comment.

In fact, I did have a problem with my system disk, which may have caused one of the issues.

I'll try again and report back.

Have a nice day!

hmkim commented 5 years ago

After the pull & checkout, the build completed successfully.

$ git pull
$ git checkout deconstructSigs-fix
$ make docker-build VAR=deconstructSigs-1.8.0

Thank you!