icbi-lab / immune_deconvolution_benchmark

Reproducible pipeline for "Comprehensive evaluation of cell-type quantification methods for immuno-oncology", Sturm et al. 2019, https://doi.org/10.1093/bioinformatics/btz363
https://icbi-lab.github.io/immune_deconvolution_benchmark
BSD 3-Clause "New" or "Revised" License

Specifications were found to be in conflict during the installation #24

Closed — HelloYiHan closed this issue 4 years ago

HelloYiHan commented 5 years ago

Hi Gregor,

Thank you for your excellent work; I am really interested in this package. However, when I installed it on Google Colab, something went wrong. I searched some websites but could not solve it, so I was wondering if you could give me some instructions. Here are the commands and the result.

Commands:

```
!wget -c https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
!chmod +x Miniconda3-latest-Linux-x86_64.sh
!bash ./Miniconda3-latest-Linux-x86_64.sh -b -f -p /usr/local
!conda install -q -y --prefix /usr/local -c bioconda -c conda-forge snakemake
import sys
sys.path.append('/usr/local/lib/python3.6/site-packages/')
!git clone --recurse-submodules -j8 git://github.com/grst/immune_deconvolution_benchmark.git
import os
os.chdir(os.path.join(os.getcwd(), 'immune_deconvolution_benchmark'))
!snakemake --use-conda
```

Results:

```
Cloning into 'immune_deconvolution_benchmark'...
remote: Enumerating objects: 4, done.
remote: Counting objects: 100% (4/4), done.
remote: Compressing objects: 100% (4/4), done.
remote: Total 2345 (delta 1), reused 2 (delta 0), pack-reused 2341
Receiving objects: 100% (2345/2345), 151.55 MiB | 48.45 MiB/s, done.
Resolving deltas: 100% (1425/1425), done.
Submodule 'immunedeconv' (https://github.com/grst/immunedeconv) registered for path 'immunedeconv'
Cloning into '/content/immune_deconvolution_benchmark/immunedeconv'...
remote: Enumerating objects: 7, done.
remote: Counting objects: 100% (7/7), done.
remote: Compressing objects: 100% (7/7), done.
remote: Total 1669 (delta 1), reused 3 (delta 0), pack-reused 1662
Receiving objects: 100% (1669/1669), 47.93 MiB | 39.42 MiB/s, done.
Resolving deltas: 100% (1157/1157), done.
Submodule path 'immunedeconv': checked out '14ba8691cd1ac57614cd73b02c80a9b0d20aa79b'
Building DAG of jobs...
Creating conda environment envs/bookdown.yml...
Downloading remote packages.
CreateCondaEnvironmentException:
Could not create conda environment from /content/immune_deconvolution_benchmark/envs/bookdown.yml:
Collecting package metadata: ...working... done
Solving environment: ...working... failed

UnsatisfiableError: The following specifications were found to be in conflict:
```

Thank you again for your time. Yi Han

grst commented 5 years ago

Appears to be related to https://github.com/grst/immunedeconv/issues/10. A nasty version conflict, but I'm on it.

grst commented 5 years ago

Seems to be caused by a recent conda compiler update. See https://github.com/conda/conda/issues/8413.

```
r-tibble=1.4.2 -> r-pillar >=1.10
```

r-pillar does not provide a build for R 3.4.1 compiled with the latest compiler.

grst commented 5 years ago

I updated the environment to reference the cf201901 label of the conda-forge channel. @HelloYiHan, could you check whether that fixes the issue for you?
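For reference, pinning a channel label in a conda environment file looks like this. This is only a minimal sketch of the mechanism; the package list is illustrative and does not reproduce the actual contents of `envs/bookdown.yml`:

```yaml
name: bookdown
channels:
  # The cf201901 label points at packages built with the pre-2019 compiler stack,
  # avoiding the r-pillar/r-tibble conflict described above.
  - conda-forge/label/cf201901
  - bioconda
  - conda-forge
dependencies:
  - r-base=3.4.1
  - r-tibble=1.4.2
```

Channels are searched in order, so the labeled channel takes precedence over the default conda-forge builds.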

HelloYiHan commented 5 years ago

Hi, sorry for my late reply. Thank you for your help; the package can now be installed. However, when I ran `!snakemake --use-conda` in Google Colab, there were some other problems. Maybe the timer could not work. The error is:

```
......
>>> Running quantiseq

Running quanTIseq deconvolution module
Gene expression normalization and re-annotation (arrays: FALSE)
Removing 17 noisy genes
Removing 15 genes with high expression in tumors
Signature genes found in data set: 137/138 (99.28%)
Mixture deconvolution (method: lsei)
Deconvolution sucessful!

>>> Running xcell
>>> Running cibersort
>>> Running cibersort_abs
>>> Running timer
## Enter batch mode
## Loading immune gene expression
## Removing the batch effect of /tmp/RtmpoTbMc4/file7d82cf29b8a
Found2batches
Adjusting for0covariate(s) or covariate level(s)
Fitting L/S model and finding priors
Finding parametric adjustments
Adjusting the Data

Quitting from lines 570-591 (_main.Rmd)
Error in { : task 5 failed - "unused arguments (absolute = absolute, abs_method = abs_method)"
Calls: ... withCallingHandlers -> withVisible -> eval -> eval -> %do% ->
In addition: Warning messages:
1: Transformation introduced infinite values in continuous x-axis
2: Removed 3407 rows containing non-finite values (stat_bin).
3: In EPIC::EPIC(bulk = gene_expression_matrix, reference = ref, mRNA_cell = mRNA_cell, :
   mRNA_cell value unknown for some cell types: CAFs, Endothelial - using the default value
   of 0.4 for these but this might bias the true cell proportions from all cell types.

Execution halted
[Mon Jun 17 08:27:27 2019]
Error in rule book:
  jobid: 0
  output: results/book/index.html, results/cache/.dir, results/figures/schelker_single_cell_tsne.pdf, results/figures/spillover_migration_chart.jpg, results/figures/spillover_migration_all.pdf, results/tables/mixing_study_correlations.tsv, results/tables/spillover_signal_noise.tsv
  conda-env: /content/immune_deconvolution_benchmark/.snakemake/conda/a247575a
  shell:
    touch results/cache/.dir
    rm -f results/book/figures && ln -s ../figures results/book/figures
    cd notebooks && Rscript -e "bookdown::render_book('index.Rmd')"
  (exited with non-zero exit code)

Removing output files of failed job book since they might be corrupted: results/cache/.dir, results/figures/schelker_single_cell_tsne.pdf
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /content/immune_deconvolution_benchmark/.snakemake/log/2019-06-17T081149.022840.snakemake.log
```

grst commented 5 years ago

While the pipeline also fails for me on Google Colab at a later stage, I cannot reproduce this particular error. Would you mind sharing the Colab notebook with me (grst768@gmail.com)?

grst commented 5 years ago

Your issue might also be related to insufficient memory per core. The pipeline requires at least 8 GB of RAM per core; if less is available, it can fail with weird error messages. See also the new FAQs in the README.
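As a rough illustration of that rule of thumb (this snippet is my own sketch, not part of the pipeline), you can derive a safe core count from the machine's total RAM:

```python
import os

# Total physical memory in GB (Linux-only sysconf keys; Colab VMs typically report ~12 GB)
mem_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1e9

# The pipeline needs roughly 8 GB of RAM per core, so cap the core count accordingly,
# but always allow at least one core
cores = max(1, int(mem_gb // 8))
print(f"{mem_gb:.1f} GB RAM -> use at most {cores} core(s)")
```

On a standard Colab VM this yields 1, which matches the sed workaround below.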

Normally, you adjust this by editing `config.R`. You can use the following one-liner to reduce the maximum number of cores to 1 on Google Colab:

```
!sed -i 's/16/1/g' notebooks/config.R
```

HelloYiHan commented 5 years ago

OK, I will try it. Thank you for your kind help.