Hi @ChuanzhengWei,
It's okay if you cannot install Intel MKL and the Python module sparse_dot_mkl. Missing these dependencies will not have any impact on your results; it will only slow down the Markov clustering step when you have more than ten thousand contigs.
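If you want to confirm whether the optional module is available in your environment, a quick check like this is enough (just an illustration, not part of HapHiC itself):

python -c "import sparse_dot_mkl" && echo "sparse_dot_mkl found" || echo "sparse_dot_mkl missing, clustering will just be slower"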
Best wishes, Xiaofei
Intel no longer hosts many packages due to an Anaconda policy change (https://community.intel.com/t5/Intel-Integrated-Performance/Problems-installing-with-conda-HTTP-403-FORBIDDEN/td-p/1611876). I will fix this by updating the yml file as soon as possible.
Thank you for your response. It would be more convenient to provide a container, similar to HiC-Pro, in Docker or Singularity format. On the cluster at my workplace, running conda is extremely slow :(! A container would make installation much easier.
If you find that conda is running extremely slow, it is likely because you have installed all dependencies in the same environment (base). By following our tutorial, a separate environment (haphic) will be created, resulting in faster performance. Many users may not have the permissions to install Docker or Singularity. Therefore, I currently do not have any plans to provide a container in the near future.
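For reference, creating the dedicated environment from the yml usually looks like this (the file name below is only illustrative; use the yml that matches your Python version):

conda env create -n haphic -f environment.yml
conda activate haphic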
The Intel conda channel has been suspended for the past two months, so I installed mkl from conda-forge instead:
conda install conda-forge::mkl
Hope this helps.
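If the other packages that used to come from the Intel channel (intel-openmp, tbb) are also missing, the same approach may work, assuming conda-forge provides those builds:

conda install conda-forge::mkl conda-forge::intel-openmp conda-forge::tbb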
Thank you for your response. I look forward to the new version that does not rely on the intel channel. ^-^
I have updated the Conda environments for every Python version. I would appreciate it if you could give it a try.
Thanks! I will try it.
I am not sure what happened, but when I loaded the yml file with mamba, it erased my .condarc because the channels are specified in the environment file.
Maybe it would be better, if you have time, to switch to mamba, and to not specify the channels in the yml but rather pass them on the command line (like conda -f environment.yml -c channel),
or to use the conda-forge::package syntax?
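For illustration, per-package channel pinning in the yml could look like this, without a global channels section (the entries below are just examples, not the full dependency list):

name: haphic
dependencies:
  - conda-forge::mkl          # channel::package pins the channel for that dependency
  - conda-forge::intel-openmp
  - conda-forge::tbb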
cat ~/.condarc
conda config --remove channels defaults
conda config --add channels bioconda
conda config --add channels conda-forge
It should work fine after that.
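After those commands, ~/.condarc should contain roughly the following (conda config --add prepends, so conda-forge ends up first):

channels:
  - conda-forge
  - bioconda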
Hi, Xiaofei,
I am reaching out to ask whether it would be possible to provide a downloadable container. I am currently working on a cluster, and I am unable to use the Intel channel to download the three required packages (intel-openmp, mkl, tbb) through the yml file installation.
I am not certain what impact missing these three packages has on running HapHiC. It seems that others, including #38, have encountered a similar issue.
Thank you for your time and support. Best regards!