henk717 / KoboldAI

KoboldAI is generative AI software optimized for fictional use, but capable of much more!
http://koboldai.com
GNU Affero General Public License v3.0

**trivial** workaround so you don't have to run micromamba twice #489

Open BlairSadewitz opened 11 months ago

BlairSadewitz commented 11 months ago

kai_micromamba-fix.patch

It was annoying me that I had to wait for micromamba to do its thing twice, and I decided that was the last time I was ever going to deal with that. Some time elapsed, and I fixed it. Now it doesn't choke on the first run. Why does this work? That is above my pay grade.

Seriously, though, sometimes it's REALLY annoying to have to run it twice--at least for me.

Updated to the latest revision. The variables should probably be quoted.
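For illustration, here is what the quoting mentioned above could look like for the CUDA create line from the patch below. This is just a sketch, not part of the attached patch:

```bash
# Sketch only: the same command as in the patch, with the expansions quoted
# so a root prefix containing spaces or glob characters does not get split.
MAMBA_ROOT_PREFIX="${PWD}/runtime"
bin/micromamba create -f environments/huggingface.yml \
    -r "${MAMBA_ROOT_PREFIX}" \
    -p "${MAMBA_ROOT_PREFIX}/envs/koboldai" -y
```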

```diff
diff --git a/install_requirements.sh b/install_requirements.sh
index f73408cd..36f6c4bf 100755
--- a/install_requirements.sh
+++ b/install_requirements.sh
@@ -1,25 +1,20 @@
 #!/bin/bash
 export PYTHONNOUSERSITE=1
 git submodule update --init --recursive
+MAMBA_ROOT_PREFIX="${PWD}/runtime"
 if [[ $1 = "cuda" || $1 = "CUDA" ]]; then
 wget --no-iri -qO- https://anaconda.org/conda-forge/micromamba/1.5.3/download/linux-64/micromamba-1.5.3-0.tar.bz2 | tar -xvj bin/micromamba
-bin/micromamba create -f environments/huggingface.yml -r runtime -n koboldai -y
-# Weird micromamba bug causes it to fail the first time, running it twice just to be safe, the second time is much faster
-bin/micromamba create -f environments/huggingface.yml -r runtime -n koboldai -y
+bin/micromamba create -f environments/huggingface.yml -r ${MAMBA_ROOT_PREFIX} -p ${MAMBA_ROOT_PREFIX}/envs/koboldai -y
 exit
 fi
 if [[ $1 = "rocm" || $1 = "ROCM" ]]; then
 wget --no-iri -qO- https://anaconda.org/conda-forge/micromamba/1.5.3/download/linux-64/micromamba-1.5.3-0.tar.bz2 | tar -xvj bin/micromamba
-bin/micromamba create -f environments/rocm.yml -r runtime -n koboldai-rocm -y
-# Weird micromamba bug causes it to fail the first time, running it twice just to be safe, the second time is much faster
-bin/micromamba create -f environments/rocm.yml -r runtime -n koboldai-rocm -y
+bin/micromamba create -f environments/rocm.yml -r ${MAMBA_ROOT_PREFIX} -p ${MAMBA_ROOT_PREFIX}/envs/koboldai-rocm -y
 exit
 fi
 if [[ $1 = "ipex" || $1 = "IPEX" ]]; then
 wget --no-iri -qO- https://anaconda.org/conda-forge/micromamba/1.5.3/download/linux-64/micromamba-1.5.3-0.tar.bz2 | tar -xvj bin/micromamba
-bin/micromamba create -f environments/ipex.yml -r runtime -n koboldai-ipex -y
-# Weird micromamba bug causes it to fail the first time, running it twice just to be safe, the second time is much faster
-bin/micromamba create -f environments/ipex.yml -r runtime -n koboldai-ipex -y
+bin/micromamba create -f environments/ipex.yml -r ${MAMBA_ROOT_PREFIX} -p ${MAMBA_ROOT_PREFIX}/envs/koboldai-ipex -y
 exit
 fi
 echo Please specify either CUDA or ROCM or IPEX
```
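If I'm reading it right, `-p ${MAMBA_ROOT_PREFIX}/envs/koboldai` still places the environment at `runtime/envs/koboldai`, the same path the old `-r runtime -n koboldai` form resolved to, so the existing launch scripts should find it unchanged. A rough usage sketch; the final launch command is my assumption, not something taken from the patch:

```bash
# Rough usage sketch, assuming the patch file sits in the repo root.
git apply kai_micromamba-fix.patch
./install_requirements.sh cuda   # micromamba create now runs only once

# The env lives at runtime/envs/koboldai, so it can be activated by prefix
# (or by name, as before) with micromamba's standard flags:
bin/micromamba run -r runtime -p runtime/envs/koboldai python aiserver.py
```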