Xeraster opened 3 weeks ago
I think the README.md incorrectly suggests Python 3.9 in one place (elsewhere it suggests 3.10...). The error in your logs points to needing 3.10 here:
beartype.roar.BeartypeDecorHintPep604Exception: Method rotary_embedding_torch.rotary_embedding_torch.RotaryEmbedding.__init__() parameter "custom_freqs" stringified PEP 604 type hint 'Tensor | None' syntactically invalid under Python < 3.10 (i.e., TypeError("unsupported operand type(s) for |: 'torch._C._TensorMeta' and 'NoneType'")). Consider either:
* Requiring Python >= 3.10. Abandon Python < 3.10 all ye who code here.
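For anyone stuck on an older interpreter in the meantime, here's a minimal self-contained illustration of what the error means (plain `int` stands in for `Tensor`; `typing.Optional` is the pre-3.10 spelling of `X | None`):

```python
# `X | None` as a runtime-evaluated hint needs Python >= 3.10 (PEP 604);
# on older interpreters, typing.Optional expresses the same thing.
from typing import Optional, Union

def init(custom_freqs: Optional[int] = None):  # int stands in for torch.Tensor
    return custom_freqs

# Optional[X] is just shorthand for Union[X, None]
print(Optional[int] == Union[int, None])  # True
print(init())  # None
```

So rewriting the hint with `Optional[Tensor]` (or upgrading to Python 3.10+, as the error suggests) are the two ways out.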
I ran into this issue and it looks like scipy changed their install method with version 1.14. I was able to work around it by editing setup.py to pin the last working version, "scipy==1.13.1", under install_requires.
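For reference, the edit looked roughly like this (a sketch; the surrounding lines in your setup.py will differ):

```python
# Hypothetical excerpt of the setup.py edit described above: pin scipy
# below 1.14, whose packaging changes broke this install path.
install_requires = [
    "scipy==1.13.1",
    # ...the remaining dependencies stay as they were...
]

# sanity-check that the pin parses the way pip will read it
name, _, version = install_requires[0].partition("==")
print(name, version)  # scipy 1.13.1
```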
The above did work for me, but I got stuck on another error:
Installed /home/the100rabh/.conda/envs/tortoise/lib/python3.9/site-packages/safetensors-0.4.3-py3.9-linux-x86_64.egg
error: tokenizers 0.19.1 is installed but tokenizers!=0.11.3,<0.14,>=0.11.1 is required by {'transformers'}
Seems like several dependencies are conflicting for me. Any thoughts on what I can do to get this running? Sorry, I'm a n00b with Python and pip configuration management.
For that one I changed it to 'tokenizers==0.13.4.rc3', since that was the last version <0.14
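If you want to see why pip rejects 0.19.1 while a 0.13.x pin works, you can check candidate versions against the requirement from the error by hand (a stdlib-only sketch; pip's real resolver is more thorough):

```python
# Check tokenizers versions against the transformers requirement from the
# error above: >=0.11.1, !=0.11.3, <0.14.
def parse(v):
    return tuple(int(part) for part in v.split("."))

def satisfies(version):
    v = parse(version)
    return parse("0.11.1") <= v < parse("0.14") and v != parse("0.11.3")

print(satisfies("0.19.1"))  # False -> the conflict pip reported
print(satisfies("0.13.3"))  # True  -> any pin in this range resolves it
```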
I keep getting an installation error on the last step:
How do I fix this? I followed the conda installation instructions step-by-step on Debian and can't get past this error. Even if I try to manually copy that file to the tmp directory, the random folder name changes each time and the installer won't touch it. How do we install this on Debian 12?
UPDATE: I found a fix in another thread that fixed that one problem, but then I ran into a series of other problems that I was ultimately unable to solve:
I changed "tokenizers" to "tokenizers==0.11.1". The installation finally finishes, but the program doesn't actually work. For example, if I try "python tortoise/socket_server.py", I get the following output:
so then I tried adding 'spacy' to my setup.py file. Now I get this:
Does anybody know how to fix this? How do I manually uninstall tokenizers 0.15? Where are those files stored? Can I manually install these dependencies in conda? "pip install [package]" doesn't work.
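On the "where are those files stored" part, you can ask the interpreter directly (stdlib-only sketch; I use `json` as a stand-in so it runs anywhere — swap in `tokenizers` inside your env):

```python
# Answering "where are those files stored": ask the interpreter itself.
import importlib.util
import sysconfig

print(sysconfig.get_paths()["purelib"])  # the site-packages dir for this env

spec = importlib.util.find_spec("json")  # swap in "tokenizers" in your env
print(spec.origin)                       # filesystem path of the package
```

Deleting a package's folder from that directory is the crude manual uninstall; running `pip uninstall tokenizers` from inside the activated conda env is the cleaner route, since pip there targets the env and not the system.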
ok so after trying a lot of different stuff, I think I finally tricked it into installing tokenizers 0.13.1. Now, running "python tortoise/socket_server.py" gets me this error:
Any help would be greatly appreciated.
EDIT AGAIN: OK, disregard everything up to the spacy issue. Turns out I had things really messed up; actually, the following issue is the furthest I got:
Can someone please post up-to-date installation instructions? The conda one 100% does not work, at least on Debian. Neither does the single-line pip command, because Debian doesn't let you use pip that way. And if I try to set up a Python venv, I just get a torrent of cryptic errors and it's hard to tell what it even wants.
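Not a full fix, but the venv creation step at least can be done from Python's stdlib, which avoids the error Debian 12 throws when you run pip system-wide (a sketch; the temp path is just for the demo — use a real directory):

```python
# Create the venv programmatically with the stdlib venv module; this is what
# `python3 -m venv` does under the hood, and installing into a venv sidesteps
# Debian 12's block on system-wide pip (PEP 668).
import pathlib
import tempfile
import venv

env_dir = pathlib.Path(tempfile.mkdtemp()) / "tortoise-env"
venv.create(env_dir, with_pip=False)  # with_pip=True also bootstraps pip
print((env_dir / "bin" / "activate").exists())  # True on Linux
```

After `source tortoise-env/bin/activate`, pip installs land inside the env, so Debian's restriction no longer applies.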