KoboldAI / KoboldAI-Client

https://koboldai.com
GNU Affero General Public License v3.0

[Linux, Recommended install] play-rocm.sh immediately fatally crashes with huggingface_hub import error #445

Open ProfessorDey opened 1 month ago

ProfessorDey commented 1 month ago

So, following the recommended install method of cloning the client repository and running ./play-rocm.sh, the install goes smoothly, but as soon as it starts to load Kobold itself it immediately crashes with the error below, for which I can't find any reason. huggingface_hub is installed, but looking online I can only find 'split_torch_state_dict_into_shards' on their GitHub under src/serialisation/_torch.py, which might mean a submodule is missing from the ROCm requirements. This is unfortunate because it means I can't use Kobold at all, as it won't even start.

```
Traceback (most recent call last):
  File "aiserver.py", line 58, in <module>
    from utils import debounce
  File "/home/dey/ai/koboldai-client/utils.py", line 12, in <module>
    from transformers import PreTrainedModel
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/home/dey/ai/koboldai-client/runtime/envs/koboldai-rocm/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1066, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/dey/ai/koboldai-client/runtime/envs/koboldai-rocm/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1078, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/home/dey/ai/koboldai-client/runtime/envs/koboldai-rocm/lib/python3.8/site-packages/huggingface_hub/__init__.py)
```
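For what it's worth, `split_torch_state_dict_into_shards` only exists in newer releases of huggingface_hub, so this looks like a version mismatch between the pinned transformers and the installed huggingface_hub rather than a missing submodule. Here is a minimal sketch of that check; the minimum version (0.23.0) and the helper names are my assumptions, not anything from the KoboldAI requirements files:

```python
def version_tuple(v: str) -> tuple:
    """Parse a version string like "0.23.0" into (0, 23, 0).

    Ignores any non-numeric suffix on a component (e.g. "0rc1" -> 0),
    which is good enough for a rough comparison.
    """
    parts = []
    for piece in v.split("."):
        digits = ""
        for ch in piece:
            if not ch.isdigit():
                break
            digits += ch
        parts.append(int(digits) if digits else 0)
    return tuple(parts)


# Assumed minimum: the huggingface_hub release I believe first shipped
# split_torch_state_dict_into_shards (unverified guess).
REQUIRED_HUB = "0.23.0"


def hub_is_too_old(installed: str, required: str = REQUIRED_HUB) -> bool:
    """True if the installed huggingface_hub predates the required release."""
    return version_tuple(installed) < version_tuple(required)
```

If a check like this fails against `huggingface_hub.__version__` inside the runtime env, the usual fix would be bumping the huggingface_hub pin in the ROCm requirements (or upgrading it in place), assuming the rest of the pinned stack tolerates the newer release.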

henk717 commented 1 month ago

This branch is not supported by the online installers; you can check the one in my branch to see if that works. You can also check https://koboldai.org/cpp for our more modern sister project, which has wider GPU support.

ProfessorDey commented 1 month ago

Fair enough, I was just following the instructions in the Readme:

Installing KoboldAI on Linux using the KoboldAI Runtime (Easiest)
1. Clone the URL of this Github repository (for example git clone https://github.com/koboldai/koboldai-client)
2. AMD user? Make sure ROCm is installed if you want GPU support. Is yours not compatible with ROCm? Follow the usual instructions.
3. Run play-rocm.sh if you use an AMD GPU supported by ROCm.

I didn't realise it had been forked and continued as a separate project; perhaps it would be worth making a note of that in the readme itself? Thank you for the redirection though, I'll give that a go.

henk717 commented 1 month ago

The idea is that mine gets upstreamed again, but the update has been such a big undertaking that it's been taking a while, especially since the AI space constantly changes and forces us to keep reworking our backend to keep things functional. The newer developers all want to work on Koboldcpp instead, since it's currently the better of the two.