Closed: mfdj2002 closed this issue 3 months ago
A little difference:

```
python run_plm.py --test --plm-type llama --plm-size base --rank 128 --device cuda:0 --model-dir data/ft_plms/try_llama2_7b
```

```
Traceback (most recent call last):
  File "run_plm.py", line 23, in <module>
```
I installed all dependencies following the readme.
```
$ pip show huggingface_hub
Name: huggingface-hub
Version: 0.17.3
```
I tried a number of suggested solutions, but everything I tried failed 😭
Hi! Thanks for your messages.
We did not encounter this error. Did you run our code on Linux?
FYI, here is the huggingface_hub information on our testbed:

```
$ pip show huggingface_hub
Name: huggingface-hub
Version: 0.17.3
Summary: Client library to download and publish models, datasets and other repos on the huggingface.co hub
Home-page: https://github.com/huggingface/huggingface_hub
Author: Hugging Face, Inc.
Author-email: julien@huggingface.co
License: Apache
Location: xxx
Requires: filelock, fsspec, packaging, pyyaml, requests, tqdm, typing-extensions
Required-by: accelerate, datasets, tokenizers, transformers
```
Hi! Thanks for your response. We eventually solved the problem using the following environment:
```
$ cat netllm.yaml
name: abr_netllm
channels:
```
```
python run_plm.py --adapt --grad-accum-steps 32 --plm-type llama --plm-size base --rank 128 --device cuda:0 --lr 0.0001 --warmup-steps 2000 --num-epochs 80 --eval-per-epoch 2
```

```
Traceback (most recent call last):
  File "run_plm.py", line 23, in <module>
    from plm_special.models.low_rank import peft_model
  File "/data/workspace/fujingkai/NetLLM/adaptive_bitrate_streaming/plm_special/models/low_rank.py", line 3, in <module>
    from peft import LoraConfig, get_peft_model, TaskType, get_peft_model_state_dict
  File "/data/workspace/fujingkai/.conda/envs/abr_netllm/lib/python3.8/site-packages/peft/__init__.py", line 22, in <module>
    from .auto import (
  File "/data/workspace/fujingkai/.conda/envs/abr_netllm/lib/python3.8/site-packages/peft/auto.py", line 21, in <module>
    from transformers import (
  File "/data/workspace/fujingkai/.conda/envs/abr_netllm/lib/python3.8/site-packages/transformers/__init__.py", line 26, in <module>
    from . import dependency_versions_check
  File "/data/workspace/fujingkai/.conda/envs/abr_netllm/lib/python3.8/site-packages/transformers/dependency_versions_check.py", line 16, in <module>
    from .utils.versions import require_version, require_version_core
  File "/data/workspace/fujingkai/.conda/envs/abr_netllm/lib/python3.8/site-packages/transformers/utils/__init__.py", line 18, in <module>
    from huggingface_hub import get_full_repo_name  # for backward compatibility
ImportError: cannot import name 'get_full_repo_name' from 'huggingface_hub' (/data/workspace/fujingkai/.conda/envs/abr_netllm/lib/python3.8/site-packages/huggingface_hub/__init__.py)
```
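Not from the thread, but one quick way to rule out a shadowed or stale install (a common cause of an ImportError like this even when `pip show` reports the expected version) is to ask Python which file it would actually load for the package. A minimal stdlib-only sketch; `locate` is my own helper, and `huggingface_hub` is only mentioned in the comment:

```python
import importlib.util

def locate(pkg: str):
    """Return the file Python would actually load for `pkg`, or None if absent."""
    spec = importlib.util.find_spec(pkg)
    return getattr(spec, "origin", None)

# On the failing machine you would check, e.g.:
#   print(locate("huggingface_hub"))
# If the printed path is not inside the active conda env
# (/data/workspace/fujingkai/.conda/envs/abr_netllm/...), a stale or
# shadowed copy is being imported, regardless of what pip show says.
print(locate("json"))  # stdlib example: the loaded package's __init__.py
```

If the path differs from the `Location` reported by `pip show`, removing or reinstalling the stray copy usually clears this class of error.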
```
$ pip show huggingface_hub
Name: huggingface-hub
Version: 0.17.3
...
```
I installed all dependencies following the readme.
I tried the solutions in https://github.com/comfyanonymous/ComfyUI/issues/2055, but none of them worked for me. I also tried other versions of huggingface_hub, but tokenizers 0.14.1 requires huggingface_hub<0.18,>=0.16.4, and transformers 4.34.1 (the version pinned in your requirements) requires tokenizers<0.15,>=0.14.
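For what it's worth, the pins quoted above do not actually conflict with the installed versions: huggingface_hub 0.17.3 satisfies tokenizers' `huggingface_hub<0.18,>=0.16.4` constraint, and tokenizers 0.14.1 satisfies transformers' `tokenizers<0.15,>=0.14`, which points back at a broken install rather than the version numbers themselves. A tiny stdlib check of those ranges (`in_range` is my own naive dotted-number comparison, not a full packaging-spec implementation):

```python
def parse(v: str):
    """Split a dotted version like '0.17.3' into a comparable tuple of ints."""
    return tuple(int(x) for x in v.split("."))

def in_range(v: str, lo: str, hi: str) -> bool:
    """Check lo <= v < hi, comparing versions numerically component by component."""
    return parse(lo) <= parse(v) < parse(hi)

# Constraints quoted in the thread:
print(in_range("0.17.3", "0.16.4", "0.18"))  # huggingface_hub pin -> True
print(in_range("0.14.1", "0.14", "0.15"))    # tokenizers pin -> True
```

Since both checks pass, downgrading or upgrading huggingface_hub within these bounds is unlikely to be the fix; recreating the conda environment (as eventually done above) is the more promising route.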