ZGC-LLM-Safety / TrafficLLM

The repository of TrafficLLM, a universal LLM adaptation framework that learns robust traffic representations for any open-sourced LLM in real-world scenarios and enhances generalization across diverse traffic analysis tasks.

ModuleNotFoundError: No module named 'transformers_modules.models/chatglm2/chatglm2-6b/tokenization_chatglm' #6

Open ImmEve opened 1 week ago

ImmEve commented 1 week ago

I downloaded chatglm2 into the directory shown in the screenshot below, but I get an error saying the specified model cannot be found.

(screenshot of the models/chatglm2/chatglm2-6b directory)

CuiTianyu961030 commented 1 week ago

Could you please tell us which command caused the above error? This will help us locate the root cause of the problem.

If you are using ChatGLM2 as the base model, please follow the usage instructions in the official ChatGLM2 repository.
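For reference, here is a minimal sketch of the standard ChatGLM2 loading pattern from the official README, assuming the weights sit in a local models/chatglm2/chatglm2-6b directory (substitute your own path). ChatGLM2 ships its custom tokenizer and model code inside the checkpoint, so trust_remote_code=True is required:

from transformers import AutoTokenizer, AutoModel

# Local checkpoint directory (assumed layout; adjust to your setup).
model_path = "models/chatglm2/chatglm2-6b"

# trust_remote_code=True lets transformers load the custom ChatGLM2 code
# (tokenization_chatglm.py, modeling_chatglm.py) shipped with the checkpoint.
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda()
model = model.eval()

# Quick smoke test that the checkpoint loads and responds.
response, history = model.chat(tokenizer, "Hello", history=[])
print(response)

If this standalone load fails, the problem is with the ChatGLM2 environment or checkpoint itself rather than with TrafficLLM.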

ImmEve commented 1 week ago

Sorry, let me walk through my steps in detail.

First I git cloned your project locally, then I downloaded the chatglm2-6b weights into the models/chatglm2/chatglm2-6b directory, as shown in the screenshot above.

I then tried both python interface.py --config=config.json --prompt="Your Instruction Text + : + Traffic Data" and streamlit run trafficllm_server.py; both modes report the same error, namely the one in the title.

CuiTianyu961030 commented 1 week ago

Please check whether you have correctly installed the ChatGLM2 inference environment. You can follow Step 1 to complete the environment preparation. Once that is done, please make sure the model path in config.json is correct.
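As a quick diagnostic (a sketch, not part of TrafficLLM itself), you can try loading the checkpoint directly with transformers from the exact path you put in config.json. The module name in the error still contains slashes, which usually means the local path was passed through to the dynamic-module loader unchanged; a trailing slash, a wrong relative path, or an older transformers version are common causes, and missing custom-code files in the checkpoint directory produce the same symptom. The path below is the assumed one from the screenshot; replace it with your actual config value:

import os
from transformers import AutoTokenizer

# Assumed local path -- replace with the exact value from your config.json.
model_path = "models/chatglm2/chatglm2-6b"

# Sanity-check that the custom tokenizer code shipped with ChatGLM2 is present.
print(os.path.exists(os.path.join(model_path, "tokenization_chatglm.py")))

# If this call succeeds, TrafficLLM should be able to load the model
# from the same path; if it fails, fix the path or checkpoint first.
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
print(type(tokenizer))

If the standalone load works but TrafficLLM still fails, compare the path string used above character by character with the one in config.json.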

I hope this reply can help you.