wenda-LLM / wenda

Wenda: an LLM invocation platform. It targets efficient content generation for specific environments while taking into account the limited computing resources of individuals and small/medium-sized enterprises, as well as knowledge security and privacy concerns.
GNU Affero General Public License v3.0

llama2 cannot be loaded #452

Closed Hughhuh closed 1 year ago

Hughhuh commented 1 year ago

Exception in thread Thread-1 (load_model):
Traceback (most recent call last):
  File "D:\BaiduNetdiskDownload\wenda\wenda\wenda\WPy64-31110\python-3.11.1.amd64\Lib\threading.py", line 1038, in _bootstrap_inner
    self.run()
  File "D:\BaiduNetdiskDownload\wenda\wenda\wenda\WPy64-31110\python-3.11.1.amd64\Lib\threading.py", line 975, in run
    self._target(*self._args, **self._kwargs)
  File "D:\BaiduNetdiskDownload\wenda\wenda\wenda\wenda\wenda.py", line 53, in load_model
    LLM.load_model()
  File "D:\BaiduNetdiskDownload\wenda\wenda\wenda\wenda\llms\llm_llama.py", line 295, in load_model
    tokenizer = AutoTokenizer.from_pretrained(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\BaiduNetdiskDownload\wenda\wenda\wenda\WPy64-31110\python-3.11.1.amd64\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 655, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported.

No sentence-transformers model found with name model/m3e-base. Creating a new one with MEAN pooling.
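This ValueError usually indicates that the installed transformers release predates LLaMA support (LlamaTokenizer was added around transformers 4.28), or that an optional dependency such as sentencepiece is missing. A minimal diagnostic sketch, assuming it is run with wenda's bundled WPy64-31110 interpreter (the same environment that produced the traceback):

```python
# Rough diagnostic for "Tokenizer class LlamaTokenizer does not exist".
# Assumption: run inside wenda's bundled Python environment (WPy64-31110).
import importlib.util

import transformers

print("transformers version:", transformers.__version__)

# LlamaTokenizer additionally depends on the sentencepiece package.
print("sentencepiece installed:",
      importlib.util.find_spec("sentencepiece") is not None)

try:
    # Raises ImportError on transformers releases that predate LLaMA support.
    from transformers import LlamaTokenizer
    print("LlamaTokenizer importable:", LlamaTokenizer)
except ImportError as err:
    print("LlamaTokenizer not importable:", err)
```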

linuxdevopscn commented 1 year ago

Uninstall Transformers and install the latest release: pip uninstall Transformers, then pip install Transformers.
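After reinstalling, a quick check that the bundled interpreter actually picked up a new enough release (4.28.0 is assumed here as the minimum version that ships LlamaTokenizer):

```python
# Verify the transformers upgrade from inside wenda's Python environment.
# Assumption: 4.28.0 is taken as the first release with LlamaTokenizer support.
from packaging import version

import transformers

required = version.parse("4.28.0")
installed = version.parse(transformers.__version__)

if installed < required:
    print(f"transformers {installed} is still too old; upgrade to >= {required}")
else:
    print(f"transformers {installed} should provide LlamaTokenizer")
```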

Hughhuh commented 1 year ago

I am using the one-click ("lazy") package. Can I just install the latest version directly?

l15y commented 1 year ago

Updating fixes this.