wxywb / history_rag


Runtime error: ImportError: ('Cannot import sentence-transformers or torch package,', 'please `pip install torch sentence-transformers`') #52

Closed von-eureka closed 7 months ago

von-eureka commented 7 months ago

I have already installed sentence-transformers==2.5.0 and torch==2.2.1, but it still reports that the import fails:

```
('Cannot import sentence-transformers or torch package,', 'please `pip install torch sentence-transformers`')
ModuleNotFoundError: No module named 'sentence_transformers'

During handling of the above exception, another exception occurred:

  File "/home/ureka/history_rag-master/cli.py", line 120, in <module>
    cli.run()
  File "/home/ureka/history_rag-master/cli.py", line 30, in run
    self._executor = MilvusExecutor(conf)
  File "/home/ureka/history_rag-master/executor.py", line 145, in __init__
    self.rerank_postprocessor = SentenceTransformerRerank(
ImportError: ('Cannot import sentence-transformers or torch package,', 'please `pip install torch sentence-transformers`')
```

wxywb commented 7 months ago

`pip list | grep sentence`

von-eureka commented 7 months ago

sentence-transformers 2.5.0

wxywb commented 7 months ago

Is this the complete stack trace? I can't tell which line is causing it. Could you also run `git log`?

von-eureka commented 7 months ago

Yes, but I haven't configured git on this machine... (screenshot omitted)

wxywb commented 7 months ago

Add the following at the top of cli.py (as its first two lines):

```python
import torch
from sentence_transformers import SentenceTransformer
```

and then run it again.

von-eureka commented 7 months ago

It errors out immediately, saying the sentence_transformers module cannot be found. (screenshot omitted)

wxywb commented 7 months ago

This means you probably have multiple Python environments: the one `pip list` reports from and the one your IDE launches are not the same. Launch directly from the command line, using the environment in which `pip list` shows the package.
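A quick way to confirm which interpreter you are actually running, and whether it can see the package, is the standard-library check below (a generic diagnostic sketch, not code from this repo):

```python
import importlib.util
import sys

def module_available(name: str) -> bool:
    """Return True if `name` is importable from the current interpreter."""
    return importlib.util.find_spec(name) is not None

if __name__ == "__main__":
    # The interpreter path tells you which environment you are really in.
    print("interpreter:", sys.executable)
    print("sentence_transformers importable:",
          module_available("sentence_transformers"))
```

If the printed interpreter path differs from the environment `pip` installed into, that explains the ImportError.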

von-eureka commented 7 months ago

I ran it again after exiting the virtual environment and that problem was gone. But it then failed at the index-building step with `MilvusException: <MilvusException: (code=1, message=this version of sdk is incompatible with server, please downgrade your sdk or upgrade your server)>` (screenshot omitted)

wxywb commented 7 months ago

Check the output of `pip list | grep milvus` and `docker ps`, respectively.

von-eureka commented 7 months ago

```
$ pip list | grep milvus
pymilvus    2.3.6

$ docker ps
permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get "http://%2Fvar%2Frun%2Fdocker.sock/v1.24/containers/json": dial unix /var/run/docker.sock: connect: permission denied
```

wxywb commented 7 months ago

Is your Milvus not running? It looks like you don't have permission to access docker. If that can't be fixed, you could switch to the Zilliz Pipeline approach.

von-eureka commented 7 months ago

It is running. I checked again and added my user to the docker group, so `docker ps` works now, but it still errors out. (screenshots of the `docker ps` output and the error omitted)

wxywb commented 7 months ago

Why is this 2.0.2? What my repo installs should be 2.3.0.

wxywb commented 7 months ago

Try `pip install pymilvus==2.0.2` first.

von-eureka commented 7 months ago

I tried installing pymilvus 2.3.0, and the problem remained; after rolling back to 2.0.2 it still errors, just with a different message. (screenshots omitted)

von-eureka commented 7 months ago

But it seems the update didn't install completely. (screenshot omitted) I tried upgrading the packages it mentioned, but it told me there is no compatible version.

wxywb commented 7 months ago

It looks like 2.0.2 won't work, so install 2.3.0 instead. Following the docs normally gives you 2.3.0; you need to make sure that what you see after `docker ps` is a 2.3.0 server.
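The "sdk is incompatible with server" message boils down to the client and server being on different release series: Milvus generally expects the pymilvus major.minor version to match the server's. A minimal check (my own sketch; the helper name is hypothetical and this only approximates the official compatibility matrix):

```python
def same_release_series(client: str, server: str) -> bool:
    """True when client and server share the same major.minor series, e.g. 2.3.x."""
    return client.split(".")[:2] == server.split(".")[:2]

# pymilvus 2.3.6 against a Milvus 2.0.x server fails this check, which is
# consistent with the "sdk is incompatible with server" error above.
```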

von-eureka commented 7 months ago

OK, I set up a fresh environment, but it still doesn't work... I'm now trying under Windows, and that errors too:

```
Traceback (most recent call last):
  File "G:\history_rag-master\cli.py", line 1, in <module>
    from executor import MilvusExecutor
  File "G:\history_rag-master\executor.py", line 12, in <module>
    from llama_index import ServiceContext, StorageContext
  File "C:\Users\Eureka\rag\Lib\site-packages\llama_index\__init__.py", line 13, in <module>
    from llama_index.callbacks.global_handlers import set_global_handler
  File "C:\Users\Eureka\rag\Lib\site-packages\llama_index\callbacks\__init__.py", line 7, in <module>
    from .token_counting import TokenCountingHandler
  File "C:\Users\Eureka\rag\Lib\site-packages\llama_index\callbacks\token_counting.py", line 6, in <module>
    from llama_index.utilities.token_counting import TokenCounter
  File "C:\Users\Eureka\rag\Lib\site-packages\llama_index\utilities\token_counting.py", line 6, in <module>
    from llama_index.llms import ChatMessage, MessageRole
  File "C:\Users\Eureka\rag\Lib\site-packages\llama_index\llms\__init__.py", line 14, in <module>
    from llama_index.llms.anyscale import Anyscale
  File "C:\Users\Eureka\rag\Lib\site-packages\llama_index\llms\anyscale.py", line 10, in <module>
    from llama_index.llms.openai import OpenAI
  File "C:\Users\Eureka\rag\Lib\site-packages\llama_index\llms\openai\__init__.py", line 1, in <module>
    from llama_index.llms.openai.base import AsyncOpenAI, OpenAI, SyncOpenAI, Tokenizer
  File "C:\Users\Eureka\rag\Lib\site-packages\llama_index\llms\openai\base.py", line 16, in <module>
    from llama_index.core.base.llms.types import (
ModuleNotFoundError: No module named 'llama_index.core.base'
```

I followed the approach from the previous issue and kept only llama_index, but it still doesn't work.

wxywb commented 7 months ago

From my experience solving this problem several times, if you are currently on llama-index 0.9.39 it is solvable. The usual causes are: 1. the Python that pip installs into and the Python you execute are not the same; 2. a llama-index 0.10 package is also installed. I suggest you go back to your first environment and install the Milvus 2.3.3 under the db directory; that should solve it.
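Cause 2 above can be checked by listing every installed distribution whose name starts with "llama": the legacy monolithic 0.9 package and the 0.10-era split `llama-index-*` packages must not coexist. A generic standard-library sketch (not code from this repo):

```python
from importlib import metadata

def llama_distributions() -> list[str]:
    """List installed distributions named llama*, pinned with their versions."""
    found = []
    for dist in metadata.distributions():
        name = dist.metadata["Name"]
        if name and name.lower().startswith("llama"):
            found.append(f"{name}=={dist.version}")
    return sorted(found)

if __name__ == "__main__":
    # Seeing both llama-index==0.9.x and any llama-index-core / llama-index-*
    # entries here signals the 0.9/0.10 conflict described above.
    print("\n".join(llama_distributions()) or "no llama-* packages installed")
```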

von-eureka commented 7 months ago

I'm back in the first environment now; I created a new virtual environment and reinstalled everything, but a new error appeared...

```
Traceback (most recent call last):
  File "/home/ureka/history_rag-master/cli.py", line 123, in <module>
    cli.run()
  File "/home/ureka/history_rag-master/cli.py", line 33, in run
    self._executor = MilvusExecutor(conf)
  File "/home/ureka/history_rag-master/executor.py", line 136, in __init__
    elif config.llm.proxy_model:
AttributeError: 'EasyDict' object has no attribute 'proxy_model'
```

I checked: llama-index is 0.9.39, milvus is 2.3.3, and easydict is 1.12.

wxywb commented 7 months ago

Sorry, this is a bug introduced by a recent PR. I've fixed it; please pull the latest code: https://github.com/wxywb/history_rag/blob/ea66d21129ad22ae37375f11fd86f96172354bc3/executor.py#L136
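The crash happens because a config loaded from an older file simply lacks the `proxy_model` key, so attribute access raises. One defensive pattern is reading optional keys with `getattr` and a default (a sketch of the general fix; the class and function names are hypothetical and the actual change in the linked commit may differ):

```python
class LLMConfig:
    """Stand-in for the EasyDict llm section loaded from the config file."""
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

def pick_model(llm_cfg) -> str:
    # getattr with a default avoids AttributeError when the key is absent.
    proxy_model = getattr(llm_cfg, "proxy_model", None)
    if proxy_model:
        return proxy_model
    return getattr(llm_cfg, "name", "default-model")
```

With this guard, an old config that never defined `proxy_model` no longer crashes at startup.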

von-eureka commented 7 months ago

That solved it. One last question: why does it run fine when I debug from VS Code, but error out when launched from the WSL bash?

```
Traceback (most recent call last):
  File "/home/ureka/history_rag-master/cli.py", line 123, in <module>
    cli.run()
  File "/home/ureka/history_rag-master/cli.py", line 33, in run
    self._executor = MilvusExecutor(conf)
  File "/home/ureka/history_rag-master/executor.py", line 136, in __init__
    elif config.llm.proxy_model:
AttributeError: 'EasyDict' object has no attribute 'proxy_model'
```

wxywb commented 7 months ago

The code in your WSL bash isn't the latest; this is the current line 136: https://github.com/wxywb/history_rag/blob/ea66d21129ad22ae37375f11fd86f96172354bc3/executor.py#L136