OSU-NLP-Group / HippoRAG

HippoRAG is a novel RAG framework inspired by human long-term memory that enables LLMs to continuously integrate knowledge across external documents. RAG + Knowledge Graphs + Personalized PageRank.
https://arxiv.org/abs/2405.14831
MIT License

Passage NER exception #42

Open bupterlxp opened 1 month ago

bupterlxp commented 1 month ago

When I run DATA=sample LLM=qwen2:7b SYNONYM_THRESH=0.8 GPUS=0 LLM_API=ollama bash src/setup_hipporag_colbert.sh $DATA $LLM $GPUS $SYNONYM_THRESH $LLM_API, an error occurs (see the attached screenshot). Why is this happening, and how should I solve it?

yhshu commented 1 month ago

Hello. Could you show how you add support for Qwen models? I think you may need to use langchain to add support for those models in the current HippoRAG framework: https://github.com/OSU-NLP-Group/HippoRAG/blob/main/src/langchain_util.py Thanks!
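For reference, a minimal sketch of what an Ollama branch in a helper like langchain_util.py could look like, assuming langchain_community's ChatOllama is installed and an Ollama server is running locally; the function name and signature here are illustrative, not necessarily what the file actually contains:

```python
# Hypothetical sketch -- not the actual langchain_util.py implementation.
# Assumes langchain_community is installed and `ollama run qwen2:7b` has
# already pulled the model into the local Ollama server.
from langchain_community.chat_models import ChatOllama


def init_chat_model(llm: str, model_name: str, temperature: float = 0.0):
    """Return a langchain chat model for the requested backend (sketch)."""
    if llm == "ollama":
        # ChatOllama talks to the local Ollama server (default http://localhost:11434),
        # so any pulled model can be referenced by name, e.g. "qwen2:7b".
        return ChatOllama(model=model_name, temperature=temperature)
    raise NotImplementedError(f"Unsupported LLM backend: {llm}")


# Example usage:
# chat = init_chat_model("ollama", "qwen2:7b")
# print(chat.invoke("Extract named entities from: Radio City is in India.").content)
```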

bupterlxp commented 1 month ago

> Hello. Could you show how you add support for Qwen models? I think you may need to use langchain to add support for those models in the current HippoRAG framework: https://github.com/OSU-NLP-Group/HippoRAG/blob/main/src/langchain_util.py Thanks!

I pulled the qwen2:7b model by running "ollama run qwen2:7b" through Ollama. It looks like I only need to select ollama and enter the model name. How do I use langchain to add support for it?

yhshu commented 1 month ago

Please reply in English to ensure all our maintainers and users understand your issue. We have not yet tested Ollama's models one by one. To speed up finding this error, I suggest you locate the exception where Passage NER reported an error and print the specific exception information. The current error message makes it difficult for us to help you directly.
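One quick way to surface the underlying exception is to temporarily wrap the failing NER call and print the full traceback; run_ner_on_passage below is only a placeholder for whatever function actually raises the error in your run:

```python
# Debugging sketch only -- wrap the failing NER call to expose the real exception.
import traceback


def safe_ner(passage: str):
    try:
        return run_ner_on_passage(passage)  # placeholder for the actual NER call
    except Exception:
        # Print the passage and the full stack trace instead of a generic message,
        # so the root cause (e.g. a None response from the LLM) becomes visible.
        print(f"NER failed on passage: {passage!r}")
        traceback.print_exc()
        raise
```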

bupterlxp commented 1 month ago

Thanks for your answer! That error no longer occurs, but a new one appears (see the attached screenshots). I'm not sure whether this information is sufficient.

yhshu commented 1 month ago

Are the contents of output/sample_queries.named_entity_output.tsv correct? I doubt this step finished correctly. From the second screenshot you can see that something is already wrong before ColBERT is called, because the value is None.
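One way to sanity-check that file, assuming it is an ordinary tab-separated file (the column layout is an assumption here, so inspect the printed head first):

```python
# Quick inspection of the NER output file; column names are not assumed.
import pandas as pd

df = pd.read_csv("output/sample_queries.named_entity_output.tsv", sep="\t")
print(df.head())

# Rows where the NER step produced nothing would explain a downstream None
# being passed into the ColBERT retrieval step.
print("rows with missing values:", df.isna().any(axis=1).sum())
```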

yhshu commented 1 month ago

Hi, I also submitted a PR to support llama.cpp, which also supports Qwen2 models. You can try it out after it's merged. Ollama seems to require sudo privileges to install, whereas llama.cpp can be installed without sudo and supports many open-source models.
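If you want to experiment with llama.cpp before the PR is merged, one possible setup is to run its OpenAI-compatible server and point langchain's ChatOpenAI at it; the port, model file, and base URL below are assumptions, not values taken from the PR:

```python
# Sketch: connect langchain to a locally running llama.cpp server.
# Assumes the server was started with something like:
#   ./llama-server -m qwen2-7b-instruct-q4_k_m.gguf --port 8080
# which exposes an OpenAI-compatible API at http://localhost:8080/v1.
from langchain_openai import ChatOpenAI

chat = ChatOpenAI(
    model="qwen2-7b-instruct",            # the name is arbitrary for a local server
    base_url="http://localhost:8080/v1",  # assumed port
    api_key="not-needed",                 # the local server does not validate the key
    temperature=0.0,
)
print(chat.invoke("Extract named entities from: Radio City is in India.").content)
```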

hui-max commented 6 days ago

I'm running into the same issue. Could you please share how you solved the error "expected string or bytes-like object"?