InternLM / MindSearch

🔍 An LLM-based multi-agent framework for web search engines (like Perplexity.ai Pro and SearchGPT)
https://mindsearch.netlify.app/
Apache License 2.0

Local LLM support #140

Open · sarkaramitabh300 opened this issue 1 month ago

sarkaramitabh300 commented 1 month ago

Hi, can you please provide a guide or support for using local LLM models via Ollama, like Llama 3.1 8B or 70B?

Harold-lkk commented 1 month ago

https://github.com/InternLM/lagent/pull/228/files

MyraBaba commented 1 month ago

@Harold-lkk @sarkaramitabh300

Is there any example models.py that uses local Ollama models? I changed and added an ollama.py to lagent's llms, but I need an example for MindSearch's models.py.
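
As a first step, independent of lagent/MindSearch, it helps to confirm that the local Ollama server itself answers chat requests. Ollama exposes an OpenAI-compatible route at /v1/chat/completions on its default port 11434; the sketch below assumes that route and the model tag llama3.1:8b (use whatever `ollama list` reports for your install):

```python
# Sanity check: does the local Ollama server answer chat requests at all?
# Assumes Ollama runs on its default port 11434 and that the model tag
# "llama3.1:8b" has been pulled (`ollama pull llama3.1:8b`).
import requests

resp = requests.post(
    "http://127.0.0.1:11434/v1/chat/completions",  # OpenAI-compatible route
    json={
        "model": "llama3.1:8b",
        "messages": [{"role": "user", "content": "Say hello in one word."}],
        "stream": False,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If this fails, the problem is the Ollama setup rather than the MindSearch configuration.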

MyraBaba commented 1 month ago

@Harold-lkk

Ollama is serving Llama 3.1 8B, and I get the errors below:

JSONDecodeError: Expecting value: line 1 column 1 (char 0). Skipping this line.
(the line above is repeated eight times in the log)
ERROR:root:Exception in sync_generator_wrapper: local variable 'response' referenced before assignment
Traceback (most recent call last):
  File "/home/bc/Projects/ODS/MindSearch/mindsearch/app.py", line 69, in sync_generator_wrapper
    for response in agent.stream_chat(inputs):
  File "/home/bc/Projects/ODS/MindSearch/mindsearch/agent/mindsearch_agent.py", line 235, in stream_chat
    print(colored(response, 'blue'))
UnboundLocalError: local variable 'response' referenced before assignment
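
Both errors point at stream parsing. The repeated JSONDecodeError at "char 0" means the client is calling json.loads() on lines that are empty or not JSON, likely because Ollama's streamed output format differs from what the internlm client expects. The UnboundLocalError then follows: if the parsing generator yields nothing, `response` is never assigned before the code after the loop uses it. A minimal sketch of defensive parsing illustrating both fixes (not MindSearch's actual code):

```python
import json

def parse_stream(lines):
    """Yield JSON objects from an iterable of streamed lines, skipping junk."""
    response = None  # bind up front so later use is safe even if nothing parses
    for raw in lines:
        raw = raw.strip()
        if not raw:
            continue  # skip blank keep-alive lines instead of json.loads("")
        try:
            response = json.loads(raw)
        except json.JSONDecodeError:
            continue  # skip non-JSON lines such as "data:"-prefixed SSE frames
        yield response
    if response is None:
        raise RuntimeError("no parseable JSON received; check the endpoint format")
```

With input like `['', 'data: keep-alive', '{"text": "hi"}']` this yields `{'text': 'hi'}` instead of crashing on the first line, and it raises a clear error rather than an UnboundLocalError when the server sends nothing usable.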

MyraBaba commented 1 month ago

Has anyone been successful with Ollama Llama 3.1 8B?

benlyazid commented 2 weeks ago

Are there any updates about supporting Ollama?

Brzjomo commented 2 weeks ago

For the frontend, use Streamlit. Edit the model_name of internlm_client in models.py to internlm2 and the url to http://127.0.0.1:11434, then run: python -m mindsearch.app --lang en --model_format internlm_client --search_engine DuckDuckGoSearch
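
For anyone following along, the edit described above would look roughly like the sketch below in mindsearch/agent/models.py. Only the two changed fields (model_name, url) come from the comment; the `type=LMDeployClient` key and the overall dict shape are assumptions based on the repo's stock internlm_client entry, so check them against your own checkout:

```python
# mindsearch/agent/models.py -- sketch of the edit described above.
# LMDeployClient is already imported from lagent.llms at the top of this
# file in the shipped repo; verify against your checkout.
internlm_client = dict(
    type=LMDeployClient,
    model_name='internlm2',          # changed, per the comment above
    url='http://127.0.0.1:11434',    # changed: Ollama's default local port
    # ...keep the remaining keyword arguments from the original entry...
)
```

Note that 11434 is Ollama's default port, while the stock entry points at an LMDeploy api_server, so if requests still fail, check that your Ollama setup actually accepts the request format this client sends.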

benlyazid commented 1 week ago

Thank you