Open luca-git opened 2 months ago
I have been experiencing the same issue. It seems to happen at random when a request is sent to Ollama. This thread suggests configuring a retry method: https://github.com/langchain-ai/langchain/issues/20773#issuecomment-2072117003 It seems to work, but it would be nice to get an official fix.
I'm seeing the same issue, the retry referenced here seems to help at times, but not 100%: https://github.com/langchain-ai/langchain/issues/20773#issuecomment-2072117003
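For anyone who doesn't want to pull in the LangChain-specific helper from the linked comment, the same "retry the flaky call with backoff" idea can be sketched in plain Python. This is only an illustrative workaround, not an official fix; the `ask`/`llm.invoke` usage below is hypothetical and assumes a LangChain-style LLM object.

```python
import random
import time
from functools import wraps


def with_retry(max_attempts=3, base_delay=1.0, exceptions=(Exception,)):
    """Retry a flaky call with exponential backoff plus a little jitter."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except exceptions:
                    if attempt == max_attempts:
                        raise  # give up after the last attempt
                    # Sleep 1s, 2s, 4s, ... with up to 100 ms of jitter.
                    time.sleep(base_delay * 2 ** (attempt - 1)
                               + random.random() * 0.1)
        return wrapper
    return decorator


# Hypothetical usage around an intermittently failing Ollama call:
# @with_retry(max_attempts=3, exceptions=(ConnectionError,))
# def ask(llm, prompt):
#     return llm.invoke(prompt)
```

As others note above, this only papers over the failures some of the time; it does not address the underlying Ollama-side problem.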
I can second this exactly:

> I'm seeing the same issue, the retry referenced here seems to help at times, but not 100%: langchain-ai/langchain#20773 (comment)
The workaround works (sometimes) after updating Ollama to 0.1.9.
Still an issue, not really functional on Ollama 0.1.32.
EDIT: resolved. Solved for me by ensuring that all other Ollama instances on the system were off (other Ubuntu instances under WSL, and the host Windows machine's copy, which I uninstalled). Ollama may have a bug related to stopping the server.
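If you suspect a second Ollama instance is holding the server port (as in the WSL/Windows situation above), a quick way to check is to see whether anything is already listening on Ollama's default port, 11434. This is a small diagnostic sketch, not part of the notebook; the port number assumes a default Ollama configuration.

```python
import socket


def ollama_port_in_use(host="127.0.0.1", port=11434):
    """Return True if something is already listening on the given port
    (11434 is Ollama's default)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. when a server is listening there.
        return s.connect_ex((host, port)) == 0
```

If this reports the port as in use after you believe you have stopped Ollama, another instance (e.g. under WSL or on the Windows host) is likely still running.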
This seems to be more of an Ollama issue in this case? Or is there something specific to this notebook that you want fixed?
It's a terrific notebook and I'd love to see it working with Ollama and Llama 3. I believe the issue affects every Llama 3 implementation, so fixing it would help greatly.
Checked other resources
Example Code
notebook example code in https://github.com/langchain-ai/langgraph/blob/main/examples/rag/langgraph_rag_agent_llama3_local.ipynb
Error Message and Stack Trace (if applicable)
Description
Running the example code I get the above error; this is not happening with Mistral, so I guess my Ollama is OK. I also get the first "yes" from Llama 3, if I'm not mistaken, so I suspect it's related to something not working here:
System Info
Ubuntu 22.04.4 LTS, Anaconda, and VS Code.