langchain-ai / langgraph

Build resilient language agents as graphs.
https://langchain-ai.github.io/langgraph/
MIT License
5.89k stars 926 forks

DOC: Add examples using Ollama, HuggingFace and other models rather than OpenAI models. #195

Open ZappaBoy opened 6 months ago

ZappaBoy commented 6 months ago

Issue with current documentation:

A very large part of the documentation and examples provided focuses on OpenAI models. I clearly understand the importance of building, integrating, and improving langchain, langgraph, langsmith, and the other products on top of OpenAI models, but there is a great community around other solutions, especially ones that use Ollama and HuggingFace.

I'm referring to many of the resources provided, like the following:

The links above show very useful applications of these products, but there is no reference at all to models other than OpenAI's.

The only resources I have personally been able to find are some comments in the issues and something like this: https://api.python.langchain.com/en/latest/llms/langchain_experimental.llms.ollama_functions.OllamaFunctions.html

Also, searching the provided examples for the keywords "Ollama" or "Hugging" turns up only two references each, all pointing to the same two notebooks, langgraph_crag_mistral.ipynb and langgraph_self_rag_mistral_nomic.ipynb:

Regarding OllamaFunctions, I understand that it lives in the langchain_experimental package, but I would be really happy to see it working as a drop-in replacement for ChatOpenAI.

I don't want to sound ungrateful; langchain-ai products are awesome. But these improvements for the other communities would be very useful, and they could greatly improve both the use and the development of solutions built on langchain-ai products.

Idea or request for content:

No response

hinthornw commented 6 months ago

Related: #33

MikeyBeez commented 4 months ago

Don't waste your time. ClosedAI funds Langchain. Anyway, the latency that's added into ollama makes it unusable.

ZappaBoy commented 4 months ago

Don't waste your time. ClosedAI funds Langchain. Anyway, the latency that's added into ollama makes it unusable.

Honestly, I moved from langchain to AutoGen after the third day that my issues on langgraph were ignored.

UserName-wang commented 3 months ago

Don't waste your time. ClosedAI funds Langchain. Anyway, the latency that's added into ollama makes it unusable.

Honestly, I moved from langchain to AutoGen after the third day that my issues on langgraph were ignored.

Hi, how is the latency when using AutoGen?

ZappaBoy commented 3 months ago

Don't waste your time. ClosedAI funds Langchain. Anyway, the latency that's added into ollama makes it unusable.

Honestly, I moved from langchain to AutoGen after the third day that my issues on langgraph were ignored.

Hi, how is the latency when using AutoGen?

I don't think this is the right discussion for talking about a different technology.

romilandc commented 3 months ago

For Ollama, use llm = ChatOpenAI(api_key='ollama', model="llama3", base_url='http://localhost:11434/v1', temperature=0)
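This trick works because Ollama serves an OpenAI-compatible endpoint under /v1, so ChatOpenAI just posts a standard chat-completions body to it (the api_key value is ignored by Ollama but must be non-empty). A stdlib-only sketch of the request body involved; the endpoint URL and model name are whatever your local setup uses:

```python
import json

def build_chat_request(model: str, user_message: str, temperature: float = 0.0) -> dict:
    """Build an OpenAI-style chat-completions body, as accepted by Ollama's /v1 endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": temperature,
    }

body = build_chat_request("llama3", "Hello!")
# With Ollama running locally, POST json.dumps(body) to
# http://localhost:11434/v1/chat/completions to get a chat-completions response.
print(json.dumps(body))
```

Any OpenAI-compatible client can be pointed at that base_url the same way; only the model name needs to match something you have pulled with `ollama pull`.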

MikeyBeez commented 3 months ago

I can use the llms module, but it seems to cause problems sometimes. Besides, why would you need Langchain's API to connect to a model when Ollama already has an API? Nothing I want to do requires langchain, and langchain keeps changing so fast that if something works one day, it doesn't the next. The only thing I use langchain for at the moment is storing embeddings in chromadb, and I may dump both chroma and langchain entirely. I can store embeddings in JSON and use numpy for similarity searches. I'm coming to the realization that I don't want a large KB of memories anyway.
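The JSON-plus-numpy idea mentioned above is indeed enough for small memory stores. A minimal sketch with toy 3-dimensional embeddings (real embedding vectors would come from whatever model you use); it persists records as plain JSON and ranks them by cosine similarity:

```python
import json
import numpy as np

# Toy "memory store": texts with (made-up, tiny) embedding vectors.
store = [
    {"text": "the cat sat on the mat", "embedding": [1.0, 0.0, 0.0]},
    {"text": "dogs chase balls",       "embedding": [0.0, 1.0, 0.0]},
    {"text": "kittens nap in the sun", "embedding": [0.9, 0.1, 0.0]},
]
blob = json.dumps(store)    # persist to disk as plain JSON
records = json.loads(blob)  # reload later

def most_similar(query_vec, records):
    """Return the record whose embedding has the highest cosine similarity to the query."""
    q = np.asarray(query_vec, dtype=float)
    mat = np.asarray([r["embedding"] for r in records], dtype=float)
    sims = (mat @ q) / (np.linalg.norm(mat, axis=1) * np.linalg.norm(q))
    return records[int(np.argmax(sims))]

best = most_similar([1.0, 0.05, 0.0], records)
print(best["text"])  # -> "the cat sat on the mat"
```

For a few thousand records this brute-force scan is plenty fast; a vector store only starts paying off at much larger scales or when you need filtering and persistence features.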

JTMarsh556 commented 2 months ago

I have to say I completely agree with the OP. Nearly everything is funneled into the "Open"AI tube, and I definitely feel that most other options are largely alienated. As someone who avoids that company, I find myself spending more time developing workarounds than doing anything else. Is this going to change, or is this the intended course?

nnnnwinder commented 3 weeks ago

Can a custom LLM with tool-carrying capabilities, such as a locally deployed LLM (e.g., Qwen), now be used as a LangGraph node? Are there any examples of its use?
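On the shape of the question: a LangGraph node is just a callable that takes the graph state and returns a partial state update, so any locally deployed model (Qwen via Ollama, a HuggingFace pipeline, etc.) can back a node. A minimal sketch of that node contract, with the model call stubbed out; `call_local_llm` is a hypothetical stand-in for your actual client:

```python
# A LangGraph-style node is a plain function: state in, partial state update out.
# `call_local_llm` is a stub -- replace it with a real call to Ollama, an HF
# pipeline, or any other locally served model.
def call_local_llm(prompt: str) -> str:
    return f"echo: {prompt}"  # stand-in for the real model response

def llm_node(state: dict) -> dict:
    reply = call_local_llm(state["messages"][-1])
    return {"messages": state["messages"] + [reply]}

# The graph runtime merges the returned update into the state; done by hand here.
state = {"messages": ["hello"]}
state.update(llm_node(state))
print(state["messages"][-1])  # -> "echo: hello"
```

Wiring `llm_node` into an actual StateGraph then only requires registering it with add_node and adding edges; the function body itself never needs to know which model is behind it.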

raj-acrivon commented 3 weeks ago

ChatOllama seems to lack function calling. If I use ChatGroq, my code works correctly.

Are there any good examples of a SQL agent, or any other code-based agent, using ChatOllama rather than ChatGroq, ChatAnthropic, or ChatOpenAI?

Please share an example repo here if one exists.

gbaian10 commented 3 weeks ago

ChatOllama seems to lack function calling. If I use ChatGroq, my code works correctly.

Are there any good examples of a SQL agent, or any other code-based agent, using ChatOllama rather than ChatGroq, ChatAnthropic, or ChatOpenAI?

Please share an example repo here if one exists.

@raj-acrivon Did you use the old version of ChatOllama from langchain-community? If so, please switch to the langchain-ollama package and ensure that the model you are using supports tool_call. You can find examples here