brainlid / langchain

Elixir implementation of a LangChain style framework.
https://hexdocs.pm/langchain/

Request for Community ChatModel: ChatOllama #67

Closed theaveasso closed 5 months ago

theaveasso commented 6 months ago

I am interested in using Ollama as a self-hosted LLM in the Elixir LangChain project. I was wondering if it would be possible to extend support to include ChatModels.ChatOllama as well.

Detail: the Python equivalent is langchain_community.chat_models.ChatOllama.

Request: I kindly request the community's assistance or guidance on how to integrate ChatOllama into the LangChain project. I am new to Elixir (no idea what I am doing most of the time), but I am eager to learn and contribute to this exciting project.

acalejos commented 5 months ago

Correct me if I'm wrong @brainlid, but it seems you just need to implement Ollama as a struct that implements a call function, the same as the OpenAI one defined here.

There seem to be two Ollama Elixir clients listed on Hex.pm, but both look pretty new and in their infancy. You could always roll your own or hit the Ollama REST API directly. Either way, you would use it to call the API endpoints for chat completion, etc., as is done with the OpenAI implementation.

Whether the models support Functions within the context of LangChain, and in the same manner as OpenAI's models, is a different problem though, so those parts of the implementation might have to be sorted out differently.
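
For illustration, here is a rough sketch of what such a module might look like, posting to Ollama's /api/chat endpoint with Req and mapping the reply into a LangChain.Message. The struct fields, the call/3 signature, and the use of Req are my own assumptions and would need to be aligned with how ChatOpenAI is actually implemented in this library:

defmodule LangChain.ChatModels.ChatOllama do
  # Hypothetical sketch only. Field names, the call/3 signature, and the
  # response mapping are assumptions; check them against ChatOpenAI.
  alias LangChain.Message

  defstruct endpoint: "http://localhost:11434/api/chat",
            model: "llama2",
            temperature: 0.8

  def new!(attrs \\ %{}), do: struct!(__MODULE__, attrs)

  # Assumed shape: the model struct, a list of %Message{} structs, and an
  # (ignored here) list of functions.
  def call(%__MODULE__{} = ollama, messages, _functions \\ []) do
    body = %{
      model: ollama.model,
      stream: false,
      messages: Enum.map(messages, &%{role: to_string(&1.role), content: &1.content})
    }

    # Req is used here for brevity; any HTTP client would work.
    case Req.post(ollama.endpoint, json: body) do
      {:ok, %Req.Response{status: 200, body: %{"message" => %{"content" => content}}}} ->
        {:ok, Message.new_assistant!(content)}

      {:ok, %Req.Response{status: status, body: body}} ->
        {:error, "Ollama returned #{status}: #{inspect(body)}"}

      {:error, reason} ->
        {:error, inspect(reason)}
    end
  end
end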

Once you implement this, you could simply pass the new model as your llm:

alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOllama # Assumes you implemented the `call` function for this module
alias LangChain.Message

{:ok, _updated_chain, response} =
  %{llm: ChatOllama.new!(%{model: "llama2"})}
  |> LLMChain.new!()
  |> LLMChain.add_message(Message.new_user!("Testing, testing!"))
  |> LLMChain.run()

response.content

Again, this is just the approach I would take; having not contributed to this library, I could be wrong about the requirements for implementing a new model.