patterns-ai-core / langchainrb

Build LLM-powered applications in Ruby
https://rubydoc.info/gems/langchainrb
MIT License

[Proposal] Option to persist conversations when using chat endpoints #136

Closed: andreibondarev closed this issue 9 months ago

andreibondarev commented 1 year ago

Background

Currently, conversations with LLMs that offer the .chat() endpoint are not persisted, so the LLM has no context of any previous chat messages that may have taken place.

The following two LLMs offer chat capabilities and accept a messages: array:

Tasks:

Open Questions:

  1. What is the best way to persist these conversations, in a way that fits well with the langchain.rb library? In-memory, a DB, a vectorsearch DB, Redis?
  2. What does the interface look like? Possibly:

    openai = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])

    openai.chat_persistence = true

    openai.ask(question: ...)
    # => LLM answer...

    openai.ask(question: ...)
    # => LLM answer...

    openai.ask(question: ...)
    # => LLM answer...

    openai.clear_chat_persistence!

    openai.chat_persistence = false
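
One way the flag could work is a simple in-memory buffer that gets replayed on every call. A rough sketch (the ask internals and @chat_history here are illustrative, not actual library code):

    module Langchain
      module LLM
        class OpenAI
          attr_accessor :chat_persistence

          # Illustrative only: replay the accumulated history on every
          # call so the model sees the prior turns.
          def ask(question:)
            history = chat_persistence ? (@chat_history ||= []) : []
            history << { role: "user", content: question }
            answer = chat(messages: history)
            history << { role: "assistant", content: answer }
            answer
          end

          def clear_chat_persistence!
            @chat_history = []
          end
        end
      end
    end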

alchaplinsky commented 1 year ago

Working on it.

zewelor commented 1 year ago

One thing that came to mind: maybe a chat should be a separate object? For example, if you want to have two conversations (imagine bots talking to each other, each with a different chat), that won't be possible now, right? There are probably other cases too. There would also be no need to clear the chat, toggle persistence settings, etc.

What about something like:

    openai = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
    chat1 = openai.start_chat # (or new_chat, or whatever)
    chat1.ask(...)

    chat2 = openai.start_chat
    chat2.ask(...)
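
Under the hood that might look something like this (a sketch; every name here is hypothetical):

    module Langchain
      module LLM
        class OpenAI
          # Hypothetical: each call returns an independent session
          # with its own message history.
          def start_chat
            ChatSession.new(self)
          end
        end
      end

      class ChatSession
        def initialize(llm)
          @llm = llm
          @messages = []
        end

        def ask(question)
          @messages << { role: "user", content: question }
          answer = @llm.chat(messages: @messages)
          @messages << { role: "assistant", content: answer }
          answer
        end
      end
    end
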
oluvvafemi commented 1 year ago

Could we do something like ConversationChain in Langchain? This way, we could use different types of conversational memory with it like ConversationBufferMemory or ConversationSummaryMemory.

conversation_buf = ConversationChain(
    llm=llm,
    memory=ConversationBufferMemory()
)
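
In Ruby, that pattern might translate to something like this (all class names are hypothetical, sketching the pluggable-memory idea rather than an existing langchainrb API):

    # Hypothetical buffer memory: keeps every message verbatim. A
    # summarizing memory could expose the same load/save interface.
    class BufferMemory
      def initialize
        @messages = []
      end

      def load
        @messages
      end

      def save(message)
        @messages << message
      end
    end

    class ConversationChain
      def initialize(llm:, memory:)
        @llm = llm
        @memory = memory
      end

      def call(input)
        @memory.save({ role: "user", content: input })
        answer = @llm.chat(messages: @memory.load)
        @memory.save({ role: "assistant", content: answer })
        answer
      end
    end

    # Swapping BufferMemory for a summarizing store would not change the chain.
    conversation = ConversationChain.new(llm: llm, memory: BufferMemory.new)
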
andreibondarev commented 1 year ago

@zewelor You can, even now, instantiate separate objects:

    openai1 = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
    openai1.ask()

    openai2 = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
    openai2.ask()

> For example, if you want to have two conversations (imagine bots talking to each other, each with a different chat)

If you don't mind me asking, do you have a requirement or a need for this?

> Maybe a chat should be a separate object?

We could, of course, introduce Langchain::LLM::OpenAI::Chat but I'm wondering if we need to do that now, or just keep it simple for the time being?

andreibondarev commented 1 year ago

> Could we do something like ConversationChain in Langchain? This way, we could use different types of conversational memory with it like ConversationBufferMemory or ConversationSummaryMemory.
>
>     conversation_buf = ConversationChain(
>         llm=llm,
>         memory=ConversationBufferMemory()
>     )

@oluvvafemi Do you have a specific use-case or requirement here?

I think @alchaplinsky is building out the initial version to just use an in-memory store behind the scenes, and then we'll think about what the next iteration looks like.

zewelor commented 1 year ago

> @zewelor You can, even now, instantiate separate objects:
>
>     openai1 = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
>     openai1.ask()
>
>     openai2 = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
>     openai2.ask()

Right!

> For example, if you want to have two conversations (imagine bots talking to each other, each with a different chat)

> If you don't mind me asking, do you have a requirement or a need for this?

I was recently experimenting with two ChatGPT sessions talking to each other to generate dialogue between characters. IMHO the results were much better with two distinct personalities, one per chat, than with a single prompt like "Write me some dialog". That's why this use case came to mind.

> Maybe a chat should be a separate object?

> We could, of course, introduce Langchain::LLM::OpenAI::Chat but I'm wondering if we need to do that now, or just keep it simple for the time being?

Still, I think it might be better to keep the LLM objects simple rather than adding more and more methods and logic to them. If not a .new_chat method, maybe the ConversationChain pattern?

oluvvafemi commented 1 year ago

> Could we do something like ConversationChain in Langchain? This way, we could use different types of conversational memory with it like ConversationBufferMemory or ConversationSummaryMemory.
>
>     conversation_buf = ConversationChain(
>         llm=llm,
>         memory=ConversationBufferMemory()
>     )
>
> @oluvvafemi Do you have a specific use-case or requirement here?
>
> I think @alchaplinsky is building out the initial version to just use an in-memory store behind the scenes, and then we'll think about what the next iteration looks like.

Yes, I would like to keep conversation memory for a chatbot.

alchaplinsky commented 1 year ago

Added a higher-level Langchain::Chat concept in https://github.com/andreibondarev/langchainrb/pull/160. An instance of a Chat keeps all messages in memory. It will also be able to receive message history obtained from a DB or any other storage, and as an option it might support working with different storages internally.

andreibondarev commented 1 year ago

@zewelor @oluvvafemi Would you please try out the new Langchain::Chat interface? The chat history is persisted in memory now.

    llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
    chat_session = Langchain::Chat.new(llm: llm)

    chat_session.set_context("You are a travel agent AI. You help the user create travel itineraries.")

    chat_session.message("Please give me a 3-day itinerary for a weekend visit to Paris")
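
Because each Langchain::Chat instance keeps its own in-memory history, the two-bot scenario from earlier in the thread also falls out naturally. A sketch, assuming message returns the assistant's reply as a string:

    agent = Langchain::Chat.new(llm: llm)
    critic = Langchain::Chat.new(llm: llm)

    agent.set_context("You are an optimistic travel blogger.")
    critic.set_context("You are a skeptical budget traveler.")

    # Each session keeps its own history, so the two personalities
    # stay separate across turns.
    line = agent.message("Pitch a weekend in Paris in two sentences.")
    3.times do
      line = critic.message(line)
      line = agent.message(line)
    end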

Please let us know your feedback!

(Credit to @alchaplinsky for building this out!)

mattlindsey commented 1 year ago

Looks nice to me! I like the name Langchain::Conversation, since there's an (optional) context and a history involved, and it avoids confusion with just calling 'chat' on the LLM directly.

alchaplinsky commented 1 year ago

Hmm.. I actually like Conversation more than Chat. I think it is a more suitable name even though it is more characters to type. WDYT @andreibondarev?

alchaplinsky commented 1 year ago

Created a PR with voting for the change: https://github.com/andreibondarev/langchainrb/pull/168

oluvvafemi commented 1 year ago

> @zewelor @oluvvafemi Would you please try out the new Langchain::Chat interface? The chat history is persisted in memory now.
>
>     llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
>     chat_session = Langchain::Chat.new(llm: llm)
>
>     chat_session.set_context("You are a travel agent AI. You help the user create travel itineraries.")
>
>     chat_session.message("Please give me a 3-day itinerary for a weekend visit to Paris")
>
> Please let us know your feedback!
>
> (Credit to @alchaplinsky for building this out!)

Good job, @alchaplinsky. I created PR #172 to add an attr_reader for messages.

andreibondarev commented 1 year ago

@oluvvafemi Where do you plan to save your chat messages? To disk or a DB?
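
For example, with the messages reader from #172, dumping a session to disk could be as simple as the following (the serialization format is assumed; reloading would rely on the planned ability to seed a chat with stored history):

    require "json"

    # Persist the in-memory history; assumes each message serializes
    # cleanly to JSON (e.g. role/content hashes).
    File.write("chat_history.json", JSON.dump(chat_session.messages))

    # Later, read it back and feed it into a new session once the
    # history-seeding capability mentioned above lands.
    history = JSON.parse(File.read("chat_history.json"))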

andreibondarev commented 9 months ago

The persistence should be done via the new Langchain::Thread and Langchain::Message classes.
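
A sketch of what that might look like (the constructor arguments and add_message call are assumed from the class names; check the current docs for the real signatures):

    # Hypothetical usage of the newer classes.
    thread = Langchain::Thread.new
    thread.add_message(Langchain::Message.new(role: "user", content: "Hello"))

    # Persisting then becomes a matter of serializing thread.messages
    # to whatever store the application uses.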