Closed by andreibondarev 9 months ago
Working on it.
One thing that came to mind: maybe chat should be a separate object? For example, what if you want to have 2 conversations (imagine bots talking to each other, each with a different chat)? That won't be possible now, and there are probably other cases too. There would also be no need to clear the chat, persist settings, etc.
What about something like:
openai = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
chat1 = openai.start_chat # or new_chat, or whatever
chat1.ask()
chat2 = openai.start_chat
chat2.ask()
Could we do something like ConversationChain in Langchain? This way, we could use different types of conversational memory with it, like ConversationBufferMemory or ConversationSummaryMemory.
conversation_buf = ConversationChain(
llm=llm,
memory=ConversationBufferMemory()
)
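For context, the Python pattern above wraps an LLM plus a pluggable memory object. A rough Ruby analogue might look like this (purely a sketch: none of these classes exist in langchainrb today, the names mirror the Python library, and llm.chat(messages:) is assumed from this issue's description):

module Langchain
  module Memory
    # Hypothetical buffer memory: keeps the full message history in an array.
    class ConversationBuffer
      attr_reader :messages

      def initialize
        @messages = []
      end

      def append(role:, content:)
        @messages << { role: role, content: content }
      end
    end
  end

  # Hypothetical chain: records each exchange in the memory object and
  # sends the accumulated history to the LLM on every call.
  class ConversationChain
    def initialize(llm:, memory:)
      @llm = llm
      @memory = memory
    end

    def ask(question)
      @memory.append(role: "user", content: question)
      answer = @llm.chat(messages: @memory.messages)
      @memory.append(role: "assistant", content: answer)
      answer
    end
  end
end

A summary-style memory could then be swapped in by implementing the same append/messages interface, which is the appeal of the pattern.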
@zewelor You can, even now, instantiate separate objects:
openai1 = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
openai1.ask()
openai2 = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
openai2.ask()
For example, what if you want to have 2 conversations (imagine bots talking to each other, each with a different chat)?
If you don't mind me asking, do you have a requirement or a need for this?
Maybe chat should be a separate object?
We could, of course, introduce Langchain::LLM::OpenAI::Chat, but I'm wondering if we need to do that now, or just keep it simple for the time being?
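If we did introduce it, the wrapper could be quite thin. A rough sketch (hypothetical code, nothing in langchainrb; it assumes the llm.chat(messages:) call from this issue's description):

module Langchain
  module LLM
    class OpenAI
      # Hypothetical factory: each call returns an independent chat session.
      def start_chat
        Chat.new(self)
      end

      # Hypothetical nested class holding per-conversation state.
      class Chat
        def initialize(llm)
          @llm = llm
          @messages = []
        end

        def ask(question)
          @messages << { role: "user", content: question }
          answer = @llm.chat(messages: @messages)
          @messages << { role: "assistant", content: answer }
          answer
        end
      end
    end
  end
end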
Could we do something like ConversationChain in Langchain? This way, we could use different types of conversational memory with it, like ConversationBufferMemory or ConversationSummaryMemory.
conversation_buf = ConversationChain(llm=llm, memory=ConversationBufferMemory())
@oluvvafemi Do you have a specific use-case or requirement here?
I think @alchaplinsky is building out the initial version to just use an in-memory store behind the scenes, and then we'll think about what the next iteration looks like.
@zewelor You can, even now, instantiate separate objects:
openai1 = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
openai1.ask()
openai2 = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
openai2.ask()
Right!
For example, what if you want to have 2 conversations (imagine bots talking to each other, each with a different chat)?
If you don't mind me asking, do you have a requirement or a need for this?
I was experimenting recently with 2 ChatGPT sessions talking to each other to generate dialogue between characters. IMHO it worked much better with 2 different personalities, one per chat, than with a single prompt like "Write me some dialog". That's why this use case came to mind.
Maybe chat should be a separate object?
We could, of course, introduce Langchain::LLM::OpenAI::Chat, but I'm wondering if we need to do that now, or just keep it simple for the time being?
Still, I think it might be better to keep LLM objects simple rather than adding more and more methods and logic. If not a .new_chat method, maybe a ConversationChain pattern?
Could we do something like ConversationChain in Langchain? This way, we could use different types of conversational memory with it, like ConversationBufferMemory or ConversationSummaryMemory.
conversation_buf = ConversationChain(llm=llm, memory=ConversationBufferMemory())
@oluvvafemi Do you have a specific use-case or requirement here?
I think @alchaplinsky is building out the initial version to just use an in-memory store behind the scenes, and then we'll think about what the next iteration looks like.
Yes, I would like to keep conversation memory for a chatbot.
Added a higher-level Langchain::Chat concept: https://github.com/andreibondarev/langchainrb/pull/160.
An instance of a Chat keeps all messages in memory. It will also be able to receive message history obtained from the DB or any other storage. Optionally, it might gain the capability of working with different storages internally.
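Seeding a session from stored history could then look something like this (a sketch; passing a messages: array into Langchain::Chat.new is an assumed interface for illustration, not necessarily what the PR implements):

# Sketch: restoring a chat from previously persisted messages.
# The messages: keyword argument is an assumption.
llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
history = [
  { role: "user", content: "I'm planning a trip to Paris." },
  { role: "assistant", content: "Great! How many days will you stay?" }
]
chat = Langchain::Chat.new(llm: llm, messages: history)
chat.message("Three days.") # the model now answers with the restored context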
@zewelor @oluvvafemi Would you please try out the new Langchain::Chat interface? The chat history is now persisted in memory.
llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
chat_session = Langchain::Chat.new(llm: llm)
chat_session.set_context("You are a travel agent AI. You help the user create travel itineraries.")
chat_session.message("Please give me a 3-day itinerary for a weekend visit to Paris")
Please let us know your feedback!
(Credit to @alchaplinsky for building this out!)
Looks nice to me! I like the name Langchain::Conversation, since there's an (optional) context and history involved, and it avoids confusion with just calling chat on the LLM directly.
Hmm... I actually like Conversation more than Chat. I think it is a more suitable name, even though it is more characters to type. WDYT @andreibondarev?
Created a PR with voting for the change: https://github.com/andreibondarev/langchainrb/pull/168
@zewelor @oluvvafemi Would you please try out the new Langchain::Chat interface? The chat history is now persisted in memory.
llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
chat_session = Langchain::Chat.new(llm: llm)
chat_session.set_context("You are a travel agent AI. You help the user create travel itineraries.")
chat_session.message("Please give me a 3-day itinerary for a weekend visit to Paris")
Please let us know your feedback!
(Credit to @alchaplinsky for building this out!)
Good job @alchaplinsky. I created PR #172 to add an attr_reader for messages.
@oluvvafemi Where do you plan on saving your chat messages? To disk or a DB?
The persistence should be done via the new Langchain::Thread and Langchain::Message classes.
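For example, saving to and restoring from a database might look roughly like this (a sketch; the StoredMessage ActiveRecord model, the conversation_id column, and the Langchain::Thread / Langchain::Message constructors shown here are all assumptions for illustration):

# Sketch: persisting a conversation and restoring it later.
conversation_id = "trip-planning"

# Save: dump each in-memory message into the database.
thread.messages.each do |msg|
  StoredMessage.create!(conversation_id: conversation_id, role: msg.role, content: msg.content)
end

# Restore: rebuild the thread from the stored rows.
restored = StoredMessage.where(conversation_id: conversation_id).order(:created_at).map do |row|
  Langchain::Message.new(role: row.role, content: row.content)
end
thread = Langchain::Thread.new(messages: restored)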
Background
Currently, conversations with LLMs that offer the .chat() endpoint are not persisted, hence the LLM has no context of the previous chat messages that may have taken place.
The following 2 LLMs offer chat capabilities and accept a messages: array.
Tasks:
Update the .chat() methods to persist and keep track of the previous chat exchanges that have taken place.
Open Questions:
Should chat persistence be toggled and cleared via an API like the following?
openai.chat_persistence = true
openai.ask(question: ...)
=> LLM answer...
openai.ask(question: ...)
=> LLM answer...
openai.ask(question: ...)
=> LLM answer...
openai.clear_chat_persistence!
openai.chat_persistence = false
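For what it's worth, the toggle could be implemented along these lines (a sketch of the proposal above, not existing langchainrb code; only ask(question:), chat_persistence=, and clear_chat_persistence! come from the proposal, and chat(messages:) is assumed from the issue description):

module Langchain
  module LLM
    class OpenAI
      attr_accessor :chat_persistence

      def ask(question:)
        msgs = history
        msgs << { role: "user", content: question }
        answer = chat(messages: msgs)
        msgs << { role: "assistant", content: answer }
        answer
      end

      def clear_chat_persistence!
        @history = []
      end

      private

      # With persistence on, the same array accumulates across calls;
      # with it off, each call gets a throwaway array.
      def history
        chat_persistence ? (@history ||= []) : []
      end
    end
  end
end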