Open k33g opened 2 months ago
Can a conversational chat be implemented with the memory buffer (`memory.NewConversationBuffer()`) together with `llms.GenerateFromSinglePrompt`? I can't find where I can inject the chat history.

Is it done with `prompts.NewChatPromptTemplate`, or should I use `llmChain := chains.NewConversation(llm, memory)`?
In the latter case, I don't understand how to stream the response.
With `GenerateFromSinglePrompt` I can do something like this:

```go
llms.GenerateFromSinglePrompt(ctx, llm, promptText1,
    llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
        fmt.Print(string(chunk))
        return nil
    }))
```
But I'm not able to do the same thing with `chains.Run`. Any help would be appreciated.
I think I found a way: https://github.com/genai-for-all/learning-langchain-go/blob/main/04-let-s-go-2/main.go

Now I need to adapt this sample to use prompt templates.
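For reference, the combination asked about above can apparently be done with langchaingo's chain-level call options: `chains.Run` accepts `chains.WithStreamingFunc`, the counterpart of `llms.WithStreamingFunc`. A minimal sketch (the Ollama backend and the `llama3` model name are assumptions for illustration; any `llms.Model` should work):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/llms/ollama"
	"github.com/tmc/langchaingo/memory"
)

func main() {
	ctx := context.Background()

	// Assumed backend: a local Ollama server with the "llama3" model.
	llm, err := ollama.New(ollama.WithModel("llama3"))
	if err != nil {
		log.Fatal(err)
	}

	// The conversation chain stores the chat history in the buffer,
	// so each subsequent Run call sees the previous turns.
	chain := chains.NewConversation(llm, memory.NewConversationBuffer())

	// chains.WithStreamingFunc streams chunks as the model generates them,
	// just like llms.WithStreamingFunc does for GenerateFromSinglePrompt.
	_, err = chains.Run(ctx, chain, "Hello, who are you?",
		chains.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
			fmt.Print(string(chunk))
			return nil
		}),
	)
	if err != nil {
		log.Fatal(err)
	}
}
```

With this approach there is no need to inject the history manually: the memory buffer attached to the conversation chain carries it between `Run` calls.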