tmc / langchaingo

LangChain for Go, the easiest way to write LLM-based programs in Go
https://tmc.github.io/langchaingo/
MIT License

Conversational chat and `memory.NewConversationBuffer()` #766

Open k33g opened 2 months ago

k33g commented 2 months ago

Can a conversational chat be implemented with the memory buffer (`memory.NewConversationBuffer()`) while using `llms.GenerateFromSinglePrompt`?

I can't find where to inject the chat history.

Is it by using `prompts.NewChatPromptTemplate`, or should I use `llmChain := chains.NewConversation(llm, memory)`?

In the latter case, I don't understand how to stream the response.

With `llms.GenerateFromSinglePrompt` I can do something like this:

```go
llms.GenerateFromSinglePrompt(ctx, llm, promptText1,
    llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
        fmt.Print(string(chunk))
        return nil
    }))
```

But I can't figure out how to do the same thing with `chains.Run`.
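
Looking at the chains package, there seems to be a `chains.WithStreamingFunc` `ChainCallOption`, so I'd expect something along these lines to work with the conversation chain. This is an untested sketch; the Ollama model name is just what I happen to run locally, any `llms.Model` should do:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/llms/ollama"
	"github.com/tmc/langchaingo/memory"
)

func main() {
	ctx := context.Background()

	// Placeholder model; any llms.Model implementation should work here.
	llm, err := ollama.New(ollama.WithModel("mistral"))
	if err != nil {
		log.Fatal(err)
	}

	// The conversation chain saves every exchange into the buffer and
	// injects the accumulated history into the prompt on each call.
	conversation := chains.NewConversation(llm, memory.NewConversationBuffer())

	// chains.Run accepts ChainCallOptions, so the streaming callback
	// should be usable here just like with GenerateFromSinglePrompt.
	_, err = chains.Run(ctx, conversation, "Hello, my name is Bob",
		chains.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
			fmt.Print(string(chunk))
			return nil
		}))
	if err != nil {
		log.Fatal(err)
	}
}
```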

Any help would be appreciated.

k33g commented 2 months ago

I think I found a way: https://github.com/genai-for-all/learning-langchain-go/blob/main/04-let-s-go-2/main.go

Now I need to adapt this sample to use prompt templates.
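
The rough idea (again an untested sketch; the prompt text and model name are placeholders): `chains.NewLLMChain` returns a chain with an exported `Memory` field, so the conversation buffer can be attached to a chain built from a custom prompt, as long as the template declares the buffer's history variable:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/llms/ollama"
	"github.com/tmc/langchaingo/memory"
	"github.com/tmc/langchaingo/prompts"
)

func main() {
	ctx := context.Background()

	llm, err := ollama.New(ollama.WithModel("mistral")) // placeholder model
	if err != nil {
		log.Fatal(err)
	}

	// Custom prompt: {{.history}} must match the buffer's memory key
	// (which defaults to "history"); {{.input}} is the user's turn.
	prompt := prompts.NewPromptTemplate(`You are a helpful assistant.

Current conversation:
{{.history}}
Human: {{.input}}
AI:`, []string{"history", "input"})

	llmChain := chains.NewLLMChain(llm, prompt)
	// Swap the default memory for the conversation buffer.
	llmChain.Memory = memory.NewConversationBuffer()

	// Two turns to check that the buffer actually carries the history.
	for _, q := range []string{"Hi, my name is Bob.", "What is my name?"} {
		_, err = chains.Run(ctx, llmChain, q,
			chains.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
				fmt.Print(string(chunk))
				return nil
			}))
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println()
	}
}
```

If the `{{.history}}` variable doesn't match the buffer's memory key, the past turns are never injected, so renaming one means renaming the other.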