patterns-ai-core / langchainrb

Build LLM-powered applications in Ruby
https://rubydoc.info/gems/langchainrb
MIT License

When calling the 'summarize' function in Ollama, it's not possible to specify the model #576

Closed lukefan closed 6 months ago

lukefan commented 7 months ago

When calling the 'summarize' function in Ollama, it's not possible to specify the model, and the default model setting also seems to be largely ignored. I hope more Ollama examples can be provided.
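
A minimal sketch of what I'm running into, assuming the standard constructor (the URL and text here are illustrative, and I'm assuming summarize takes a text: keyword):

require "langchain"

# summarize() exposes no model option, so every call falls back to
# the library's built-in default model.
ollama = Langchain::LLM::Ollama.new(url: ENV["OLLAMA_URL"])
ollama.summarize(text: "...")  # no model: option available here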

andreibondarev commented 7 months ago

@lukefan Good point!

You can either do this:

ollama = Langchain::LLM::Ollama.new(url: ENV["OLLAMA_URL"], default_options: { completion_model_name: "mistral" })
ollama.summarize(text: "...") # summarize requires the text: keyword

Or we can add the model: param to the summarize() method.

What do you think?
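
A hypothetical sketch of that second option, assuming summarize() builds a prompt and delegates to complete(), which takes a model: keyword (the prompt text and parameter names here are illustrative, not the actual library code):

# Hypothetical patch sketch -- not the shipped implementation.
class Langchain::LLM::Ollama
  def summarize(text:, model: nil)
    prompt = "Write a concise summary of the following text:\n\n#{text}"

    # Per-call override wins; otherwise fall back to the instance default.
    complete(prompt: prompt, model: model || defaults[:completion_model_name])
  end
end

# Usage: pick the model at the call site instead of at construction time.
ollama.summarize(text: "...", model: "mistral")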

lukefan commented 6 months ago

Thanks. I looked at the code today and found that your default model is llama2, which I hadn't pulled.

andreibondarev commented 6 months ago

> Thanks. I looked at the code today and found that your default model is llama2, which I hadn't pulled.

I'll switch the default to llama3; there's no reason not to.

lukefan commented 6 months ago

llama3 is better. But I'm Chinese, so my default models are Llama3-8B-Chinese-Chat and qwen:32b-chat.
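
For anyone with a similar local setup, pointing the client at already-pulled models might look like this (the model tags are the ones on my machine, and the option names are illustrative):

# Assumes the models were pulled locally first, e.g. `ollama pull qwen:32b-chat`.
ollama = Langchain::LLM::Ollama.new(
  url: ENV["OLLAMA_URL"],
  default_options: {
    completion_model_name: "llama3-8b-chinese-chat",  # local custom tag
    chat_completion_model_name: "qwen:32b-chat"
  }
)
ollama.summarize(text: "...")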