Closed by lukefan 6 months ago
@lukefan Good point!
You can either do this:
```ruby
ollama = Langchain::LLM::Ollama.new(
  url: ENV["OLLAMA_URL"],
  default_options: { completion_model_name: "mistral" }
)
ollama.summarize(text: "...")
```

Or we can add the model: param to the summarize() method.
What do you think?
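For illustration, a minimal sketch of what that model: param could look like, assuming complete() keeps accepting a model: keyword and the class exposes its defaults hash; the prompt string is simplified, and the model: keyword on summarize is the hypothetical addition, not the current API:

```ruby
class Langchain::LLM::Ollama
  # Hypothetical signature: model: is the proposed addition.
  def summarize(text:, model: nil)
    # Simplified stand-in for the gem's summarize prompt template.
    prompt = "Write a concise summary of the following text:\n\n#{text}"

    # Prefer the per-call model; fall back to the configured default.
    complete(prompt: prompt, model: model || defaults[:completion_model_name])
  end
end
```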
Thanks. I looked at the code today and found that the default model is llama2, which I have not pulled.
I'll switch to llama3; there's no reason not to.
llama3 is better, but I'm Chinese, so my default models are Llama3-8B-Chinese-Chat and qwen:32b-chat.
When calling the summarize method on the Ollama LLM, it's not possible to specify the model, and setting a default model is also largely ineffective. I hope more Ollama examples can be provided.
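In the meantime, the constructor-level workaround suggested above does work once the model has been pulled locally; a minimal end-to-end sketch (the model name, URL fallback, and sample text are illustrative, and it assumes the response object exposes a completion reader):

```ruby
require "langchain"

# Pull the model first (e.g. `ollama pull qwen:32b-chat`),
# otherwise the request will fail with an unknown-model error.
ollama = Langchain::LLM::Ollama.new(
  url: ENV["OLLAMA_URL"] || "http://localhost:11434",
  default_options: { completion_model_name: "qwen:32b-chat" }
)

response = ollama.summarize(text: "Long article text to condense...")
puts response.completion
```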